Aside from legal reasons, much of this in my opinion comes down to the traditional employer/employee power dynamic: the hiring side typically does extensive due diligence and research into the people being hired, and holds the upper hand in negotiations, far more than the other direction.
At least conceptually, I believe developers need to stand on equal ground and judge and weigh their candidates (employers and clients) in the same way, including, like you say, assessing their code quality before making any commitment, to see if it's up to their standards.
There's always a "discovery phase" before hiring/being hired, where the two sides explore what the other has to offer and determine whether they can be trusted with the work. You can usually poke around enough to discover, for example, what stacks their main product uses and their general level of skill and standards in UX, performance, security, etc.
Some companies, maybe an increasing number, have public or open-source repositories that give fair insight into their engineering culture. This is similar to how developers have (or are expected to have) GitHub profiles. I'd take it as a healthy sign of developer-led best practices and transparency in the company.
I'd also like to add that the majority of codebases in the wild are bloated, complex spaghetti monoliths of questionable quality. That's simply a reflection of limited time and budget, average skill sets, and the real world being practically chaotic. Knowing how to consistently improve the situation is part of being a valuable developer/team.