An unfortunate side effect is that this perceived badness of the code is then often used as an excuse to not really try and understand it at all ("let's rewrite!" or "we don't touch that anymore").
As a junior coder earlier in my career, I encountered exactly the situation described. A co-worker was always griping loudly about how bad the code and architectural decisions of certain people supposedly were - all except his own, of course.
At first I was too diffident to form an opinion and considered whether he might be right. When I gained more experience, and finally got a look at what he was ranting about, it seemed to me that this guy's targets were not so bad and that his own output was hardly more impressive.
Eventually I saw other examples of this pattern and realized something: those who bluster in this way tend to be middle-range developers trying to cover up their own insecurities - while those who are really superior tend to be quieter and more businesslike, getting things done, and done well, while the blusterers carry on with their "BS" sessions.
I don't claim to qualify for the latter group to the degree I would like, but when I hear the denigration, I ask what the specific problems are and what we propose to do about them. And I try to promote a positive atmosphere, and really learn the code rather than just looking for where to put more duct tape.
yeah, if you're going to talk bad about anyone's code, start with your own. not even the things you wrote a few months ago, but like even today, writing xml schemas for a rest api, the whole time i'm thinking about all the ways the decisions i'm making are going to come back and bite me in the ass. not because i'm inexperienced, but because i've written and worked with schemas and specs for interprocess communication before, and i know that any change to the schema will require edits to multiple codebases, and the time and effort to make some of those changes can be frightening.
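the coupling above can be sketched in a few lines of python. the field names and the validation helper are made up for illustration - the point is just that making one field mandatory silently invalidates every producer still on the old schema, and each of those lives in its own codebase:

```python
# Hypothetical sketch: why a "small" schema change ripples across codebases.
# The field names and validate() helper are invented for illustration.

REQUIRED_V1 = {"id", "amount"}
REQUIRED_V2 = REQUIRED_V1 | {"currency"}  # v2 promotes "currency" to mandatory

def validate(message: dict, required: set) -> bool:
    """A message is valid iff every required field is present."""
    return required <= message.keys()

# A message emitted by a codebase that was never updated past v1.
old_client_message = {"id": 42, "amount": 9.99}

print(validate(old_client_message, REQUIRED_V1))  # True: fine under v1
print(validate(old_client_message, REQUIRED_V2))  # False: v2 rejects every v1 producer
```

the fix is not hard in any single repo - the frightening part is that the "edit" has to be coordinated across every producer and consumer at once, or the schema has to carry both versions for a transition period.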
i guess this is what fred brooks calls "the second system effect." it'll probably turn out that there's some other more terrible problem with the current system that i'm not even seeing because i'm overly concerned about the communication protocol on top and not the databases underneath or concurrency off to the side somewhere.
how many systems of some particular type does one have to build before not writing shit on the first pass? because, like, yes, the code freely available to anyone with a quick `apt-get source` is often of much higher quality than some things hidden in multinational corporations' private repos. so everyone can have some idea of what good code looks like now, but i am personally finding creating much more difficult than verifying and fear that it's going to take a lot of time to narrow that gap.
is it really necessary to work on like 8 processors before doing what someone like ivan godard does? maybe just 4 or 5 projects (which translates into about a decade, mind you) are enough before the "i'm fucking shit up" jitters pass for any particular type of system?