I am quite positive that (despite this and countless other fun examples to the contrary) the average UX floor has risen by a lot.
Sure, things regress and move in waves, but on the whole user-centered design has been established as a primary concern of software development, and that really was not the case back then.
Take something like error handling in a form. In a lot of average software, it was not at all uncommon for a form to just say "Error" when something went wrong (or just not submit). Or lose all form input after unsuccessful submission. Programmers were unironically confused about why people would not just enter correct information. People then wrote books about how to design form errors. Now, basically every web framework includes at least some form of validation and error handling by default(-ish), and most people would be seriously confused if they saw something like the above.
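To make the form example concrete, here is a minimal sketch of the contract that frameworks now tend to bake in: validate each field, collect a message per offending field, and hand the submitted values back so nothing the user typed is lost on re-render. The field names and rules are made up for illustration, and it is not tied to any particular framework.

    // Minimal sketch of per-field validation that preserves user input.
    // Field names and rules are hypothetical, for illustration only.

    interface SignupForm {
      email: string;
      age: string; // raw input, exactly as the user typed it
    }

    interface ValidationResult {
      values: SignupForm;                                 // echoed back so the form re-renders what was typed
      errors: Partial<Record<keyof SignupForm, string>>;  // one message per offending field
      ok: boolean;
    }

    function validate(values: SignupForm): ValidationResult {
      const errors: ValidationResult["errors"] = {};

      if (!/^[^@\s]+@[^@\s]+$/.test(values.email)) {
        errors.email = "Please enter a valid email address.";
      }
      const age = Number(values.age);
      if (!Number.isInteger(age) || age < 0) {
        errors.age = "Age must be a whole number.";
      }

      return { values, errors, ok: Object.keys(errors).length === 0 };
    }

    // On failed submission, re-render the form with `values` (nothing lost)
    // and the per-field `errors`, instead of a bare "Error" page.
    const result = validate({ email: "not-an-email", age: "abc" });
    console.log(result.ok);     // false
    console.log(result.errors); // { email: "...", age: "..." }

The point is just the shape of the result: errors per field plus the original values, rather than a blank slate and a generic "Error".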
If you find it easy to poke holes in this one, please consider the average across all the little things that go into not fucking up a form, which is still hard to get really right, but again I am describing something of an average expectation here.
I would pin this to two major developments:
1. Designers are increasingly everywhere. If you think "duh?", this is entirely not how software was made. Programmers, commanded by business people, made software.
2. Most programmers today are also designers, and I don't mean in the sense that they always were (designing the software), but as in "thinking about people using the product".
Again, this might feel like a comical thing to even say, but in most places programmers were just not expected to do anything to make the user's life simpler, unless explicitly told to. That was the designer's job. In fact, a lot of programmers considered it a holy duty to fight any feature that was merely a convenience, and were quite adamant that, surely, the user could simply be expected to suffer a little and work a bit harder to get things done, if that meant keeping the code base pristine.
I think your point 2 is absolutely on the nose here. It fits in with broader industry trends in testing and operations.
And perhaps that's where the OP's question originated from?
As we've watched the despecialization of our field in testing and ops, we've seen things improve as ideas are introduced more widely, while also seeing those ideas get mimicked and cargo-culted as they diffuse.
Maybe the coders who fought against testing mandates, devops, or design thinking were just insecurely admitting their own ignorance of these topics and asking for help in performing their new duties effectively?
One value of specialists is that the freedom that comes with specialization enables them to do their job more completely. Fred Brooks's surgical team could not be more relevant.
Mind you, I know this was probably never the case everywhere, looking for example at Apple, Google, or other shops that worked in a similar spirit. But as a mainstream phenomenon, you need look no further than the late 90s or early 2000s to find that average programmers at mid-tier companies harboured a mix of non-empathy and non-sympathy toward user frustration over a complicated interface, and a designer's call to do something about it was regularly met with arrogance, a sigh, or a frown.
Of course, this can also be credited to the fact that UI design for software was in a very different place in general.