More and more I'm noticing this narrative gaining popularity in the programming world: "if I write awful code with obvious mistakes and somehow nobody else notices it in code review, it's never our fault; the language and its type system or safety features should've stopped me." I really have to ask: what happened to programmers who actually knew what they were doing, instead of expecting the computer to tell them what to do?
Because people are realising how much good type systems help prevent exactly those errors? The well was poisoned for a while by terrible type systems that either provided marginal safety or were just horrible to work with (e.g. a complete lack of inference), but that is now seriously changing.
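A minimal sketch of what that looks like in practice, assuming TypeScript with strict checks enabled (the type and function names here are illustrative, not from any real codebase):

```typescript
// The kind of "obvious mistake" a modern type system stops before review:
// forgetting to handle a case of a discriminated union.
type PaymentStatus =
  | { kind: "paid"; amount: number }
  | { kind: "refunded"; amount: number }
  | { kind: "pending" };

function settledAmount(status: PaymentStatus): number {
  switch (status.kind) {
    case "paid":
      return status.amount;
    case "refunded":
      return -status.amount;
    case "pending":
      return 0;
  }
  // Delete the "pending" case and tsc (strict mode) rejects the function,
  // because not all code paths return a number -- no reviewer needed.
}
```

And thanks to inference, you get this without annotating every local variable, which is what made older type systems so painful.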
Enough of the "hard problems" have been solved that developers are now expected to use off-the-shelf solutions. As a result, they're not getting hands-on experience solving hard problems or challenging each other to think more deeply about their solutions.
Software systems have been able to grow mostly because developers can delegate a lot of diligence to tools.
Of course we could require developers to "know what they are doing"; many work environments do. However, you won't see many posts about that: first, because it doesn't scale, and second, because it doesn't make for interesting reading.
I learned never to use auto-increment IDs in anything user-facing, especially not URLs, where they leak DB info, etc. However, I've seen younger devs do exactly that many times because they're learning from bad online books or tutorials. Every generation of devs needs to relearn the same lessons.
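For anyone who hasn't run into this yet, a minimal sketch of the usual alternative (TypeScript; the field names and query are made up for illustration):

```typescript
import { randomBytes } from "node:crypto";

// Keep the auto-increment key internal and expose a random, opaque
// identifier in URLs instead.
interface User {
  id: number;        // internal auto-increment primary key; never leaves the DB layer
  publicId: string;  // what appears in URLs, e.g. /users/9f3ac1d2...
}

function newPublicId(): string {
  // 16 random bytes -> 32 hex chars; reveals nothing about row counts,
  // signup order, or how fast the table is growing.
  return randomBytes(16).toString("hex");
}

// Lookups go through the opaque id, so enumerating /users/41, /users/42, ...
// tells an attacker nothing:
//   SELECT * FROM users WHERE public_id = $1
```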
I also suspect that many devs today are learning platforms first and software skills last. It was the reverse for many devs who came up building things the hard way and only later used platform tooling to simplify. Newer devs look to the tooling to provide the guardrails for core skills.