There was a time, in the dark dark ages of history, 10 years ago or so, when I would encounter issues during the course of my work and could fairly confidently assume I was doing something wrong, or just hadn’t read the manual correctly.
Contrast that with now, when I regularly write code, or pull in a library to do a certain thing, and find it just does not work. Following my historical logic, I’ve spent a day trying to figure out what I did wrong, only to discover it was a bug, or an edge case someone just hadn’t covered (never mind that my life seems to consist of edge cases now), or that the documentation is just plain out of date (or nonexistent).
Is this a trend? And does it have to do with the average quality of code, or with me? Have you experienced something similar? It’s driving me nuts.
I want to rely on other people’s code, but more and more I find it’s safer to just assume it’ll be broken from the start.