HP calculators, back in the day. I've heard stories (perhaps apocryphal) of one that was run over by a car and still worked. Even if that one story isn't true, they were solid.
All traffic lights, but especially the ones set up for rolling green lights. The rolling greens assume everyone is driving the speed limit (which no one does), so they can introduce more stop-and-go than if all the lights just switched at the same time! But traffic lights in general are optimized for the peak 20 minutes of traffic instead of the rest of the day. They really don't make sense outside of peak.
We visited some small town when I was a kid. They had rolling green lights. Their signs didn't say "speed limit 25". Instead they said "lights are synchronized for 25". Once you figured out that they were telling the truth, you drove 25.
I'd be highly surprised if rolling green lights are implemented the exact same way across geographies. Does anyone have more details on how this works? It has been an area of curiosity for me for a long time.
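One common scheme (a "green wave") is to offset each signal's green phase by the travel time from the previous intersection at the design speed. Here's a minimal sketch; the function name, spacing, and numbers are made-up illustrations, not data from any real signal plan:

```python
# Hypothetical sketch of a "green wave": each signal's green onset is
# delayed by the travel time from the previous intersection at the
# design speed, so a driver holding that speed hits every light green.

def green_wave_offsets(distances_ft, design_speed_mph):
    """Return each intersection's green-onset offset in seconds,
    measured from the first signal turning green."""
    speed_fps = design_speed_mph * 5280 / 3600  # mph -> feet per second
    offsets, position = [0.0], 0.0
    for d in distances_ft:
        position += d
        offsets.append(position / speed_fps)
    return offsets

# Four intersections spaced 660 ft (1/8 mile) apart, synchronized for 25 mph:
offsets = green_wave_offsets([660, 660, 660], design_speed_mph=25)
print([round(t, 1) for t in offsets])  # -> [0.0, 18.0, 36.0, 54.0]
```

This also shows why driving faster than the design speed doesn't help: you just arrive at the next light before its offset has elapsed and sit at a red.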
An IT ticketing system that had a disaster recovery solution inclusive of DB failover, load-balancing, etc. all to another data center in another time zone.
Some medium-critical business systems didn't have that, so if a disaster struck, that stupid ticketing system would fail over and be up. I guess so they could put tickets in for all the other systems that didn't make it?
I think this was borne out of one of those "use it or lose it" budget scenarios that happens in big companies at the end of a fiscal year.
My personal and not-in-the-weeds take is that k8s started from some base assumptions that turned out to be incorrect, and things became more convoluted because of it. Originally, state was left out of K8s because state is hard and antithetical to its goals. Then enough users convinced the project that state was here to stay, and the complexity kept growing. People realized they needed variables, e.g. so a port number could be shared across k8s files, and so templating was introduced. But that wasn't good enough, so another layer of abstraction was added, then another, etc. As more surface area was introduced, the maintainers didn't want to appear to be making too much churn, so instead of lots of major and minor version bumps, they introduced beta features that they could break without feeling like they'd broken backwards compatibility, even though doing so still breaks integrations that use those beta features (the gRPC package in Go did this too). In summary, I don't think it is over-engineered per se; I think it will be a stepping stone to whatever comes next, something that solves for these things in its original design instead of having them tacked on.
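The "shared port number" problem above can be sketched in a few lines. This is an illustration of the general idea behind manifest templating, not actual Helm or Kustomize; the manifest contents and values are hypothetical:

```python
# Illustrative sketch of why templating crept into k8s workflows:
# the same value (here a port number) must agree across otherwise
# independent manifests, so it gets hoisted into a shared values map.
from string import Template

deployment_tmpl = Template("""\
apiVersion: apps/v1
kind: Deployment
metadata:
  name: $app
spec:
  template:
    spec:
      containers:
      - name: $app
        ports:
        - containerPort: $port
""")

service_tmpl = Template("""\
apiVersion: v1
kind: Service
metadata:
  name: $app
spec:
  ports:
  - port: $port
    targetPort: $port
""")

# One set of values rendered into every manifest keeps them in sync:
# change "port" in one place and both rendered files agree.
values = {"app": "web", "port": 8080}
deployment = deployment_tmpl.substitute(values)
service = service_tmpl.substitute(values)
print("containerPort: 8080" in deployment)  # -> True
print("targetPort: 8080" in service)        # -> True
```

Each further layer (values files layered on values files, overlays, operators) repeats this move one level up, which is the convolution the comment describes.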
My lecturers used to talk about the over-engineering of chemical plants in the USSR. It was something to do with the command economy that incentivized the engineers to over-engineer the plants.
The US government (might apply to others). They do so much paperwork and analysis to ensure no nickel is misspent. The cost to protect a nickel seems to be about a dollar.
> They do so much paperwork and analysis to ensure no nickel is misspent.
Lol. They aren't making sure no nickel is misspent; they're making sure all of their allotted money is spent. If a government organization doesn't spend all of its money one fiscal year, it won't get as much the next year. In government organizations, growing the budget seems to be the main goal.
https://www.youtube.com/watch?v=_Cp-BGQfpHQ