> Is virtually every single company and developer wrong?
Virtually every company and developer opts for simpler development and worse runtimes. Is that "wrong"? I'm not sure how to answer that. You could probably make a case either way.
Exactly, we have mediocre software for the same reason that we have chinesium for physical products - because everyone opts for the cheapest option as long as it is kind of good enough. Makes sense for the individuals involved perhaps, but is a giant loss for society as a whole.
The average consumer running electron apps on a MacBook is using far less power than the average HN user with a 1000w home lab made out of salvaged obsolete parts.
Not to mention half the users here rally against EVs and public transport, and probably own an F150. The environment only matters when we are talking about a 1 W savings with Electron vs Qt.
> The average consumer running electron apps on a MacBook is using far less power than the average HN user with a 1000w home lab made out of salvaged obsolete parts.
Both of these would be using less power with more efficient software.
But no, let's excuse shitty practices whose fix could save a lot (across the entire user base) for relatively little effort, just because it wouldn't solve everything.
Not to disagree, but the bigger factor for environmental gains is likely backend languages. Writing servers in languages that are literally 40 times slower in request throughput (Python with Django, Ruby on Rails, etc.) directly translates to more server instances behind a load balancer, which can easily mean 10 times the electricity usage.
That, imo, is totally unacceptable (and financially suboptimal, too).
I've had people tell me that the "developer productivity gains" of using Python totally justify the ~10x electricity usage plus ~10x hosting/cloud costs. (Yes, developers cost a lot, but the claim that Python and other slow languages deliver so much higher developer productivity is questionable -- especially compared to languages like Kotlin, Go, etc.)
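A minimal back-of-envelope sketch of that claim, where every number (requests per second, watt figures, VM sizes) is a made-up assumption for illustration, not a measurement: a 40x throughput gap forces roughly 40x the instance count, and even if the slower fleet runs on smaller, lower-power VMs, electricity still ends up around an order of magnitude higher.

```python
import math

# Hypothetical peak traffic the service must absorb.
PEAK_LOAD_RPS = 20_000

# Assumed per-instance throughput: the "40x slower" claim above.
FAST_RPS_PER_INSTANCE = 4_000   # e.g. Go/Kotlin service (assumption)
SLOW_RPS_PER_INSTANCE = 100     # e.g. Django service (assumption)

def instances_needed(load_rps: int, per_instance_rps: int) -> int:
    """Instances behind the load balancer needed to absorb peak load."""
    return math.ceil(load_rps / per_instance_rps)

fast_instances = instances_needed(PEAK_LOAD_RPS, FAST_RPS_PER_INSTANCE)  # 5
slow_instances = instances_needed(PEAK_LOAD_RPS, SLOW_RPS_PER_INSTANCE)  # 200

# Assume the slow-language fleet runs on smaller VMs drawing less power
# each (40 W vs 160 W, hypothetical), which is why the electricity gap
# lands nearer 10x than the raw 40x throughput gap.
slow_watts = slow_instances * 40
fast_watts = fast_instances * 160

print(slow_instances / fast_instances)  # 40.0x the instance count
print(slow_watts / fast_watts)          # 10.0x the electricity
```

The point of the sketch is only that instance count scales with the throughput ratio; how much of that shows up as electricity depends on how the instances are provisioned.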
Are you sure that servers are really the bigger factor? Sure, it's easier to see how much power a data center consumes, but there are literally billions of client devices out there.
At least with backend software there is a financial incentive for efficiency. For client software there is no such incentive unless there is actual competition, of which there is often effectively none: users can't choose better software because interoperability isn't a thing.