Hacker News

Considering how many hours a day he likely uses his machine, it may work out to something like the difference between a few cents and 10 cents an hour.



That's only going to be true if the $4000 extra they spent is used for 40,000+ hours. It's likely closer to a dollar an hour more for slightly crisper text. (Assuming they use it full time for 2 years.)
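The arithmetic here can be sketched quickly (assuming a $4000 premium and roughly 2,000 working hours per year, which are the figures implied above):

```python
# Amortized hourly cost of a hardware premium (illustrative figures).
premium = 4000          # extra dollars spent on the monitor/GPU
hours_per_year = 2000   # roughly full-time use
years = 2

cost_per_hour = premium / (hours_per_year * years)
print(f"${cost_per_hour:.2f}/hour")  # $1.00/hour over 2 years
```

Stretch the same premium over 40,000 hours instead and it drops to 10 cents an hour, which is where the parent's figure comes from.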


The monitor/GPU have a lot longer than a 2-year lifespan. Depreciation is likely less than 50% in 2 years.

And he didn't pay for slightly crisper text, he paid for higher productivity. At $100/hour, that 25 to 50 cents an hour is trivial to make up. Which is why devs should never, ever skimp on hardware.
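To make the break-even claim concrete (using the $100/hour rate and the 25-to-50-cent hourly hardware cost mentioned above; all figures illustrative):

```python
# Productivity gain needed for the hardware premium to pay for itself.
hourly_rate = 100.0  # assumed billable/salary rate, $/hour

for hardware_cost_per_hour in (0.25, 0.50):
    # Fractional productivity gain that exactly covers the hardware cost.
    gain = hardware_cost_per_hour / hourly_rate
    print(f"${hardware_cost_per_hour:.2f}/hour -> {gain:.2%} gain breaks even")
```

A quarter to half a percent productivity improvement covers the cost, which is the sense in which it is "trivial to make up."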


Okay, first of all let me say that I don't want to pooh-pooh anyone's choice of environment. If it "sparks joy" for hours every day, it's probably a good personal investment.

On the other hand, it does mean additional resources are consumed, both in producing the items -- monitors/GPUs indeed have a long lifespan, which means the previous ones would still work -- and in increased energy use per hour. So the claim that no $ spent on dev hardware can ever be wasted doesn't quite hold...

And whether it will actually result in increased productivity is a good question. Readability, eye strain, and enjoyment all factor in. There are plenty of studies that focus on only one aspect, so it's easy to cherry-pick a conclusion...

Again, I don't believe this is more wasteful than a few spa days or a vacation, so good for the OP. My initial post was about how the math works out for them (or for others chiming in). I find myself not too affected by this sort of thing; my last big jump was from a 21-inch CRT to a 24-inch Dell, especially when it comes to the simple shapes of monochrome fonts.



