I'm honestly not sure where all these Hacker News commenters with low-DPI displays are coming from - my former employers equipped all their software engineers with dual 4K displays nearly a decade ago.
One is hard-pressed to buy a developer-grade laptop with a sub-2K display these days, even in the Windows world, and >2K desktop displays have been cheap for a really long time.
I believe a lot of people are using 1080p monitors because they bought them a while ago and they still work fine. There are also plenty of lower-end 1080p monitors still being sold today.
> One is hard-pressed to buy a developer-grade laptop with a sub-2K display these days, even in the Windows world
I personally see a lot of 1080p screens on new gaming laptops too, and from what I see among my peers, lots of people buy those for work. When I sold my RTX 3060 laptop with a 1080p screen, most prospective buyers said they wanted it for professional work.
> I'm honestly not sure where all these Hacker News commenters with low-DPI displays are coming from
If anything, this is exactly the place where I'd expect a bunch of people to be rocking an older ThinkPad. :)
In part, though, that's not because all those users can't afford better than 1080p - some of them can. It's that esports players often use insanely high refresh rate monitors, 1080p at >300 Hz, and even the ones without such monitors still run 1080p, because driving up the frame rate drives down input latency.
Whether it matters is a bigger question: from 30 to 60 Hz I notice a huge difference; from 60 to 144 Hz at 4K I can't tell, but I'm old and don't play esports games.
I don't think this runs contrary to my original point. Nearly 50% of all users are running at greater-than-1080p resolutions, and presumably power users are overrepresented in that group (and it's certainly not just the ~2.5% of Mac users pushing the average up).
FWIW, I didn't mean to reply to you in an argumentative way. Just proposing an answer to this:
> I'm honestly not sure where all these Hacker News commenters with low-DPI displays are coming from
Basically, I still see 1080p fairly often on new setups and laptops, although 1440p and 4K are becoming more common on higher-end desktops. Then again, 1440p at 27" or 32" isn't really high DPI.
If you have 20/20 vision, a 27" display at 1440p (~110 DPI) has a visual acuity distance of about 79 cm - i.e., if you sit 79 cm or further from the screen, you can't resolve any extra detail from a higher resolution. High refresh rate 1440p IPS screens are very widely available at good prices, so it isn't that crazy that people choose them.
My phone and laptop have higher-DPI screens, of course, but I don't sit close enough to my desktop monitor for higher DPI to matter.
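For reference, a quick sketch of that acuity arithmetic, assuming the usual 1-arcminute resolution criterion for 20/20 vision (the function name and helper are mine; for a 27" 2560x1440 panel it comes out around 80 cm, or 79 cm if you round the density to 110 DPI):

```python
import math

def acuity_distance_cm(diagonal_in, px_w, px_h, arcmin=1.0):
    """Distance beyond which an eye resolving `arcmin` arcminutes
    can no longer distinguish individual pixels (simple geometric model)."""
    ppi = math.hypot(px_w, px_h) / diagonal_in     # pixels per inch
    pixel_pitch_mm = 25.4 / ppi                    # width of one pixel in mm
    theta = math.radians(arcmin / 60.0)            # threshold angle in radians
    return pixel_pitch_mm / math.tan(theta) / 10   # mm -> cm

print(round(acuity_distance_cm(27, 2560, 1440)))   # ~80 cm for a 27" 1440p panel
```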
That's a common misconception. That acuity calculation assumes you know nothing about the image; imagine applying it to arbitrary noise. When you have things like lines and edges, your eyes can pick out differences an order of magnitude finer.
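One way to put a rough number on "an order of magnitude finer" (my own illustration, not the parent's): vernier/hyperacuity thresholds are commonly cited in the single-digit arcsecond range, versus the ~60 arcseconds (1 arcminute) the standard acuity-distance calculation assumes. Feeding a hyperacuity-scale angle into the same geometric formula is a simplification, since hyperacuity is about detecting offsets along edges rather than resolving a grid, but it shows the scale of the gap:

```python
import math

pixel_pitch_mm = 25.4 / 109                  # ~27" 2560x1440 panel, ~109 PPI
for arcsec in (60, 8):                       # standard 20/20 vs. a hyperacuity-scale figure
    theta = math.radians(arcsec / 3600.0)    # threshold angle in radians
    dist_cm = pixel_pitch_mm / math.tan(theta) / 10
    print(f"{arcsec:>2} arcsec -> ~{dist_cm:.0f} cm")
# 60 arcsec -> ~80 cm; 8 arcsec -> ~600 cm: on that reading, aligned edges could
# in principle betray pixel-level structure from far beyond normal desk distance.
```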
It's always been a weird topic - the science appears to say one thing (due to the aforementioned misconception), and yet I have eyes and see a difference that the science says I shouldn't see.
Have you tested it in practice? High-DPI monitors make a very noticeable difference for text and user interfaces. That's the truth, even if the theory doesn't agree.