
Time to Upgrade Your Monitor - neonbones
https://tonsky.me/blog/monitors/
======
the_af
The technical details are all right (or seem right to me, anyway), but this is
too opinionated for my liking. No, you do not _need_ a 4K monitor for software
development. Some people might like them, some won't. [edit/clarification:
someone rightfully pointed out that nobody will actively _dislike_ a 4K
monitor. I was unclear here: I meant "some people won't need them" more than
"dislike them"]

This sounds like when Jeff Atwood started that fad that if you didn't have
three external monitors (yes, _three_ ) then your setup was suboptimal and you
should be ashamed of yourself.

No. Just no. The best developers I've known wrote code on tiny laptops with
poor 1366x768 displays. They didn't think it was an impediment. Now I'm typing
this on one of these displays, and it's terrible and I hate it (I usually use
an external 1080p monitor), but it's also no big deal.

A 1080p monitor is enough for me. I don't need a 4K monitor. I like how it
renders the font. We can argue all day about clear font rendering techniques
and whatnot, but if it looks good enough for me and many others, why bother?

~~~
thomaslord
Hello! Person who actively dislikes 4k here. In my experience:

1. No matter what operating system you're on, you'll eventually run into an
application that doesn't render in high dpi mode. Depending on the OS, that can
mean it renders tiny, or that the whole thing is super ugly and pixelated
(WAY worse than on a native 1080p display).

2. If the 4k screen is on your laptop, good luck ever having a decent
experience plugging in a 1080p monitor. Also good luck having anyone's random
spare monitor be 4k.

3. Configuring my preferred Linux environment to work with 4k is either
impossible or just super time-consuming. I use i3, and it adds way more
productivity to my workflow than "my fonts are almost imperceptibly sharper"
ever could.

My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels
than true 1080p, but in the form of screen real estate rather than improved
density. I also have 20/20 vision as of the last time I was tested.

My argument in favor of 1080p is that I find text to just be... completely
readable. At various sizes, in various fonts, whatever syntax highlighting
colors you want to use. Can you see the pixels in the font on my 24" 1080p
monitor if you put your face 3" from the screen? Absolutely. Do I notice them
day to day? Absolutely not.

I genuinely think 4k provides no real benefit to me as a developer unless the
screen is 27" or higher, because increased pixel density just isn't required.
If more pixels meant slightly higher density but also came with more usable
screen real estate, that'd be what made the difference for me.

~~~
ogre_codes
> 1. No matter what operating system you're on, you'll eventually run into an
> application that doesn't render in high dpi mode

You lost me right here on line 1.

If there are apps on MacOS that can't handle high dpi mode, I haven't run into
them as a developer (or doing photo editing, video editing, plus whatever
other hobbies I do). Also, I don't have any trouble with plugging my highDPI
MacBook into a crappy 1080p display at work.

> 3. Configuring my preferred Linux environment to work with 4k is either
> impossible or just super time-consuming.

Things like this are exactly why I left Linux for MacOS. _I absolutely get why
you might want to stick with Linux_ , but this is a Linux + HighDPI issue
(maybe a Windows + highDPI issue also), not a general case.

> I genuinely think 4k provides no real benefit to me as a developer unless
> the screen is 27" or higher, because increased pixel density just isn't
> required.

You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got
by fine with 72dpi. It's all about ergonomics as far as I'm concerned.

~~~
nvarsj
> Also, I don't have any trouble with plugging my highDPI MacBook into a
> crappy 1080p display at work.

Low DPI monitors are pretty much unusable since MacOS dropped subpixel
rendering, with fonts becoming a blurry mess. You can only really use MacOS
with high DPI monitors now for all-day work. It’s a huge problem for everyone I
know who wants to plug their MacBook into a normal DPI display. Not that the
subpixel/hinting was ever that good - Linux has always had much better font
rendering, in my opinion, across a wider range of displays.

~~~
thruhiker
This is an interesting position. I had always thought that fonts and font
rendering were an especially pernicious issue on Linux and a relative joy on
MacOS?

~~~
nvarsj
I think that is a historical artifact. Ubuntu had a set of patches for
freetype called Infinality, developed around mid-2010, which dramatically
improved font rendering. Since then, most of those improvements have been
adopted and improved upstream. [1] Any relatively modern Linux desktop
should have very good font rendering.

[1]: https://www.freetype.org/freetype2/docs/subpixel-hinting.html

~~~
sukilot
Except for the hidpi inconsistency across apps and window managers

------
Waterluvian
I've found my journey through tooling evolution to be fascinating (to me at
least).

When I was new and coding was slow, my worries were about so many other things
(like how the language even worked) than even what text editor I used, let
alone the monitor.

As I became experienced, a lot of newbie problems went away and the next
topmost problems emerged, leading to obsessing over text editors, shortcuts,
and even a multi-monitor setup with a vertical monitor.

Then I went through a minimalism phase where I was annoyed by how much time I
was spending maintaining my tools rather than using them, so out went almost
all of that. Just one giant monitor and VSCode, for better or worse (mainly
because it does all six languages I use well enough).

I'm now at a phase of thinking, "who are all these people who do so much
coding in a day that these things matter?" because reading and writing code is
maybe... 35% of my job now? What I need to optimize for is reading/writing
human text. And that's where I currently am: figuring out how to optimize
writing documentation/design/architecture docs, given how awful making and
maintaining a sequence or block diagram is currently.

My conclusion so far is that I expect my needs to continue to mutate. I do not
believe they are "converging" and do not believe there is any sort of golden
setup I will one day discover.

~~~
asdff
Some of my colleagues love a vertical monitor for writing and reading
documents.

~~~
Waterluvian
I loved it too on a mac. When I switched companies and went to an Ubuntu-based
laptop, trying to keep the whole monitor scenario working consistently was
frustrating, so I dropped it.

~~~
andrewzah
To echo the sibling comment: `arandr` makes it easy to save monitor
configurations (as a script). Then you can run that script on startup, e.g.
for Xorg I put it in my ~/.xinitrc.

------
chrismorgan
Concerning the smoothing, I presume this is macOS’s glyph dilation. pcwalton
reverse engineered it in order to replicate it in Pathfinder (so that it can
render text exactly like Core Text does), concluding that the glyph dilation
is “min(vec2(0.3px), vec2(0.015125, 0.0121) * S) where S is the font size in
px”
([https://twitter.com/pcwalton/status/918991457532354560](https://twitter.com/pcwalton/status/918991457532354560)).
Fun times.
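
Plugging numbers into that formula shows how the dilation scales. A minimal
Python sketch of the quoted constants (these are pcwalton's reverse-engineered
values, not anything Apple documents):

    def glyph_dilation(font_size_px):
        """Per-axis glyph dilation in px, per the formula quoted above."""
        return (min(0.3, 0.015125 * font_size_px),
                min(0.3, 0.0121 * font_size_px))

    print(glyph_dilation(16.0))  # ~(0.242, 0.194) px for a 16 px font
    # The 0.3 px cap kicks in around 20 px (x) and 25 px (y).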

Trouble is, people then get _used_ to macOS’s font rendering, so that (a) they
don’t want to turn it off, because it’s different from what they’re used to,
and (b) start designing for it. I’m convinced this is a large part of the
reason why people deploy so many websites with body text at weight 300 rather
than 400: macOS makes that tolerable by rendering it bolder. Meanwhile, people
using other operating systems that obey what the font says are left with
unpleasantly thin text.

~~~
torb-xyz
I tried turning off smoothing and frankly the text was uncomfortably thin to
read. I guess maybe their font should just be bolder by default so you don't
need the glyph dilation to compensate.

~~~
chrismorgan
I didn’t actually know you _could_ turn it off before today! I’ve never had a
Mac, I’m used to FreeType and Windows font rendering.

I have a vague feeling that Macs also have somewhat different gamma treatment
from other platforms, which would be likely to contribute to thickness
perception.

------
orestis
Do all that, of course. But when you’re writing applications for other people,
go buy a shitty middle-of-the-pack laptop with a 1280x768 screen, with the
Windows taskbar and menu bars eating like 150 pixels of that, and try to use
your software there.

~~~
godelski
While we're on it, make sure this is a low end laptop in terms of components
too. Make sure your application is usable. No one needs a chat client that
eats several gigs of RAM and a significant portion of your CPU.

~~~
disposedtrolley
Yeah, more teams need to do this and consider performance on low-end devices
to be a feature.

When Android KitKat was under development, the team used nerfed Nexus 4's to
simulate a low-end Android phone.

[https://en.wikipedia.org/wiki/Android_KitKat#Development](https://en.wikipedia.org/wiki/Android_KitKat#Development)

------
RHSeeger
> A good monitor is essential for that. Not nice to have. A MUST. And in
> “good” I mean, as good as you can get.

The whole discussion starts from what I would say is an incorrect assertion,
and then goes on to describe lots of ways that letters can be made better. If,
in fact, you don't need better, then a lot of the points go away.

1. There is a point of diminishing returns, past which you are spending a lot
more money for very little benefit.

2. There exists a point beyond which "better letters" are unlikely to
contribute much to daily work.

Both of those points are, to some extent, the same point. But either way, the
idea that you "MUST get as good of a monitor as you can" is, in my opinion,
untrue and not worth basing an entire document on.

I'd rather see a discussion of which features of a monitor contribute the most
(per $) to how well they function for daily work. For gaming, I want high
refresh rate and high contrast. For TV, the contrast (real black) goes up in
importance. For daily work, neither of those is a huge contributor to how well
I can work.

~~~
sydd
While technically the article is correct, it feels very short-sighted to me
in placing resolution/refresh rate above everything else. Some more things to
consider:

- Price. If you are not a programmer in a first-world country working for a
magic-money startup/big 4, then you won't spend $1000+ on a single monitor.
Mine was a decent 27" 1920x1080 one for $300 and I'm happy with it. I'm sure
there are better ones, but I can't allow myself to spend much more on one.

- No PWM. This is crucial. PWMing monitors kill your eyes in hours, and then
you won't be able to work for the rest of the day. Some people I know take the
eye-protection game so far that they buy e-ink screens for development (sadly
you can't buy large ones).

- Adjustable, and optionally rotatable by 90 degrees. If you've got 2 monitors,
one rotated 90 degrees is a blessing for reading code.

- Small bezels. Quite important if you've got 2 or more; large bezels make a
setup clumsy.

- Bright. It's quite annoying if I can't read something because the sun shines
into my room. Especially important for laptops.

None of these are mentioned, just that small letters should look nice and
scrolling should be smooth.

~~~
esperent
I think PWM is fairly rare these days. I have two dirt-cheap AOC 24" 1080p
monitors (about $150 each) and they don't use PWM.

~~~
boplicity
MacBook Pros use PWM, which is why I had to return the latest 16-inch model. I
was really bummed, but I need to protect my eyes. It was causing severe eye
strain.

------
coreai
After using a rMBP for 6 years, I realized that using a lower-resolution,
lower-quality display makes absolutely no sense at all if both graphical power
and budget are available (the first 13" rMBP had some serious issues driving
the display). A better quality image is a better quality image.

I think Apple's biggest selling point over any vendor right now, despite
numerous issues with its hardware and software in the recent past, is
absolutely top-class input and output. It is such a simple concept. A great
keyboard (seems to be fixed now) and an absolutely incredible trackpad
experience, along with a display that is a huge step up from your past
experience, mean that most users will prefer that setup even if they just use
it for basic coding or web browsing. After looking at the first retina
displays, I realized that Apple didn't just change the displays; it completely
changed how fonts behaved, because crisp and clear legibility was key to
attracting customers early on. I'd say even in 2020 most computers are
struggling with good displays, which can completely ruin the experience of
using a product even if every other aspect of it is great.

~~~
bredren
I've been focused on Apple display products for the past few months as I'm
looking to make an upgrade from the Dell P2715Q 4k 27" I use primarily for
development.

There is a three page thread on using the Apple 32" XDR for software dev on
MacRumors. [1]

I believe there is a major product gap at Apple right now in the mid-market
display. Specifically a replacement for the LED 27" Cinema Display which was
announced 10 years ago next month. [2]

I am speculating that Apple could announce a new 27" 5k, in part because of
the rumored announcement of a new Mac Pro, but also because the build quality
of the LG 5k UltraFine is just not great and there are obvious synergies with
XDR production and the Mac Pro.

I think this should be announced at WWDC because developers specifically are
being left out of good display products, and Apple should be looking out for
what they stare at all day.

While there are no supply chain rumors of such a display, I wargamed what this
product might be and its pricing anyway.[3]

In short, I speculate Apple will release a 27 inch IPS display, 5120 x 2880 @
60Hz with standard glass at $1999, Nano-texture Glass at $2799.

I had not paid a lot of attention to the refresh rate, but it does seem like
kind of a miss that the XDR does not offer this.

---

[1] https://forums.macrumors.com/threads/xdr-for-software-dev.2220838/

[2] https://www.apple.com/newsroom/2010/07/27Apple-Unveils-New-27-inch-LED-Cinema-Display/

[3] https://forums.macrumors.com/threads/wishful-thinking-wwdc-debuts-new-apple-mid-market-display.2235940/post-28555359

~~~
rootusrootus
> 27 inch IPS display, 5120 x 2880 @ 60Hz with standard glass at $1999

That's $200 more than it costs with the computer built in. I would rather see
it more aggressively priced, or at least have Target Display Mode brought back.

~~~
bredren
This is an important point. I have a pal with an aging 5k iMac that he wishes
he could use as just a monitor with a new Mac mini.

It seems the display tech in the iMac is severely discounted in order to sell
the whole thing. And it does break the pricing logic unless you see them as
different products offering value for different configurations.

Bringing back target mode, or allowing the iMac to act as an external monitor
would greatly increase the value of that product.

I can't explain exactly how this pricing makes sense, except to assume that
Apple will want or need to price this stuff high to help differentiate its
build quality from the collaboration on LG's UltraFine.

~~~
freeqaz
I bet somebody could make a business refurbishing those old 5K iMacs into
usable monitors. Either yank the panel into a new case w/ a custom board to
drive the display, or hack an input into the existing system (maybe with a
minimal booting OS).

Either way, that'd be a cool product to see and seems like a decent side
hustle to get some $. :)

~~~
udp
I recently bought a 2015 27” 5K (5120x2880) iMac core i5 for cheap, put in an
SSD, and upped the RAM to 24 GB. It handles everything I can throw at it as a
developer (albeit not a game developer, just “full stack” web/java/node/k8s)
and the screen is just incredible.

The SSD upgrade is not for the faint hearted, however. I used the iFixit kit
with the provided “tools”, and lifting off 27” of extremely expensive and
heavy glass that sounded like it might crack at any moment was not exactly
fun. Having said that - I would do it again in a heartbeat to get another
computer/display like this for the price I paid.

With regards to scaling: I have had zero problems with either my rMBP (since
2013) or this iMac, when in macOS running at high resolution. Zero. As soon as
I boot Windows or Linux, however, it’s basically unusable unless I turn the
resolution down.

------
speeder
I still use a CRT, because the features I want are still ludicrously expensive
in newer tech (although they are getting cheaper over time).

1. Arbitrary resolutions, great to run old software and games, even better to
run new games at lower resolution to increase performance.

2. Arbitrary refresh rates.

3. Zero (literally) response time.

4. Awesome color range (many modern screens still are 12bit, meanwhile
silicon graphics had a CRT screen in 1995 that was 48bit)

5. No need to fiddle with contrast and brightness all the time when I switch
between mostly light and mostly dark content. For example, I gave up on my
last attempt to use a flat panel because I couldn't play Witcher 3 and SuperHot
one after the other; whenever I adjusted settings to make one game playable,
the other became just splotches (the optimal settings for Witcher 3 made
SuperHot an almost flat white screen, completely impossible to play).

6. For me, reading raster fonts on a CRT gives much less eyestrain and is more
beautiful than many fonts that need subpixel antialiasing on flat panels.

7. Those things are crazy resilient, I still have some working screens from
80286 era (granted, the colors are getting weird now with aging phosphors),
while some of my new flatpanels failed within 2 years with no repair possible.

~~~
gruez
>many modern screens still are 12bit

Unless you're using a really cheap screen, most LCDs are 8 bits per color. The
cheap ones use 6 bits + 2 bits FRC. 12 bits (presumably 4 bits per color)
seems insanely low because it would mean only 16 possible shades per primary
color.
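
For reference, here's the arithmetic on bits per channel vs. shades (a quick
Python check, nothing more):

    # Levels per channel and total colors for common panel depths.
    for bits in (4, 6, 8, 10, 12, 16):
        levels = 2 ** bits
        print(f"{bits:2d} bits/channel = {levels} levels/primary, "
              f"{levels ** 3:,} total colors")
    # 4 bits/channel really does give only 16 shades per primary,
    # which is why "12bit" as a total depth would be insanely low.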

>meanwhile silicon graphics had a CRT screen in 1995 that was 48bit

Source for this? 10 bit color support in graphics cards only became available
around 2015. I find it hard to believe that there was 48 bit color (presumably
16 bits per color) back in 1995, when 256 color monitors were common. Moreover,
is there even a point to using 16 bits per color? We've only switched to 10
bit because of HDR.

>7. Those things are crazy resilient, I still have some working screens from
80286 era (granted, the colors are getting weird now with aging phosphors),
while some of my new flatpanels failed within 2 years with no repair possible.

This sounds like standard bathtub curve/survivor bias to me.

~~~
Dylan16807
> 10 bit color support in graphics cards only became available around 2015.

That's off by a decade.

> 256 color monitors

Is that a thing that exists?

> We've only switched to 10 bit because of HDR.

You can get clear banding on an 8 bit output, and 10 bit displays are used at
the high end. 10-bit HDR isn't immune to banding, since most of the increased
coding space goes into expanding the range. There's a good reason for 12 bit
HDR to exist.

~~~
gruez
>That's off by a decade.

Mainstream support, at least. 10 bit HDR support for AMD cards was introduced
with Fiji (2015), and Nvidia's with Maxwell (2014).

~~~
tomaskafka
2002. I remember this. Matrox was one of the major GPU players at the time.
[https://en.wikipedia.org/wiki/Matrox_Parhelia](https://en.wikipedia.org/wiki/Matrox_Parhelia)

~~~
gruez
So it looks like it supported 10 bit _frame buffers_ , but not necessarily 10
bit output. A quick search suggests it only supported DVI, which didn't
support 10 bit output. In the context of talking about monitors, this
essentially means that 10 bit wasn't supported at the time. Otherwise you
could claim you have 192 bit color by software rendering at 64 bits per color,
but outputting at 8 bits.

------
Slartie
Regarding the "120Hz dance", which sure is ridiculous, the author could
probably give the nice little tool "SwitchResX" a try. I adore that piece of
software because it allowed me to force my MacBook to properly supply my Dell
U2711, which is a 2560x1440 display, with an actual 2560x1440x60Hz signal over
HDMI (which was important to me because I needed the DP port for something
else).

That older monitor has some kind of firmware bug or maybe it's a wrong profile
in MacOS or whatever, which makes it advertise a maximum of 2560x1440x30Hz or
1920x1080x60Hz to the Mac when connected via HDMI (DP works fine out-of-the-
box), effectively preventing me from choosing native resolution at maximum
refresh rate. I haven't been able to make MacOS override this limitation in
any way using the common OS-native tricks, but SwitchResX can somehow force
any custom resolution and refresh rate to be used, and the monitor is
apparently able to deal with it just fine, so I've been running this setup for
years now with no complaints whatsoever.

Also no manual work was ever needed after display disconnect/reconnect or
MacOS reboot. I had problems once after a MacOS update, which required a
SwitchResX update for it to be working again, but other than that I'm in love
with this nifty low-level tool.

~~~
mthoms
I had a similar problem with an older MacBook that would only (officially)
power my 4k display at 30Hz. SwitchResX was the magic solution to bring it up
to 60Hz.

------
JohnBooty
I found this article utterly baffling. The author clearly knows their stuff,
having created Fira Code, but my experiences couldn't be more different.

    
    
        I spend most of my days in a text browser, text 
        editor and text terminal, looking at barely moving 
        letters.
    
        So I optimize my setup to showing really, really 
        good letters.
    

I certainly appreciate how _nice_ text looks on a high DPI display.

But for coding purposes!? I don't find high DPI text more legible... unless
we're talking _really_ tiny font sizes, smaller than just about anybody would
ever use for coding.

And there's a big "screen real estate" penalty with high DPI displays and 2X
scaling. As the author notes, this leaves you with something like 1440×900 or
1920x1080 of logical real estate. Neither of which is _remotely_ suitable for
development work IMO.

    
    
        But at least you can enjoy that gorgeous screen and 
        pixel-crisp fonts. Otherwise, why would you buy a retina 
        screen at all?
    

It's not like you really have the option on Macs and many other higher-end
laptops these days. And I am buying a computer to do some work, not admire the
beautiful curves of a nice crisp font.

So anyway, for me, I chose the Dell 3818. 38", 3840 x 1600 of glorious, low-
resolution text. A coder's paradise.

For purposes of software development, I won't really be interested until 8K
displays become affordable and feasible. As the author notes, they integer
scale perfectly to 2560×1440. Now that would rock.

~~~
ascar
Well, it's really about personal preference, and while I know a couple of
colleagues with your preference, I feel it's the minority.

I'd take two 16:9 screens over one 3840x1600 screen anytime. No need to set up
some additional inner-monitor split-window management. Split-monitor management
and workspaces work very well, and I can even turn the 2nd monitor off when I
don't need it and want full focus mode (i.e. reading). Also, I prefer my main
monitor exactly in the middle in front of me and an actual 2nd monitor to the
right. If I have the luxury of a 3rd monitor, it's to the left (usually the
laptop that powers the 2 big screens). Setting one half of an ultra-wide in
the middle just feels wrong. And splitting in 3 is too small for me, and it
again raises the inner-monitor window management issue.

While I also strongly believed my old 1920x1080 24" (~92ppi) screens were
good, I had the opportunity to use QHD 27" (~110ppi) screens for 3 months
abroad, and when I got home I was baffled by how incredibly bad text looked on
my old 24" monitors.

~~~
slg
The benefit of a 3840x1600 screen over two monitors with lower resolution is
that 3840/2 = 1920 and 3840/3 = 1280. Those are horizontal resolutions for
1080p and 720p respectively. The fact that these are standard resolutions
means basically every app works as designed regardless of whether you have 2
or 3 windows side by side. This isn't true for ultrawides with resolutions
like 3440x1440 that don't divide cleanly into other standard resolutions.
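
The arithmetic (a trivial Python check) shows why the ultrawide widths matter:

    # How common ultrawide widths divide into side-by-side windows.
    for width in (3840, 3440):
        for cols in (2, 3):
            print(f"{width} / {cols} = {width / cols:g} px per window")
    # 3840 -> 1920 and 1280 (the 1080p and 720p widths);
    # 3440 -> 1720 and 1146.67, neither a standard width.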

The default software that comes with the Dell mentioned above handles
everything. If I want to simulate two 1080p monitors, I just do the standard
Windows drag to the side of the screen. If I want to simulate three 720p
monitors, I can press shift while dragging, and that tells the Dell software to
take over. It is more versatile than having individual monitors.

~~~
ascar
> The default software that comes with the Dell mentioned above handles
> everything.

Does it really handle everything? Does it handle all (or even just a single
one?) of the use cases I mentioned in my comment that I care about? Can I
turn off part of the screen if I actually only need a smaller space? Can I
position a 1920 default-width space right in the middle in front of me without
it looking weird (i.e. symmetry of screen hardware borders)? Do workspaces
work correctly (in Unix, Windows, macOS, all of them)? Just splitting
monitors isn't everything; I heavily use workspaces to switch between contexts.

If all of this works (except the obvious middle problem, which is physically
impossible to solve), I might actually consider it.

~~~
slg
>Does it really handle everything?

It does for my use case, but obviously your mileage may vary.

>Can I turn off part of the screen if I actually only need a smaller space?

Sort of. The monitor can split into dual source mode that has two 1920 sources
side by side. You could potentially turn that on and set one side to an empty
source. You can also always use a black desktop if you need the rest of the
monitor to be dark to allow you to focus on whatever window you have open.

>Can I position a 1920 default-width space right in the middle in front of me
without it looking weird (i.e. symmetry of screen hardware borders)?

How do you accomplish this with two monitors currently? You would have to
choose between symmetry and having one monitor front and center. The Dell
software allows you to customize the drag-and-drop regions. I use three
columns that are the full height of the screen. You could set it up to have a
1920 section in the middle with two 960 columns on each side. You could also
set up your physical workspace so one side of the monitor is centered in your
vision instead of the center of the monitor. Also, I have mine mounted on a
monitor arm that allows me to reposition it as needed.

> Do workspaces work correctly (in Unix, Windows, macOS, all of them)?

It works in Windows, and that is the only native GUI I use. Everything
obviously works fine in the terminal, and I rarely increase the resolution
of a VM past 1920. I would frankly be shocked if there wasn't similar software
available for Linux and OSX that allowed you to set up customized zones like
Dell's software if you need to run one of those natively.

------
_bxg1
Interesting info, but overly opinionated. I tried turning off font smoothing
just now but the lack of boldness made everything appear dimmer, and I had to
strain my eyes a bit when reading my code. Also I use my MacBook without a
separate screen, with no scaling, so I really have to disagree with this:

> This will make everything on the screen slightly bigger, leaving you
> (slightly!) less screen estate. This is expected. My opinion is, a notebook
> is a constrained environment by definition. Extra 15% won’t magically turn
> it into a huge comfortable desktop

For me a notebook is not a constrained environment, and screen scaling very
much makes the difference between "a huge comfortable desktop" and not.

------
dragosmocrii
This post is from the author of Fira Code, a font with ligatures that is
incredibly pleasant to use. I highly recommend giving it a try if you haven't
already!

Shameless plug: I mentioned this as my favorite font for code editors in my
blog post: https://dragoshmocrii.com/my-favorite-tools-resources-for-day-to-day-software-development/

~~~
rocky1138
Thanks for the heads-up. Personally, I'm not a huge ligature fan in my code as
my eyes tend to go directly to them instead of the code I want to read. It's
great that Fira Code comes with a non-ligature version.

~~~
dragosmocrii
You're welcome! I can see how that could be a distraction for some.
Personally, I enjoy the aesthetics. And as a bonus, people always get
intrigued when I show them code in this font that they've never seen before.
Nonetheless, in my opinion it's still an awesome monospace font even
without the ligatures.

------
dhosek
I used to long for 200dpi monitors back when I was younger and we lived with
72dpi or less on-screen. Now my vision has deteriorated to the point where I
honestly can't see the difference, despite being a type nerd. I'm increasingly
using the zoom feature in MacOS to read smaller UI elements. I suspect my days
of driving my MacBook display at "more space" (2048x1280) instead of "default"
(1792x1120) are numbered.

------
mastercheif
It's frustrating that consumer Hi-DPI display development has completely
stalled in the last five years.

As mentioned in the article, macOS works best using true 2x integer scaling.
Using 2x scaling, a 4k monitor will result in a workspace equivalent to a
1080p display, which is unusable for productivity and development. A superior
option is an iMac/LG 5k display, which nets a workspace equivalent to 1440p.
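
The scaling arithmetic, as a quick Python sketch:

    # Logical workspace at 2x (integer) scaling for common panels.
    panels = {"4K": (3840, 2160), "5K": (5120, 2880), "8K": (7680, 4320)}
    for name, (w, h) in panels.items():
        print(f"{name}: {w}x{h} -> {w // 2}x{h // 2} logical at 2x")
    # 4K -> a 1920x1080 workspace; 5K -> 2560x1440, i.e. "1440p-sized".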

The only options for greater-than-4k displays on the market currently are the
27" LG 5K ($1200, 2014) and the Dell 8k ($3500, 2017). I'm convinced this is
due to the manufacturers focusing their attention on the high-margin gaming
market which has no need for resolutions higher than 4k.

I'm holding onto my 30" 2560x1600 Apple Cinema Display until I can get
something that offers me the equivalent amount of screen real estate at 2x
retina scaling.

~~~
aplummer
I use a Pro Display XDR for programming / design work; I think you forgot this
one. It seems like it would fit your needs (with the obviously premium price),
as it’s 32”.

It’s overkill for most, but still the best display I’ve ever used and I love
it. I actually tried an Asus 4K 120hz for much less but the fan noise was a
deal breaker even if the screen didn’t break on arrival (it did).

Disclosure: Apple employee

~~~
PStamatiou
+1, love my XDR for coding and designing: https://paulstamatiou.com/stuff-i-use/

~~~
bredren
Thanks for this. You don't comment on the 60 vs 120 Hz issue that is
highlighted in the OP's opinion piece. Any thoughts on the XDR's apparent lack
of support for 120?

~~~
PStamatiou
Color accuracy is far more important to me than 120Hz capability. The faster
displays today tend to be subpar at color accuracy, illumination uniformity,
etc.

------
gilbetron
Been developing software for 30+ years, with up to 2 extra monitors, and I
always go back to a single display. I'll put myself up against anybody with
multiple displays. The thing is, I just alt-tab (or otherwise) to different
things - it's not like I can actually perceive more than 1 window at a time,
for the most part. The head twisting to see multiple displays bugs me, as do
the eye movements across a huge display. Plus there are just more opportunities
for distractions with more display area.

Finally, I work most effectively when I hold what I need in my head, and find
that multiple displays fights against that need.

The only real exception to that is if I'm writing front end code of some sort,
having the live updates helps a ton.

Lastly, if I could have a bigger display on my laptop, I'd be up for that!

~~~
waheoo
I used to feel this way. I found it depends on the work.

A lot of my old work didn't depend heavily on looking at terminal output
constantly.

These days, I have to have about 3-4 things running. The extra screen just
becomes a log window.

You can argue the software should be redesigned (I do too), but the problem
remains for me, and a second screen, while not solving it, alleviates it a lot.

It might have something to do with using a twm, might not.

All I know is that I don't miss the days of tab cycling and desktop flipping.

~~~
chiefsucker
Tab cycling can be a productivity killer, especially when you need to cycle
between more than 4–5 applications (which can happen quite often, though OP
wrote “alt-tab (or otherwise)”).

What works excellently (at least for me, not necessarily for you) is to set
shortcuts for your most-used applications (be it the function key row, or some
hjkl combinations). With such a setup you can just press a key / combination
from muscle memory, instead of opening the tab switcher overlay, parsing it,
deciding how often you need to press tab, and then actually switching to the
other application window. Keeping shortcuts to applications feels like you
only have to do that last step.

------
quicklime
The author says that you don't need to choose between 4K and 120Hz, because
there are "reasonably priced" monitors on the market that have both. But their
recommended monitor ($900) seems a lot more expensive than a 4K 60Hz monitor
(which is around $300-$400 when I search on Amazon).

The author also says (they think) that you need the discrete graphics card in
a MacBook to take advantage of a 120 Hz display, and that the integrated Intel
Iris graphics won't do.

I was actually shopping around for a new MacBook earlier today, and noticed
that the 13" MacBook Pros only come with integrated GPUs, and you have to get
the 16" MBP if you want discrete graphics. So if you like the portability that
a 13" laptop has, this might not be a good tradeoff (not to mention that the
16" one costs $400 to $800 more than the top-spec 13" one).

I totally agree with the stuff they say about rendering fonts on 4k screens,
but I don't think I'd be willing to take the hit in portability, or shell out
an extra $700+, to get 120 Hz.

~~~
madeofpalk
For what it’s worth, the latest 13” MacBook Pro (with integrated Intel
graphics) can power the 6016x3384 Pro Display XDR at 60Hz.

I am unsure what refresh rates it can achieve with a lower resolution.

But really, as someone who made the jump from 60Hz to 120Hz, I wouldn’t bother
unless you’re gaming. I was accidentally running my monitor locked to 60Hz, and
when I fixed it I barely noticed the difference - it was only really when
playing FPS games that I noticed.

~~~
quicklime
Yeah you're right, it can do 6K at 60Hz, and according to Apple it can also
do:

> Up to two external 4K displays with 4096‑by‑2304 resolution at 60Hz in
> millions of colours

So I guess if it can power 2 monitors at 60Hz, it should be able to do 1
monitor at 120hz, right?
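
Napkin math supports that guess (pixel rate only, ignoring blanking and
encoding overhead, so just a ballpark):

    # Raw pixel throughput: two 4K@60 streams vs one 4K@120 stream.
    def pixel_rate(w, h, hz, n=1):
        return w * h * hz * n

    print(pixel_rate(4096, 2304, 60, n=2) / 1e9)  # ~1.13 Gpx/s
    print(pixel_rate(3840, 2160, 120) / 1e9)      # ~1.00 Gpx/s
    # The GPU-side pixel budget covers it; whether one port/cable can
    # carry a 4K@120 signal is a separate question.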

------
randallsquared
I fully agree that higher DPI is better.

I disagree that all available resolution should go to increasing sharpness and
detail of text, which, after all, can only improve so much. There's only one
metric that really matters for text, and that's reading speed. Any beauty or
detail which doesn't serve to differentiate characters (thereby increasing
reading speed) is essentially pointless.

A better reason to get a better monitor is to fit more text on it. While
coding or learning, there might be one or more editors, terminals, browsers,
chats, etc. Can all this be behind tabs or in virtual desktops? Of course. But
the value of having it all available at a flick of the eyes is so much higher!
Even if you turn off all the animations, flipping virtual desktops or tabs
requires a much more severe context switch than casting a glance to the right,
left, or (slightly) up or down. I'm currently using a 5k2k monitor at its
native resolution, and it's pretty awesome.

I have been hoping for two decades that VR would solve this, allowing me to
have a truly vast workspace that moved and zoomed just via eye movement, but
not only is that day still far away in eye-tracking terms, VR goggles have so
far all been much worse for text than high DPI screens, in my experience. I
appreciate that there are VR workspaces in development, though, like Immersed,
for the day when the hardware is good enough.

------
kstenerud
He puts up some nice technical arguments, and I'm sure a 4k monitor is right
for him, but really it comes down to what you're actually comfortable with. I
stopped worrying about resolution the moment I got my first 19" (CRT) monitor.
Resolution was high enough to fit as much text as I needed, and really that's
all that ever mattered. I've used retina displays but I've never felt they
make enough of a difference to matter. I've worked on 32" displays, and while
they're nice, I've never felt like I'm missing real estate when I move back to
a 1080p display.

Font smoothing has never bothered me, but what DOES bother me is this trend
towards lower contrast, and "flat UIs" that lack any differentiation between
sections in the UI. Win95, while ugly by today's standards, was easy to read
and follow. And the black-on-white text is something I miss dearly.

------
gdubs
Good article.

Have to say though, taste is subjective, but good lord those monitors are
about as elegantly designed as a Rockstar Energy drink.

~~~
santoshalper
Yeah, for some reason these are all coming from "gaming" monitor makers. I
think the last one is trying to conjure up a "pro" vibe with the hood, but yes
- these are definitely tacky.

Acer, Asus, et al. tend to just remarket OEM technology with relatively little
value-add. I suspect these are all built on the same panel, but I could be
wrong.

~~~
pmiller2
Oh! Is that what those funny looking "hoods" are about? I thought for a second
they might be functional, but I guess they're just decorative? "Gaming"
equipment (monitors, chairs, mice, keyboards, _etc._ ) rarely seems worth it
to me, simply because they spend too much effort and cost on things that are
non-functional, but make the thing look attractive to the target audience. I
am not a part of that audience, so I find the esthetic rather garish most of
the time.

The one exception for me is that I do like a couple of gaming mice. It's not
so much for the hotkey functionality, but because they have a really high
resolution, which makes mousing easier and more precise to me. Other than
that, my setup is all really basic, office-style equipment, and I love it.

~~~
amarshall
Hoods are for reducing reflections on the screen. They are functional, and are
not (at least originally) a “gaming” feature. Somewhat common on high-end
color-accurate displays. How useful they are depends on the lighting.

The monitors linked are actually quite good, the gaming styling they have is
rather unfortunate.

------
alkonaut
4K at 200% scaling feels wrong for a big monitor like a 27". The desktop looks
like a 1080 desktop. And if the monitor is to be used as a gaming monitor too
then 4K is again wrong because even with the best graphics cards, 4K@120 is
out of reach.

The perfect 27” monitor for me would be one with 5K@120: at 200% scale you’d
get a desktop that looks “1440-sized”, and when gaming you could run half
resolution at 1440@120 with a decent graphics card.

These don’t exist though, at least not as IPS I think, so I’ll stick with my
1440@144 for a while longer.

~~~
metafunctor
The LG UltraFine is a 27-inch 5K (5120 x 2880) IPS display. The maximum
refresh rate is 60 Hz, so not the best for gaming; but it's designed for a Mac
anyway. I've had a couple for over three years now and it's the best display
for a Mac I know of, outside of Pro Display XDR, I suppose.

~~~
alkonaut
Yeah an alternative is always to just keep the screen I have now (1440@144)
for gaming, and simply get a second screen for work with High DPI. Seems silly
to get a 5K@120Hz mega expensive screen to play games at half resolution. I'll
buy a bigger desk instead.

------
steadicat
The article is all based around this premise:

> Times of low-resolution displays are over. High-resolution displays are a
> commodity now.

I reject that premise. 4K is basically the maximum number of pixels you can
get for less than $1K, and that's only a handful more pixels than a 16" MBP
screen. It does not make sense for most people to shell out the $$$ just so
that they can see the same number of pixels from slightly further away.

It is absolutely not the right time to upgrade your monitor. Wait until 5K+
becomes affordable again[1] so you can actually get the benefits of an
external monitor (more usable screen space) without having to sacrifice the
sharpness and quality of retina.

[1] https://tidbits.com/2018/11/16/what-happened-to-5k-displays/

------
gcbw3
> Text can’t be made look good on low-resolution displays.

BS. Maybe true on user-hostile OSX/Windows. But if LCD manufacturers provided
the correct information in the EDID data, it would be very trivial.

On Linux, with a little trial and error (you only have to do it once per
monitor, ever), you can fine-tune the subpixel hinting. I use a photographer's
loupe (magnifying glass) to look at a white region on my ancient LCDs to see
the subpixel configuration, set it in my X config, and have perfectly
antialiased text just fine. ...well, except on some gtk2 applications :) But if
you work more than a few minutes in those you have other problems.

------
tjr225
I will never give up my 24" 16:10 Dell Ultrasharp even if it is low def.

~~~
foepys
I have one, too. It's pretty sad that 16:10 has fallen out of style and that
there is no 27" 2560x1600 display available.

Dell has the Ultrasharp UP3017 with 2560x1600 at 30", maybe that's something
you can try if you want more pixels. It's just pretty expensive and has a big
bezel since it was released in 2017.

~~~
santoshalper
I did have a dream once of a "4k Pro" that was 4000x2500. Never gonna happen.

------
AdrianB1
I want to thank the author for the article; it comes at the perfect time for
me. Earlier today I had a main power line incident that burned out almost half
of the electrical equipment in my house, so I need to buy at least one monitor
immediately. I was in doubt about spending too much on it (I need to buy other
stuff too) and I am a cheap bastard, but I am moving a bit up versus what I was
planning to get.

I saw some discussions about multiple monitors; I am a heavy user of multiple
monitors, 2 most of the time and 3 when really needed. I don't have space on
my desks (home & office) for 3 monitors all the time, so it depends on need,
not on coolness. What I plan to do now is get an asymmetrical configuration at
home, with one regular (1080p) monitor for some work and gaming (the CPU is an
i3 7100, so I don't do much gaming) and a larger, better one for work-only
cases where a lot of information needs to be on the screen at the same time.
As the article did not mention multiple monitors, and buying 2-3 monitors with
120 Hz/4k resolution is extremely expensive for a home setup, I think it is
worth mentioning this kind of compromise: mixing sizes to have one of each.
Not having OCD, the different sizes are not such a big deal, while the extra
functionality/productivity helps.

~~~
fenesiistvan
Just read the comments. You will be confused again :)

------
elfchief
From the article:

> Why is it 119.88 Herz, not 120 Herz? No idea.

The short version of this is that TVs generally have framerates that are a
multiple of 59.94, because NTSC is actually 59.94Hz (rather than 60), for
various historical reasons -- basically, for color TV to become a thing,
without obsoleting every single B&W TV in existence at the time, they needed
to slow the framerate down by 0.06Hz[1]

There's still plenty of TV-targeted panels out there that only do 59.94Hz, and
not 60Hz. Some of them are even in computer monitors! I imagine likewise that
there are a lot of panels that do 119.88Hz (59.94Hz * 2) and not 120Hz.
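
The arithmetic, in Python, just to make the relationship explicit:

    # NTSC color timing slowed rates by a factor of 1000/1001.
    for base in (24, 30, 60, 120):
        print(f"{base} Hz -> {base * 1000 / 1001:.5f} Hz")
    # 60 -> 59.94006 and 120 -> 119.88012, the familiar 59.94/119.88.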

And just to be confusing, a number of things (both hardware and software) tell
you that you have a 60Hz signal when you actually have a 59.94Hz signal! A
really good example is windows -- refresh rate settings in the OS get quite
interesting, showing 59.94 in some places, and 60 in other places![2]

[1]
[https://en.wikipedia.org/wiki/NTSC#Color_encoding](https://en.wikipedia.org/wiki/NTSC#Color_encoding)

[2] https://support.microsoft.com/en-us/help/2006076/screen-refresh-rate-in-windows-does-not-apply-the-user-selected-settin

------
mov_
This is a fantastic post. I just bought a 28", 158 dpi monitor recently and
spent quite a bit of time figuring out how to maximize the font quality.

Since there weren't any suggestions for Linux, this is what I did on Ubuntu 20:

    
    
        * 2x scaling
        * Downloaded libfreetype, enabled subpixel rendering option, replaced existing libfreetype.so
          (I think Ubuntu 20 might already have this enabled, but just to be safe I did it manually)
        * Turned off all hinting options in font configuration
        * Turn on lcddefault filtering
          (necessary if you have subpixel rendering enabled)
        * Specifically set 158 dpi in Alacritty terminal config
          (Not entirely sure that's necessary?)

~~~
st-isidore
Thanks, I saved this to use later when I get a better monitor. How did you
devise these steps, though, if I may ask? And what are these things?

~~~
mov_
You're welcome, I hope it helps! To preface this, I think Ubuntu 20 comes with
good font rendering options by default, so my suggestions are mostly
unnecessary. I just wanted to experiment to see if I was missing anything. All
the options I'm referencing are explained in detail in this blog post (e.g.,
subpixel rendering and LCD filtering); they're all just configuration options
in the freetype2 font renderer.

I found that the Arch Linux wiki page
(https://wiki.archlinux.org/index.php/Font_configuration) is helpful because
it applies to other distributions like Ubuntu, and provides example
configuration files
(https://wiki.archlinux.org/index.php/Font_configuration/Examples).

------
asadkn
I have the same opinion as the author here. When I buy a 4k monitor, it's
generally for the higher PPI. There's just less eye strain for me reading less
pixelated fonts.

My personal sweet spot in terms of pleasant readability is: 4k 24" @ 200%
scale (so the same screen estate as 1080p) or 5k 27" @ 200% scale (the same
screen estate as 1440p).

------
nycticorax
I find it so strange that most people still talk about their monitors'
resolution in terms of pixel counts. Oh, you have a 4k monitor? Well, if
that's a 43" 16:9 4k monitor, then that's pretty close to 100 dpi, so you
basically have a big-ass traditional ("lodpi") monitor. Whereas if it's a 24"
monitor, then it's a smallish hidpi screen. Rather than pixel counts, I find
it much more helpful if you tell me the size in inches and the DPI.

This is related to one reason why I'm not so crazy about 4k: At most of the
usual screen sizes, they occupy an unpleasant middle ground between a
traditional screen (~100 dpi) and a 2x hidpi/retina screen (~200 dpi).

~~~
g-b-r
All three values count: beyond a certain dpi, pixels count much less, but
below that they count a lot (they are the "shelves" for your characters and
everything else, given that sub-pixel simulation is very poor).

Unfortunately most monitors around are still at a limited 1920x1080 or less.

For people with poor vision a big size counts a lot; with good vision smaller
might be better; etc...

------
nsxwolf
I’m not willing to give up my 3840x1600 ultrawide. That real estate is now
non-negotiable to me. So to improve the dpi I’ll need 7680x3200, and at 120hz
I don’t think that’s happening any time soon.

~~~
JohnBooty
Same here. For coding work, I almost don't even see how it's worth discussing
-- sheer real estate is going to win for me, every single time.

I say "almost" because I've been doing this for nearly 25 years and I've
learned that other coders have shockingly diverse preferences and styles. But
still. Give me IPS, give me bucketloads of real estate. Everything else is
secondary.

------
needle0
The "Antialiased text in low PPI is better with hinting" argument is
unknowingly underlain by Eurocentric assumptions. Asian characters such as
Chinese or Japanese have always looked significantly better on the macOS style
of text rendering compared to Windows' ClearType, so much that there exists an
app called MacType, created by an Asian developer, to force Windows into using
Mac-style rendering.

------
bitbang
Interesting bit of trivia on why the frame rate is '119.88' instead of 120:
it's using the NTSC/ATSC frame base of 1.001.

Ever since old TV standards in the US moved from black and white to color,
the timing has been "frames per one and one one-thousandth of a second". So
120/1.001 = 119.88.

It was a neat trick in the analog days to maintain compatibility with older
black and white TV sets. Fast-forward to the digital age, and that fractional
timing difference makes keeping audio and video in sync an absolute nightmare.

------
tjoff
Strongly disagree. Any subpixel rendering on low-resolution displays must be
disabled. Otherwise you get color noise and blurry text. Pixelated fonts are
not bad.

It is fine on HIDPI displays though because it isn't as prominent there and
the effect actually works.

Instead of high resolution pick a display with a decent aspect ratio. 16:9 is
a joke. 16:10 is superior but not exactly a game changer. 3:2 is very rare,
especially for desktop displays, and 4:3 is pretty much dead. We do not talk
about 5:4.

------
kickingvegas
Oh wow, I guess I’ll add to this thread which is so tied to the fact that we
don’t have resolution independent graphics.

I wrote an essay about resolution independent graphics back in 2012:
https://github.com/kickingvegas/12pt-should-be-the-same-everywhere/blob/master/absoluteMeasurementDPI.md

HN commentary on this happened on two occasions:

- https://news.ycombinator.com/item?id=15639616
- https://news.ycombinator.com/item?id=4236429

------
slavoingilizov
This is a great article. Even if you disagree with the conclusion and
recommendation, it shows what you need to look out for when buying a monitor.

I had to go through a similar exercise recently. The DisplayPort / USB-C /
Thunderbolt story is just insane. I was looking for a monitor to use with both
MacOS and Windows. HDMI can't do 4k @ 60Hz, DisplayPort cables never tell you
which version they support.

I ended up with the LG UHD 4K 27UL650, which I'm very happy with, but
switching between Mac and Windows is still difficult, because this monitor only
has 1 DisplayPort input. I've settled on using HDMI at 30Hz for the Mac instead
of switching cables all the time.

Improvement in experience is very subjective but I agree with the author on
everything except maybe the 120Hz.

~~~
city41
Have you considered a KVM switch? I really like mine. It allows up to four
computers, leaving me with two extra ports that sometimes come in handy
(plugging in someone else's laptop, or sometimes connecting my Raspberry Pi).

~~~
slavoingilizov
Which one are you using?

~~~
city41
I can't find the one I bought anymore. Seems they may have discontinued it. It
was basically the same as this one, except with 4 ports instead of 2

https://www.amazon.com/JideTech-Keyboard-Security-Education-Conference/dp/B07C5G1XFJ/ref=sr_1_2?dchild=1&keywords=JideTech+KVM+Switch+HDMI+4+Port+USB+Keyboard+Video+Mouse&qid=1593621353&sr=8-2

FWIW, the first one I got was defective, but the replacement has worked like a
champ.

------
zarmin
>Lowercase letters only have 7 (seven!) vertical pixels to work with. That’s
NOT MUCH. I have more fingers than that.

Quite an argument.

~~~
wondringaloud
It was at this point I realized the entire premise of the article was a very
long-winded way of saying "I like using high DPI monitors when I code". And
then tossing in a bunch of technical details to make it sound like a "fact".

~~~
bostik
More like "I like looking at really smoothly rendered TTF fonts".

I've tried to use TTF fonts in the terminal. None of them work well enough,
or render better than the classic 6x13.

------
kazinator
I'm currently working from home on a 14-year-old ViewSonic VA902b.

19", 1280x1024, 75 Hz.

It was a freebie from a folded startup.

This exactly matches monitors I used in the more distant past, like NCD
X/Window terminals: 19", 1280x1024.

This form factor basically determines the ideal pixel size for most work.

Basically, 1280 is about the right number of pixels for the angle of vision
subtended by the display when viewed at the intended distance.

~~~
kazinator
At work (I remember the place well!) I have a pair of 1920x1024's. What size?
Not sure. I rotated them 90 degrees, so there are two vertical panes of
1024x1920, side by side, making a 2048x1920 area.

Approximately square is the most productive size. When we're working with
text, be it code or prose, the height of it is unlimited, but the width isn't.
So it ends up in multiple columns.

Books are mostly oblong (taller than they are wide), but that's only if we
think of them in their closed state. A typical open book is a bit wider than
it is tall (but not greatly so), and you have two columns of text (the two
pages).

That open book geometry is an awful lot like my rotated monitor setup, isn't
it.

Until I rotated the monitors into this configuration, I was consistently just
using one and ignoring the other. Moreover, ironically, I was using virtual
desktops! I keep virtual desktops as large as 4x4. I would rather page among
virtual desktops with hotkeys than turn my head between two excessively wide
monitors placed side by side.

People made remarks about my disuse of the second monitor, which prompted me
to come up with a good way of using it.

------
Rapzid
Yeah, IDK. As nice as "retina" displays are, I still use 1080p screens on my
desktop. At my viewing distance, and with sub-pixel rendering, the smallest
text on Hacker News is perfectly legible and even clean looking.

Sure, if I get within 12 inches of the screen I can start to make out the
pixels. But even then due to sub-pixel rendering it's hard to make out
individual pixels at a foot if the contrast isn't too great between the text
and the background.

That being said, interestingly, one of my screens, a newer Asus that's
precisely one model version higher than the other, handles high-contrast
sub-pixel text rendering quite a bit worse than the other :|

------
fwipsy
Disappointed that he didn't mention pixel-perfect programming fonts (e.g.
proggy) which offer high-density text on low-resolution monitors while not
looking nearly as bad as his Consolas example. These fonts make a huge
difference imo.

~~~
jeromenerf
I also recommend them for the same reason. I use terminus. It looks great
whatever the size.

When writing prose, I prefer serif fonts such as go mono, at a bigger size.

~~~
q3k
Yeah, I've started using terminus... probably 10 years ago now, and I still
haven't found a programming font that I liked more. Tried using some retina
macbooks with hyped-font-of-the-month, they just didn't work for me.

------
CarVac
I prefer real-estate to sharp fonts, so I run my 4K screens (24" and 27") with
no scaling at all.

~~~
pier25
You must have super sharp vision.

~~~
asdff
Most software lets you increase text size. This lets you maximize functional
resolution by letting UI elements remain small while text remains legible. I
don't need my window header to be twice as thick.

------
chias
Only tangentially related, but encoding text into the subpixels themselves is
still one of my favorite github repos I've stumbled across in a good long
while :)

[https://github.com/graphitemaster/breaking_the_physical_limi...](https://github.com/graphitemaster/breaking_the_physical_limits_of_fonts)

------
city41
I have the Samsung CHG90, which is 3840x1080. It's literally a 4k TV cut in
half horizontally. It enables me to very comfortably display three windows
side by side. More than anything I have ever bought, this monitor has improved
my productivity. They now have a newer version that is 5120x1440.

The trade off is this monitor is very low res, about 100ppi. I don't know
anything about how the various OSes render text. But I have found that by far,
Ubuntu renders the text the best. Text in both OSX and Windows looks
absolutely dreadful on this monitor, but Ubuntu is really quite pleasant.
Which is not at all what I was expecting.

------
foolmeonce
I'll upgrade when color E-Ink is ready, has a good spectrum and has 20" for a
(near?) reasonable price.. I'm not sure who needs a high DPI Lite Brite, but
it's certainly not relevant at my age.

~~~
GaryNumanVevo
I would LOVE to code outside on a Color E-Ink display. It's fairly impractical
to work outside with modern screens.

------
pvorb
Those colorful upscaled images of ClearType text are not that helpful. I wrote
a tool several years ago to illustrate the effect of subpixel font rendering:
[https://github.com/pvorb/subpixel-illustrator](https://github.com/pvorb/subpixel-illustrator)
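
For a rough idea of what such a tool does (a minimal sketch, not pvorb's
actual code; the input filename is hypothetical):

```python
# Blow each physical pixel up into three vertical R/G/B bars, so you can see
# how subpixel rendering distributes glyph coverage across the RGB stripes.
from PIL import Image

src = Image.open("cleartype_sample.png").convert("RGB")  # hypothetical input
out = Image.new("RGB", (src.width * 3, src.height))
for y in range(src.height):
    for x in range(src.width):
        r, g, b = src.getpixel((x, y))
        out.putpixel((3 * x,     y), (r, 0, 0))
        out.putpixel((3 * x + 1, y), (0, g, 0))
        out.putpixel((3 * x + 2, y), (0, 0, b))
out.save("subpixel_view.png")
```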

------
nighthawk454
> Why is it 119.88 Herz, not 120 Herz? No idea. It seems to work the same.

I'm guessing it's 119.88Hz because that's an even 5x multiple of the
conventional "24Hz" of TV – which is really 23.976Hz.

~~~
DudeInBasement
na, they probably just used floating point and didn't care.

------
da39a3ee
The article is deliberately opinionated but I basically agree: the experience
of editing text on a 4k/Retina screen is vastly superior to what it was like
previously and I'd never want to go back.

It's clear that many of the people who think they disagree are Linux users who
have (for good reasons) not yet had a period of time to try it out with good
OS support. At least, not on their laptop screens. I really hope good Linux
support for laptops with high DPI displays emerges, if it has not already
(I've read mixed reports).

------
ulisesrmzroche
Wow, $2000 USD for a monitor? I thought I was pushing it yesterday when I got
a new 27" monitor for $200.

I don't think we as programmers need anything more than a 27-inch 1080p. What
am I missing? I don't think it's worth the expense.

On another note, that mustard yellow as the background color is an eyesore.

~~~
huy-nguyen
$200 is quite cheap for a 27” monitor. A good non-gaming 1440p 27” costs about
$300-400 not on sale.

~~~
mcny
> $200 is quite cheap for a 27” monitor. A good non-gaming 1440p 27” costs
> about $300-400 not on sale.

I think grandparent post meant 1920x1080. Depending on your needs (no adaptive
refresh rate, no color accuracy, only one HDMI input and one VGA input, no
rotating stand, no vesa mount), you can get a 27" monitor for about USD 100.
For most programmers, two 27" 1080p monitors ought to be sufficient.

I'm just amazed I never realized that a 27" 1080p monitor is barely over 80
pixels per inch. That rises to a respectable 100+ ppi with a 1440p monitor,
but I suspect it isn't worth paying more than twice the price in my context.
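
The PPI arithmetic, for anyone who wants to check other sizes (a quick sketch
assuming 16:9 panels):

```python
# ppi = diagonal pixels / diagonal inches
import math

def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(ppi(1920, 1080, 27))   # ~81.6  -> 27" 1080p, barely over 80
print(ppi(2560, 1440, 27))   # ~108.8 -> 27" 1440p, ~1.33x denser
print(ppi(3840, 2160, 27))   # ~163.2 -> 27" 4K, exactly 2x the 1080p density
```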

~~~
gnulinux
I have a single 1080p 13'' monitor (my old MacBook Pro) and that's sufficient
for me. I seriously don't understand what people do with all that extra
space. This is enough to see 3 horizontally split files open in Emacs. And
you can double it (6) if you also split vertically, but then each pane
becomes too small.

A 27'' 1080p monitor sure sounds nice, but why do you think we need two 27''
monitors?

------
nickjj
27" 4k is going to be pretty tough to read text at 1:1 scaling even with 20/20
vision.

IMO you're better off grabbing something like a 25" 2560x1440 monitor. You can
comfortably read text at a few feet away, even small text with not the best
contrast.

If you start scaling to 150% or 200% with your 4k display you'll end up with
the same screen real estate as a 2560x1440 or even a 1080p monitor depending
on how much you scale up and you'll end up paying a lot more. For example a
really good 2560x1440 monitor will run you about $300-350 (IPS display, low
input lag, good color accuracy, etc.). The first 2 monitors in OP's post are
$1,500 to $1,730.

I did a huge write up on this at [https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-for-software-development](https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-for-software-development)
if anyone is curious. I evaluated a bunch of monitors and ultimately picked a
25" 2560x1440 since it's the sweet spot for PPI at normal viewing distances
with normal vision.

For programming, having a 2560x1440 1:1 scaled display is going to be a huge
win. You can easily fit 4x 80 column windows side by side in most code
editors. After using that set up for years, there's no way I would ever want
to work on a display that's less than an effective 2560x1440 resolution again.
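
A quick sanity check on the four-panes claim (my arithmetic, not the
author's):

```python
# Four 80-column panes across a 2560 px wide display:
cols = 4 * 80
print(2560 / cols)   # 8.0 px per character cell, before window chrome,
                     # which most monospace fonts hit at roughly a 13px size
```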

------
bob1029
For me, 2560x1440@144 on a 27" panel is by far the Goldilocks of writing code.
1080p feels like it's not tall enough with fat editors like Visual Studio, and
anything bigger in size or DPI is starting to get too difficult to keep in
focus.

I do also have a separate workstation with a 4K 43" monitor that I use
exclusively as a standing setup. The monitor is fairly far back, but still
requires me to move my head slightly to view the extents. This is the
workstation I typically use for daily stand-ups so that I can have a ton of
relevant information up all at once. I will also do some large-scale codebase
reorganization efforts on this setup as I can have 3-4 solution explorers
side-by-side and still have 90% of my monitor remaining for code editor
windows, debuggers, explorer windows, browsers, etc. That said, this setup is
tiresome if you are trying to laser focus on a single area of the codebase for
an extended duration.

I will typically alternate between these setups throughout the day as
appropriate.

Also, I do appreciate the color accuracy of the 4k 43" monitor (IPS/8-bit). I
don't do a lot of artistic work, but when I am trying to see if a certain
shade of grey makes sense for an element relative to its contents, having an
accurate monitor really helps in picking a good color code. With TN you get
much 'coarser' results and it's hard to find a good middle point.

~~~
zedpoeticbits
I am considering a 43" 4K monitor. Which one do you have, if you don't mind?

~~~
bob1029
LG 43UD79

~~~
zedpoeticbits
Thanks

------
fossuser
"Now, it might be tempting to use, for example, 1.5× scaling. That would give
you an equivalent of 2560×1440 logical pixels, which, you might think, is much
better. This is not how you should use it! The idea of a 4k monitor is NOT to
get more pixels but to get the pixel-perfect, high-density UI rendering.
Otherwise, a normal 1440p display would work better. A simple rule to
remember: pixel alignment outweighs everything else. 1440p display is better
at displaying 1440p content than 2160p display is at it."

I like this idea in theory, but I disagree with it in practice.

I use two vertical 4k displays (specifically: Dell U2720Q) and I use them at
the native resolution.

This is because I want IntelliJ to take up an entire vertical display where I
can see a lot of code without having to scroll. I can also divide the other
display into two halves (horizontal line, one window on top and one on bottom
- I use a third party app from the macOS app store called magnet for this).

I can appreciate the smooth fonts, but the screen real estate is more
valuable.

I guess all of this is to say that the best option for me would be an 8k
display at 2x scaling. A 4k display at 1080p though isn't worth the trade-off
(I'll take lower quality text for the additional space).

They do say right after that passage that using the 4k display at native
resolution is also fine, so maybe it's not an issue?
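
For what it's worth, the pixel-alignment point in the quoted passage comes
down to simple integer arithmetic (a sketch of the naive logical-to-device
mapping):

```python
# Where logical pixel edges land in device pixels at different scales:
for scale in (1.0, 1.5, 2.0):
    print(scale, [logical * scale for logical in range(4)])
# 1.0 -> [0.0, 1.0, 2.0, 3.0]  every edge lands on a device pixel
# 1.5 -> [0.0, 1.5, 3.0, 4.5]  half the edges fall between device pixels,
#                              so the renderer has to antialias (blur) them
# 2.0 -> [0.0, 2.0, 4.0, 6.0]  aligned again: the "pixel-perfect" 2x case
```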

------
purplezooey
Ever since I saw that picture of Linus writing code on a shitty 90s laptop at
his mom's kitchen table, I feel like I don't deserve any trifling creature
comforts.

------
young_unixer
The monitor I want:

\- 1440p

\- IPS panel

\- size between 23 and 27 inches, ideally 24 or 25

\- flatscreen, 16:9, normal monitor, no "gamer" or other weird stuff.

\- reasonably priced

I don't live in a rich country, so there's not much variety, and I got tired
of searching. All the monitors above 1080p I've found are either "gamer"
monitors (which cost 5x as much because of the gamer tax and features useless
to me, like 144hz, G-Sync, and "3ms response times"), curved screens,
ultra-wide screens, etc.

~~~
avalexandrov
I think this is the monitor you're looking for: [https://www.dell.com/en-us/shop/ultrasharp-25-usb-c-monitor-u2520d/apd/210-avkg/monitors-monitor-accessories](https://www.dell.com/en-us/shop/ultrasharp-25-usb-c-monitor-u2520d/apd/210-avkg/monitors-monitor-accessories)

~~~
young_unixer
Thanks for the suggestion.

------
vicedvin
I find this very subjective. In my experience: in 2013-2014, when I switched
from a MacBook Air to a retina MacBook Pro, I experienced the wow effect. But
recently the MBP fell and the screen got broken. Now I'm using a 720p 20”
monitor and haven't noticed a significant difference. I'm mostly programming,
as in reading and writing text, be it in a text editor or a shell, and it's
as good as before. There's less screen estate, right, but it doesn't affect
my workflow much.

------
pkulak
The problem is that a little more can be waaay worse. You need to double your
pixels to avoid turning everything into a blurry mess, which means 200+ PPI.
But there are literally no monitors like that not made by Apple. A 4K monitor
at 32 inches is 138 PPI. No thank you. Gross. There actually are some 4K
22-inch monitors, but I don't want a 22-inch monitor.

I'm just going to stick with my 1440p 27" monitor until someone can sell me
something better.

------
kurkosdr
If I may add a couple of things:

1\. Buying 4K TVs as large PC monitors is a dangerous game, despite all the
sweet lies about “convergence” supposedly enabled by HDMI. You have to make
sure the TV has a true “PC mode” with true 1:1 pixel mapping and no
“subsampling” (if it exists, it is usually enabled by labelling the HDMI input
as “PC“), that you weren’t conned with an RGBW pixel format (or any non-RGB
pixel format for that matter), and that the panel can display 6500K colour
temperature in "PC mode". Also please note some TVs with IPS panels have a
pronounced chevron pattern in their subpixel layout instead of the traditional
3 vertical stripes, which makes ClearType look slightly different even in true
"PC mode" (I personally find it cute and inoffensive, your mileage may vary)
and that all 4K VA panels have bad horizontal viewing angles.

2\. Make sure your RGB levels match. That is, use “full range” for RGB 0-255
displays (PC monitors) and “limited range” for RGB 16-235 displays (TV
monitors). Sometimes the drivers don’t get this right. If it’s wrong and you
get “limited range” output on an RGB 0-255 display, you will get washed-out
blacks instead of inky blacks.
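
To make the mismatch concrete, here is the 8-bit level math (a sketch; the
exact rounding varies by driver):

```python
# What "limited range" output does on a full-range (0-255) panel:
def full_to_limited(x):
    return round(16 + x * (235 - 16) / 255)

print(full_to_limited(0))     # 16  -> black is shown as dark gray
print(full_to_limited(255))   # 235 -> white never reaches the panel's peak
```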

------
sdflhasjd
More than 5 years ago, I bought two 24" 4k monitors from Dell (UP2414Q). With
a PPI of ~180 (and me sitting over a foot away), they give me a retina
experience on my desktop. They were an early product from Dell, so despite
great picture quality, they are buggy and have large bezels.

This week, one of them failed. While looking for a replacement, I was
expecting the range of 24" "retina" displays to have improved somewhat.
Surprisingly, it seems the opposite has happened. Dell no longer makes an
Ultrasharp 24" monitor with 4k; everything is now 27".

Although they do make a 'P' series equivalent, it's an old model released
around the same time as my original ones, with similar bugs and early
hardware (no 4k 60Hz HDMI).

27" is sadly too large for me to fit two side-by-side on my desk (and even if
they would fit, 27" is too large for my liking).

Now I'm left struggling to find a replacement; I may have to live with
downgrading to something like 120 PPI with a 2k (1440p) 25" display.

It's sad that retina really hasn't caught on much in the desktop monitor
space; we should really have 3k as a standard resolution for smaller desktop
monitors.

~~~
zaphoyd
FYI: The Dell P2415Q does have 4K 60Hz over HDMI as long as you get one
manufactured recently; they did a silent spec bump in 2017 or 2018 or so. I
use a number of these, and while they are still slightly buggy, I've found
them much better in that department than the UP2414Q.

------
ajross
Something's off with the analysis here.

> According to my research among programmers, 43% are still using monitors
> with pixel per inch density less than 150

OK...

> Let's take Consolas [...] at 14px, which is default in VS Code

A 14 pixel font at 150 DPI is a ~6.7 point type size in real-world units!
Good grief, I can't read that anymore. Frankly, I doubt I could have read it
in my 20's. I certainly never would have wanted to code in it under any
circumstances.
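
The conversion, for reference (72 points to the inch):

```python
px, dpi = 14, 150
print(px / dpi * 72)   # ~6.7 pt: the physical size of a 14px font at 150 DPI
```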

------
ryanmjacobs
Yeah... I've got a small anecdote. I find that I write better code on lower
resolution screens. Right now, I'm on a 1440x900 display at 75 dpi. And it's
annoying, for sure. But it keeps my code concise and well-organized. It
forces you to develop your code around a mental model, instead of a visual
model.

I find that at 4K, my code starts spreading out -- horizontally and
vertically. That's fine when you're writing it, but when you come back to it
months later, it's so hard to comprehend that much information all at once
IMO. It's weird. And I might not be explaining it properly.

If you were to write your code on an 800x600 display, you would make sure
that every function did not go over 600 vertical pixels. It would probably
give you a headache if you did.

I can look at code and almost immediately tell that it was written on a 4K
display. Oftentimes, everything is spread out and chained obtusely.

Completely different game for web dev though -- having the browser side by
side with your code helps.

...

Also, 4:3 is awesome for code. All I need is two, side-by-side panes of
80-column text :)

16:9 has me glancing left to right too much.

~~~
SSLy
You can change the font size everywhere.

------
buserror
I think it's a long article just to say that a 1) good, 2) big screen (or
two, or three) is a good idea. The bigerrer, the beterrer.

I wrote a lot of code on Mac classic format (512x342) using Monaco 9 -- it was
great; next step was 640x480, and 800x600, and then (for a rather long time)
1024x768.

It was still perfectly fine, as (at the time) Apple screens were some of the
best: nice, crisp, and clear with good contrast. I didn't feel particularly
handicapped because of the screen real estate; you just adapt to what you
have.

Now my main machine is Linux with two 32" 4K screens side-by-side, about 60cm
from me. I use Liberation Mono 17pt as a terminal font, and 9pt is just
microscopic on the screen.

And guess what? I still wish I had a bit more screen space sometime! :-)

One thing I always tell more junior developers who ask me for the most
valuable piece of programming advice from my whole career: get a good SCREEN,
get a good KEYBOARD, and get a good CHAIR; the rest are just details.

Oh, also, make sure the screen(s) are perpendicular to any window, and watch
that posture!

</walks off waving his cane in the air> :-)

------
jamescobalt
Talks about 4k at 120hz without mentioning that this _usually_ sacrifices
chroma, making text unreadable. Fine for games, but not for coding. No mention
of Display Stream Compression. No mention of the importance of panel type (IPS
vs VA vs TN). No mention of the various HDR standards (and especially how
HDR400 isn't really HDR). Suggests getting a 5k, 6k, or 8k monitor while also
insisting you get a 120hz display (it's going to be one or the other for the
next couple of years). Suggests monitors whose adaptive sync feature only
works on Nvidia GPUs.

The author seems to know a lot about font rendering but he doesn't seem to
know that much about monitors despite having VERY strong opinions about them.

I suggest people wait till the new year before a big monitor purchase. EVE
Spectrum is slated for Q4 of this year. If it delivers on its promises, it'll
be the best value for what you get. And if it doesn't, you can fall back on
the ROG XG27UQ or its contemporaries - assuming your GPU can support DSC, 4K,
and 120hz.

------
dentalperson
The other side to this is that it may also be time to upgrade your eyes (with
glasses, by getting your eyes checked). This is especially likely to be true
if you don't think it's time to upgrade your HD monitor.

Somehow I went for years without realizing that I just needed glasses. I could
read everything before, but after getting the glasses everything was suddenly
unnaturally crisp.

------
boromi
What's the point of 120Hz if you're just programming?

~~~
vardump
Can read the content clearly while smoothly 120 Hz scrolling or moving windows
around.

~~~
kazinator
You can do that just fine at 60 Hz, if there is no tearing (all screen updates
appear during vertical sync).

You can read scrolling movie credits just fine at 24 FPS.

~~~
mrguyorama
> You can read scrolling movie credits just fine at 24 FPS

Honestly, I actually struggle with that. But working and gaming at 60hz is
still perfectly fine with me. I even purposely run my Valve Index VR headset
at 90hz instead of 120hz because I don't need the extra frames.

~~~
smabie
What do you mean you "don't need them"? The jump from 90hz to 120hz is huge.
The jump from 60hz to 144hz is life changing.

------
seniorsassycat
Bitmap fonts will look perfect on low res displays. I've been using a 1366x768
laptop with terminus font for six years.

~~~
santoshalper
Well, I don't know what DPI you are using (perhaps the display is tiny?) but
it sounds low. Of course, we all used low DPI displays for years and were
happy with them because we did not know better. Now that I've become
accustomed to higher DPI, low DPI screens look like crap.

It's the exact reason I am actively avoiding 120 or 144hz screens. I don't
want to get used to them and then need to ditch my current display. For
several years I bounced back and forth between gaming on a PC and a PS4. The
whiplash between 60 and 30fps was absolutely brutal - but I never noticed it
back when everything I played ran at 30fps.

So in the end, I am not sure what advice, if any, I am trying to give you. I
love my huge, high-DPI screen, but if you are happy with what you've got, you
might seriously consider sticking with it for a while.

------
cannam
I came to the comments expecting lots of (irrelevant, I know) complaints about
the painfully bright yellow background of the article - but nobody has
mentioned it yet. Is it serving different colours to different people, or am I
unusually sensitive to this colour? I found the page unreadable without
switching to reader mode.

~~~
Krasnol
I have the same issue.

It was painful to look at.

------
ShinyObject
This is actually a rather bad time to buy a high-end monitor because if you
get one right now it will likely be outdated by the end of the year.

If you want a monitor that can do all of the cool things at once (2160p, hdr,
120Hz+, freesync) without using hacks like dual cables or realtime lossy
compression, then you need both a video card and a monitor that support hdmi
2.1 or displayport 2.0.

Currently there are very few monitors that support the newer hdmi/dp standard,
and there are no video cards that do. However both of the new consoles
scheduled to be released later this year (xbox series x, ps5) will support
hdmi 2.1. This also implies that upcoming discrete gpus will support this as
well.

This means if you want a monitor that can do everything, right now there are
only a tiny number of screens, and no video cards. Barring any delays, the
situation should be very different in a few months.

~~~
fomine3
Agreed. I expect more products released with new interfaces.

BTW current GeForce and RADEON supports DisplayPort DSC that could enable more
resolution / frame rate by compression theorically but I don't know any
existing displays.

------
zrm
Does anybody make a reasonable 16:10 aspect ratio 4K monitor yet? Because I'd
buy it, but I can also stick with 2K forever if nobody wants to shut up and
take my money.

------
alyandon
I'd possibly consider upgrading from my 2x Dell 2408WFP setup if modern IPS
panels weren't such absolute garbage with respect to backlight bleed compared
to the IPS panels made 15 years ago.

Even my old laptops with IPS panels look better than newer ones. I was in the
market for a laptop last year and returned 3 different laptops with IPS panel
displays from 3 different manufacturers because every single one of them had
backlight bleed that was so bad that it was plainly visible even in a well lit
room. The last company pushed back trying to claim that that level of
backlight bleed was considered "normal" but eventually refunded my purchase
once I threatened to do a chargeback through my credit card company.

Do there actually exist any companies anymore that produce decent monitors
with minimal backlight bleed?

------
Zardoz84
As a guy who does back- and front-end programming and gaming on the same PC,
my subjective opinion is that having a decent color gamut is important. I've
seen details that on an old monitor (with a worse color gamut) disappear or
look worse. Also, an ultra-wide screen gives a lot of screen space to handle
a code editor/IDE and a browser on the same screen at the same time. At some
point in the future, I would replace my secondary monitor following these
priorities:

* Correct color gamut.

* Minimum resolution of 1080p, better if it's 1440p.

* Minimum refresh rate of 60hz, better if it's at least 75hz.

This pushes me to get an LG IPS monitor like the ultra-wide main screen that
I have now.

------
adamch
Wow. He was right. I just changed my 15 inch MacBook Pro's scaling away from
the default and WOW, the text is clear. I also just switched my second
monitor to 144hz and WOW, the animations are fast. My work-from-home setup
just got significantly better.

------
egypturnash
Huh, 4k screens have come down. I could replace this eight-year-old 24" 93dpi
Dell screen with a 27" 183dpi HP screen for $540 (plus shipping). I think
that's around what I paid for this one when it was new.

(why that HP screen? First thing on Wirecutter's 'best monitors' list, which I
think was how I chose the one I'm looking at right now.)

I guess it'd be nice to turn off Illustrator's glitchy anti-aliased previews
for good. But this monitor still works perfectly fine. Call me when I can get
a color e-paper display with a 60hz or better refresh rate, I'm increasingly
tired of staring into a giant light all the time.

------
morpheuskafka
Wow, I just turned off the font smoothing setting on my 4K monitor and the
text is dramatically easier to read. Really surprised that Apple doesn't
provide a better explanation of this feature / turn it off on HiDPI displays.

------
asdff
4k monitors are still so much more expensive and have a noticeable
performance cost, at least on laptops. When I got a MacBook Air with USB-C, I
was initially planning on getting two 4k monitors. However, that would mean
spending ~$500 to get a 27" monitor, $120 for a thunderbolt adapter that can
drive both displays at their proper refresh rate, and >$200 if I wanted an
adapter that can also charge the laptop.

All told, that's the total price of the laptop just in peripherals. I ended
up going with a $30 USB-C adapter and some ~$100 Dell 1080p screens, and I'm
happy as a clam.

------
xyst
Personally, my takeaway from this article is to disable font smoothing. I
disabled it on my MBP and I like how the default fonts are rendered now. They
appear less bright and crisper. I hadn't changed these settings on my MBP
since I got it, so it was likely enabled by default. The rationale behind it
makes sense in low resolution cases, but having it enabled by default on a
rMBP doesn't make sense to me.

However the recommendation to reduce the resolution scaling option is a no for
me. The text is wayyyy too big for me and I lose out on the screen real
estate.

~~~
elondaits
That part of the article does not apply to my 13'' 2015 MBP... I checked and
it already uses 2x resolution by default. Perhaps it applies to larger screens
or newer models.

------
Const-me
Agree about resolution.

I was an early adopter. It didn’t happen voluntarily though; my client wanted
me to develop embedded Linux software that drives a 4K HDMI output, so I was
kinda forced to upgrade. Never looked back. When that monitor broke just
after its 3-year warranty expired, I bought a very similar one with the same
specs from another brand, 27” 4k IPS.

However, I think 120Hz is overkill for programmers. The price/performance
ratio isn't great either; my current BenQ PD2700U goes for just over $500 in
the US, while the monitors recommended by OP are above two grand.

------
dkersten
I use a 32" 1440p monitor and while I would have liked to get a 4k one
instead, even with getting tax back by declaring it as a business expense
(since I use it primarily for work), 4k still would have been more than I was
willing to spend on it right now. I don't want to use a smaller monitor (might
even like one size larger) and tend to use large fonts anyway since my
eyesight isn't super great, so 1440p seemed like a much better fit for me,
given the price of a good 4k one. Maybe in a year or two I can upgrade.

------
jiggawatts
This is a fantastically well written blog article, and there needs to be more
attention paid by the industry to issues such as these ones raised by the
author.

I've generally found that all the IT vendors have an incredibly lazy and
sloppy attitude to imaging of all types, not just display scaling or font
rendering. "The pixels went on to the display! Good enough! Ship it!" is the
attitude.

Compare this to televisions, where 120 Hz is common, 8K is a thing now, and
wide gamuts and HDR are not only supported but _consistent_ and even
_calibrated_ out of the box! Dolby and Netflix in particular have put
significant effort into making sure their customers see the original artistic
intent, not some faded or garish mess.

In the PC, Mac, and smartphone world it is literally impossible to send
someone either a still image or video that'll look even vaguely correct. It'll
be too bright, or too dim, faded, or worse: the colours will be stretched to
the maximum display gamut. Photos of faces look like zombies or clowns in full
makeup.

HDR is simply impossible to use for still image exchange. No matter what some
random JPEG committee is working on, don't even _think_ about such nonsense as
sending someone a HDR file, it will practically never work, no matter what
display the recipient is using. Or even just an SDR 10-bit file. You're
wasting your time.

To give you an idea of how _insanely bad_ this is, for years and years if you
enabled HDR for your display under Windows 10 it would simultaneously wash out
the desktop colours, dim it until it was way too dark, and then blur
everything on top of that so that your 4K monitor is pointless.

Clearly _nobody_ actually tested HDR support at Microsoft. After they received
complaints, both to their tech support and in public... they did nothing. For
years. YEARS.

I just find the whole thing incredibly sad. It's 2020. The future! The
physical technology for nearly perfect display output has been available for
years, but... no. You just can't have it, because the software won't allow
you.

It doesn't matter what display you buy. You can't have HDR, or 10-bit, or
colour management on the desktop, or your browser.

Firefox has had full ICC colour management for many years now. It's off by
default. Internet Explorer truncates everything to sRGB even on a wide-gamut
screen. Safari stretches colours for un-tagged images.

The only HDR and colour-managed thing you can reliably do on a PC is watch
Netflix. That's... it.

------
itnAAnti
I had a 34" 4K monitor in 2019, and it was a mess in Windows 10. I sold it and
picked up an ultra-wide, 38" 3840x1600 (~109 PPI) and I couldn't be happier. I
agree that a "good" monitor is a necessity, but I disagree with OP that "good"
= 4K 120 Hz.

With any legacy application, and even many modern ones, UIs were either fuzzy,
or tiny, or a mix of the two, even with all of the Windows per-program custom
high-DPI settings tweaked. The UX was not consistent enough to be enjoyable,
so I ditched it.

~~~
amarshall
For what it’s worth, the “fuzzy” version is usually about what you’d be
getting on a display if it wasn’t HiDPI. Also, don’t forget the rarer, but
still possible, scenario where the UI is scaled twice for some reason, so
everything is huge.

------
opan
Bitmap fonts look crisp and clear at small sizes. I find I enjoy them much
more than a lot of the fonts in the article. Try out Terminus or Tewi at 9pt
if you're curious. They make Fira Code and Inconsolata look blurry by
comparison.

I recently changed my terminal emulator from Termite to Alacritty because
Pango broke bitmap fonts, and Termite uses Pango. Unfortunately several other
things I use are still using Pango, and I've had to switch to subpar fonts in
waybar and a few other spots.

------
musicale
I used non-antialiased bitmap fonts on a low-resolution display for terminal
windows for the past 3 months, and it was great. Now I've switched back to 5K
and it's fine, but there really was nothing wrong with bitmap fonts.

I also tried a vintage "mechanical" switch keyboard and mouse, just for fun. I
think modern macOS doesn't work quite as well with a mouse because of the
weird, narrow, disappearing scrollbars (though a mouse scroll wheel or scroll
ball can help with that.)

------
sholladay
Having read the article, I'm still not able to fully grasp why the old
techniques for improving text legibility can't be "scaled" with screen
resolution. We had imprecise pixels before and we still do, in the sense that
they are very finite. I totally get why smoothing and other tricks aren't as
necessary as they used to be, but I don't get why they have to be detrimental
on Retina displays. At what resolution does this reversal happen and why?

------
rubatuga
Good to know details:

In 2018, as the article says, macOS stopped using subpixel antialiasing
across all Macs. On low DPI screens, however, the loss of subpixel
antialiasing made fonts look blurrier, smaller, and less dark. This occurs on
high DPI screens as well, albeit to a lesser degree. To bring low DPI screens
up to the same level of "darkness", the behaviour of "Use font smoothing when
available" was changed to simply make fonts bolder. This is a plus for people
who have difficulty reading the screen.

Furthermore, the author seems to have an obsession with "pixel-crisp" fonts
in the section about the correct resolution for your display. What macOS does
is render at 2x resolution, then scale the image down to the monitor's native
resolution. Looking back, this was an amazing choice by Apple, and has
allowed for an amazingly bug-free high DPI experience. Even today, Linux and
Windows have serious issues with high DPI. 99% of macOS users don't notice
that their desktop is not running at native resolution, nor would they care.
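
Concretely, for a "looks like 2560x1440" mode on a 4K panel, the sizes
involved work out like this (a sketch of the arithmetic, not Apple's actual
code):

```python
logical = (2560, 1440)                     # what the UI believes it has
backing = tuple(2 * v for v in logical)    # macOS renders at 2x: 5120x2880
native = (3840, 2160)                      # then downsamples to the panel
print(backing, "->", native)               # every frame filtered ~1.33x down
```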

If the author had just focused on monitors, this article would be a lot
better.

Also, 120Hz might be nice for a desktop, but if you are plugging in a laptop,
you will notice your power consumption double. For battery life, choose a 60Hz
monitor.

------
akersten
It's interesting to see an argument against ClearType/font smoothing, but not
anti-aliasing in general. They both change the font from what the designer
intended, so why is font smoothing so much worse?

I actually disable anti-aliasing in games on my 4K display - not just for
performance, but because anti-aliasing genuinely isn't needed at that density.
I think the author could go all the way here too, since after all, AA is just
another crutch for low-density displays.

------
yellowapple
I agree with the problem and the severity of it, but I disagree with the
proposed solution. Don't get me wrong, the more screen real estate the better,
but I recently switched to GNU Unifont as my primary font and _wow_ is it nice
and crystal-clear at 12pt. Bitmap fonts might be "ugly" and inflexible, but
they do seem to entirely sidestep boneheaded antialiasing strategies (provided
they're configured to not be needlessly antialiased).

------
defgeneric
If I could just get an affordable 4:3 flatscreen monitor with any pixel
density I would be happy.

------
localhost
I'm currently using a Sony 43" 4K TV [1] as my monitor. It supports full
4:4:4 chroma with no subsampling [2], which makes a huge impact on visual
quality. It's also quite inexpensive. I find that the height of the monitor
is about as big as I can tolerate. I certainly wouldn't mind higher
resolution, but that doesn't exist at this size.

LG now has a 48" OLED TV [3] that supports a 120Hz refresh rate. I'm looking
forward to trying that out. Either that or the new Samsung Odyssey G9
ultrawide [4] which is about the same price. It's also 240Hz but with VA
pixels (which apparently aren't _that_ bad). The G9 will be better on the
vertical axis (not too tall) given its size. The extreme curvature is also
interesting - not sure about that yet.

[1] [https://www.amazon.com/Sony-KD43X720E-43-Inch-Ultra-Smart/dp/B071FFFRXM](https://www.amazon.com/Sony-KD43X720E-43-Inch-Ultra-Smart/dp/B071FFFRXM)

[2] [https://www.rtings.com/tv/learn/chroma-subsampling](https://www.rtings.com/tv/learn/chroma-subsampling)

[3] [https://www.lg.com/us/tvs/lg-oled48cxpub-oled-4k-tv](https://www.lg.com/us/tvs/lg-oled48cxpub-oled-4k-tv)

[4] [https://www.samsung.com/us/computing/monitors/gaming/49--odyssey-g9-gaming-monitor-lc49g95tssnxza/](https://www.samsung.com/us/computing/monitors/gaming/49--odyssey-g9-gaming-monitor-lc49g95tssnxza/)

------
Ycros
On lower dpi monitors I use bitmap fonts with no smoothing for coding, they
are crisp. See
[http://upperbounds.net/index.php?menu=download](http://upperbounds.net/index.php?menu=download)
for a bunch of these fonts (I use Proggy Clean with slashed zeroes and bold
punctuation). Note that you may have to play with your font size and turn off
any antialiasing in your editor to get it looking right.

------
stephc_int13
I've been using the same three Dell monitors since 2008. They still work
perfectly (1680x1050), and in my opinion the text rendering with ClearType
subpixel AA is excellent in both Sublime Text and VSCode.

Of course, on macOS that's another story. It didn't always have ugly fonts,
but that's been the case for a very long time now, and it has kept me from
switching more than once (I still use macOS in a VM to build for iOS).

~~~
asdff
I'm not sure which version exactly, but somewhere along the line apple just
ruined the experience for everyone without a retina laptop.

------
WatchDog
If you want a bit more control over your monitor resolutions on macOS, I
recommend RDM[0].

It lets you change a few settings around DPI and refresh rate that aren't
otherwise available via the regular UI.

Not all desired modes will work; it's still fundamentally limited by macOS's
crappy display support, but it's better than nothing.

[0] [https://github.com/usr-sse2/RDM](https://github.com/usr-sse2/RDM)

------
laurentdc
I don't know. Text is important to me, but high refresh rate is much more.

Even scrolling text or dragging windows around at >120 Hz is a game changer.
Going back to the MBP display gives me an instant headache until I re-adapt.
It's hard to describe; it's like all of a sudden everything has a very
distracting input lag.

Now I just need to wait for a 4k 144Hz IPS that can be driven by an MBP (or
just get what's on the market and run an eGPU).

~~~
JohnBooty
At the end of the article, the author mentions the Acer Nitro XV273K, which
Macs can drive at 4K@120hz (though you have to jump through a hoop in the
Displays prefs on each boot)

------
Havoc
I bought a 4k screen for same programming rationale. Deep regret.

From my CRT days I know I can see flicker at 60hz but not at 75hz. I had
stupidly assumed that modern 60hz with freesync would be OK. It's not.
Especially with grey colours for some reason (Asus screen).

The fact that I was coming from a 120hz laptop screen didn't help the
situation either.

Anyway... so I'm buying a 144hz 1440p gaming screen next. Screw this
resolution game... I need flicker-free for my sanity.

~~~
smabie
I have a Dell S2716DG [[https://www.dell.com/lv/business/p/dell-s2716dg-monitor/pd](https://www.dell.com/lv/business/p/dell-s2716dg-monitor/pd)]
and I absolutely love it: 1440p, 27in, 144hz, 1ms response time (in fast
mode), and G-Sync. It's great for games and looks very classy. Also, the
price isn't bad at around ~$500. I would highly recommend it.

~~~
Havoc
Nice screen. It seems to be quite expensive on the UK side, unfortunately.
Almost $750 USD on Amazon.

------
alwaysanon
To his comment about truly disabling font smoothing on Windows: in addition
to disabling ClearType, there is also a separate font smoothing setting to
turn off. You can disable it like so:

* Open System in the Control Panel

* Click on Advanced system settings

* Go to the Advanced tab

* Click the Settings button under Performance

* Uncheck the box for "Smooth edges of screen fonts"

* Click Apply
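
If you'd rather script it, the same toggle lives in the registry (a sketch
using Python's winreg; the key name is the documented one, but treat this as
an assumption and prefer the dialog above - the change may only take effect
after logging off and back on):

```python
# Turn off font smoothing via HKCU\Control Panel\Desktop (Windows only).
import winreg

with winreg.OpenKey(winreg.HKEY_CURRENT_USER, r"Control Panel\Desktop",
                    0, winreg.KEY_SET_VALUE) as key:
    winreg.SetValueEx(key, "FontSmoothing", 0, winreg.REG_SZ, "0")
```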

------
gauchojs
Conclusion after reading tens of comments here: there's no way to know if I'm
young enough for 27" 4k (which is finally affordable in my country) or if I
should go with 2560x1440 at 24", which feels outdated in 2020...

And that isn't a great step up over my working 10-year-old 22" FHD Dell.

How long until we have affordable 'Macbook like' crispness as 22" or 27"?

(Also Linux still sucks with fractional scaling.)

~~~
fenesiistvan
I am on a beach, read 500 comments, totally confused about what monitor to buy
:)

------
rs23296008n1
The guy needs to chill. Relax. He's got a point about PPI, but don't get too
hung up on pixels. Your eyes are your guide.

For my purposes a single 4k 43" monitor is fine. It's not ultrawide either.
No need to upgrade. I have another identical one at the other place I "nest".
It's like using 4x HD (1920x1080?) monitors but with no bezel in between.
Very comfortable, and the text doesn't hide in obscurity. Plenty of space for
everything, and it doubles as an excellent TV when needed.

Personally I think most of the trouble people have is due to staring at
screens too long. You shouldn't be spending more than an hour staring at a
screen continuously anyway. Even thats probably excessive. Take a break.

It's kind of like those people whose eyes are so tired they change their
preferences to light text on a light background. Ummmm. No. Not good.

But I'm an old programmer, amongst other hats, so what would I know? I
remember when 640x480 was crystal clear and so sharp, and 1024x768 was "wow",
extra real estate!

But yeah. Go higher res and bump up the physical size at the same time.
Completely addictive if you have the desk space.

------
h2odragon
That's a lot of words to say that vector fonts suck at sizes under 24 px. Use
bitmaps; there are lots of them to mitigate that problem. It was not that
long ago that 640x480 was a screen, and it sucked, but things got done.

My last (almost) 2k CRT lasted just long enough that I could afford a 4k LCD
when it died. I'm pleased with a pair of these, but once MOAR PIXELS become
affordable I'll buy them.

------
zzo38computer
Really the problem is they are using vector fonts; I use bitmap fonts in my
xterm, and it is not as bad as the vector fonts used in Firefox. The text is
not blurry if you are using bitmap fonts. How can I force Firefox and other
programs to prefer bitmap fonts? (What I have tried just results in no text
being displayed at all.)

(High DPI is probably useful if you are doing a lot of print previewing,
though.)

~~~
GoblinSlayer
Uncheck the checkbox "allow sites to use their own fonts", then Firefox will
use configured fonts.

~~~
zzo38computer
I did that. But I still can't seem to configure it to use bitmap fonts. I got
it to use bitmap fonts for the tab titles, the location bar, and the status
bar, but it won't use bitmap fonts for anything else.

------
novok
The 4k 120hz monitors he is recommending all have a fan noise issue, since
they all use the same basic design (and, I think, the same panel) with a
dinky small fan in the back. I did a deep dive on it myself and decided in
the end not to do it because of the fan issue:

[https://www.youtube.com/watch?v=o_vUqe7Q6vw](https://www.youtube.com/watch?v=o_vUqe7Q6vw)

~~~
Sephr
The Acer monitor's fan is louder as it also includes eye tracking tech which
results in increased heat output.

------
s9w
My take on having used quite a variety of displays: 1440p with a high (120,
144 etc) refresh rate is the way to go.

With 4K you can lose a lot of performance especially since they are often
paired with integrated graphics. Also what's the point of a higher pixel
density when it's higher than you can resolve? Meaning if you have to zoom
that's a sign that you get nothing more from increasing resolution. Also
despite claims to the contrary, a lot of software is still not well suited for
4k resolutions.

High refresh rates are completely underrated: they get treated as a meme, and
there's a widespread false belief that human eyes have some kind of limit
that is already satisfied by 60hz displays. Scrolling code at 144hz is
_smooth_.

Bonus point that isn't mentioned in the article: HDR is a meme, don't buy into
that crap.

Also: CRTs are criminally underrated and still unfairly judged. We lost
something from that era. Colors, black levels, subjective image quality and
input lag have still not recovered from the peak of display technology in the
year 2000 or so.

~~~
cthalupa
>Bonus point that isn't mentioned in the article: HDR is a meme, don't buy
into that crap.

I'd wager more people are going to notice the difference between HDR and SDR
than they would 4k vs 1080p. It is by far the biggest picture quality upgrade
I've seen since going from SD to HD.

The problem is HDR sucks on LCDs, so it's largely useless on PC. FALD isn't
enough to limit the halos, even with a ton of zones. I own the PG27UQ
mentioned in this article and I hate it.

I'll probably just buy an LG CX 48" for my next monitor.

~~~
s9w
I mean, sure, HDR is impressive, but for me it's always like 3D movies...
nice for 5 minutes and then you see the flaws. Dimming zones are a really
crappy way to get around the fundamental display tech limitation, like you
said. But OLED has insane problems with burn-in. Officially they don't exist,
but there are endless reports of it happening, although I can't speak with
high confidence about that. In general, the major problem with non-OLED
displays is the black levels. Going for HDR was a silly PR move.
~~~
cthalupa
> I mean sure HDR is impressive, but for me it's always like 3D movies.. nice
> for 5 minutes and then you see the flaws

I mean, it basically "just works" on OLEDs. I've watched thousands of hours of
HDR content now and maybe 2-3 hours of it was poorly done and distracting. vs.
3D, which I can never sit through at home. (Though, I don't mind well done 3D
in the theater)

>But OLED has insane problems with burn-in. They officially don't exist but
there are endless reports of this happening. Although I can't speak with high
confidence about that.

I've got two OLED TVs, one four years old, the other two. Neither has had any
noticeable burn-in, though the older one does sometimes have temporary image
retention if I've spent an extended amount of time playing a game, or
watching a channel with a static logo, etc.

I am not one of those people that sits there with MSNBC or CNN or Fox or
whatever on all day every day with the logo in the same place, though. My OLED
usage goes Movies > TV shows > Twitch > Youtube > everything else.

Until a better tech comes out, I'm all in for OLED, even for computer
monitors. You won't see me purchase another standalone LCD screen for anything
that I'll be caring about picture quality on.

~~~
s9w
Ultimately what we really need is micro-led. But that seems to be difficult.

------
chromaton
These monitors and most of the ones mentioned in the thread are tiny. Whatever
happened to the dream of the full-wall display? I really like my 43" 4K LG
monitor, but apparently this type of monitor is rarely produced these days. A
quick Newegg search shows exactly 1 monitor with similar specs and it's more
than twice the cost of what I paid for mine two years ago.

~~~
buserror
I use 2 _32 " 4K monitors side by side; Before that I had 2_30" 1920x1200 (not
1080p!) side by side too; to me it's sufficiently "a wall" that I put them
about 80cm from me, each angled a bit inward.

Quite frankly anything bigger and you get a kink in your neck by having to
move your head so much!

------
wwee
I find this article resonates with me on many points. Particularly, high DPI
and high refresh rates are not gimmicks, they have meaningful impact in making
the user experience better. It's something that Apple has been pushing for
years, from Retina to ProMotion. Sure, not everyone will appreciate the
difference (Some people are happy with Macbook Airs, some need Threadrippers),
but that doesn't mean it's not a meaningful difference for many users.

One area that I disagree though:

> The idea of a 4k monitor is NOT to get more pixels but to get the pixel-
> perfect, high-density UI rendering. Otherwise, a normal 1440p display would
> work better.

Not in my experience. Sure, 2x scaling is ideal for sharpness. But there's a
tradeoff with screen real estate. I regularly switch between a 1440p monitor
at work and a 4k monitor @1.5x scaling at home. Fonts are still noticeably
sharper on the 4k monitor.

4k @1.5x scaling is not _pixel-perfect_, but it is definitely _sharper_ than
1440p.
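
One way to see why (my arithmetic, not the commenter's): even unaligned, 1.5x
scaling on a 4K panel gives each logical pixel more device pixels to work
with than a native 1440p panel does:

```python
print((3840 / 2560) ** 2)   # 2.25 device pixels per logical pixel, 4K @1.5x
print((2560 / 2560) ** 2)   # 1.0 on a native 1440p panel of the same
                            # logical size
```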

------
Androider
3840 x 1600 is the perfect resolution for programming in my opinion, and is
now starting to become more widely available in 38" wide form factor
(equivalent to a "normal" 32" but wider). At this size, you can finally
comfortably have 2-3 normal-size windows side by side: your editor/IDE, your
browser/app, and one third for documentation, without having to go multi-
monitor. The 1600p vertical resolution is fantastic for coding. If you're
interested in this size, the LG 38WN95C is a great one just starting to
become available that hits all the marks: Thunderbolt, a 144Hz refresh rate
(including G-Sync and Adaptive Sync), an IPS panel (a must), and a "normal"
look without any gamer aesthetics or RGB lighting.

Unfortunately it's going to be a very long time before we get high-dpi
equivalents. 4K, which is frequently 1080p equivalent in terms of workspace,
is just not doable after you've been using 1600p.

------
gwbas1c
Here's the thing: reading something extremely close to your eyes is unhealthy
and causes eyestrain. (Your eyes work harder to focus on closer items.)

If your monitor is so close that you can actually see the details described
in the article, you're in eyestrain land.

I keep my monitors far enough away that I don't see these details anymore.

~~~
GregorBrandt
My optometrist tells me this is not true. Exercise does not hurt your eyes.

~~~
chiefsucker
From my limited understanding, it's not the distance that's the problem, but
the constant focus at a point at the same distance. That's the reason they
recommend looking at distant objects every 15 minutes or so. Staring at a
tree for ten hours straight would have a similar effect, but woods are so
complex that your eyes are constantly refocusing.

------
hareyama
I have been using 42" 4k monitor. It is suitable for some task that require 4
or more windows at same time. For example, development of Field Programmable
Gate Array(FPGA) use more than 4 windows (FPGA CAD, Waveform, HDL editor,
explorer, etc...). It is comfortable that use these windows without switching
of windows.

------
sasaf5
Developers' text editors won't benefit from 120Hz monitors: their scrolling
is not smooth pixel-by-pixel, but line-by-line. Also, most developers
navigate code with search and won't need scrolling anyway. I would rather put
my cash toward a display that renders still images better and is comfortable
on the eyes.

------
adtac
This is going to sound like I want to eat the cake and still have it, but I
wish there was some sort of balance between real estate, sharpness, and
affordability. A 24in 4K monitor without scaling makes text too small, but at
200%, it's no different from a 1080p monitor real estate-wise (even though
text is much crisper). My current monitor, a 1440p 24in screen at 100%, serves
all of my real estate needs, but leaves something to be desired in terms of
sharpness, so a 5K screen at 200% and similar physical size would be perfect;
alas, there aren't any general-purpose 5K monitors at an affordable price. 8K
screens at 300% would offer similar, albeit slightly lower, real estate at
even better text sharpness, but there's just one such monitor from Dell and
it's ridiculously expensive.

I hope 5K monitors become as cheap as 4K monitors over the next few years.

~~~
Dylan16807
> 8K screens at 300% would offer similar, albeit slightly lower, real estate
> at even better text sharpness

5K @ 200% = 5120 / 2 = 2560

8K @ 300% = 7680 / 3 = 2560

~~~
adtac
Oops, you're right, I knew I should've just used Python instead of trying to
do that in my head :)

------
biosed
Using 55" LG OLED @4k as my main display, nothing compares.

~~~
_JamesA_
Have you experienced any of the reported OLED burn-in issues?

I would love to upgrade to an OLED but my screens are on at least 16 hours a
day with semi-static content. That's a hefty price to take a chance on burn-
in.

~~~
biosed
Its the C9 and I have all the anti burnin features turned on, I have no
issues, you can see the tv moving the whole screen a few pixels every so
often.

------
freetime2
I just bought two new monitors this year, and went with 24 inch 1440p. My
reasons were:

* At the distance I sit they look fine to me. In particular I don’t notice any jarring loss in resolution compared to my MacBook retina display which sits next to them.

* The “native” text size on a 24” 1440p monitor is perfect for me. Any smaller and I would struggle to read certain things.

* I was concerned about HiDPI scaling on Linux. Fractional scaling was still “experimental” at the time I looked.

* I actually care more about other factors like the stand than 4k resolution, and would prefer to put my money towards that.

* When considering buying a single large monitor and running at native 4k resolution, I had concerns about screen sharing. Would people on smaller monitors have trouble with scaling? Also, having two separate monitors helps me with window management.

------
edanuff
Echoing some of the comments here, part of this depends on whether you’re
using MacOS or not and what else you’re using the monitors for. I use 4K
monitors for both Mac and Windows and I stick close to the Apple recommended
HiDPI approach which means your 4K display isn’t larger than 24” and your 5K
display isn’t larger than 27”. YMMV but most people who have a bad 4K
experience are trying to do 4K at 27” or greater. Due to DisplayPort 1.2
bandwidth limitations, the 5K 27” monitor market never got a great range of
options. You either have the Thunderbolt-based monitors that are mostly for
use in the Mac ecosystem (and only some Mac models), a few expensive
dual-cable DisplayPort 1.2 options, or DisplayPort 1.4 options with an
assortment of compatibility issues.

------
ajtjp
Some people have mentioned that once you use 4K, you can't go back. I guess
I'm in the group that has used 4K (or 5K, in my case), and found it easy to go
back.

At work, we have these 5K, 27" LG monitors: [https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k-display](https://www.apple.com/shop/product/HMUB2LL/A/lg-ultrafine-5k-display).
Don't get me wrong, they're great, but... I don't feel
like I'm missing anything when I go home and use my Dell U2412M (1920x1200,
24"). And working from home the past three months hasn't tempted me to upgrade
my monitor, either. 1920x1200 gives me a good amount of real estate at 100%
scaling, and that's the main thing I need.

Now it is true that if I look at the font in this text box in Vivaldi, the
rendering kind of sucks; Firefox is a bit better. There was definitely a time,
around when I switched to Windows 6.x from 5.x, when I was passionate about
this, and dislike for the Windows 6 font rendering was one of the reasons I
stuck with XP as long as I did. Maybe 4K would help with that, but by this
point I've adjusted. And even so, spending $900 on the monitor the author
suggests is not the most appealing option. That's a lot of GPU upgrade you
could get relative to the $300 I spent on my current monitor in 2011. Or I
could double the RAM in all my personal systems and get a nice microphone
(which would have more of an impact on my working-from-home experience). Or I
could build an entire Ryzen 7 3700 desktop for that price. There are just a
lot of things I'd rather prioritize.
lot of things I'd rather prioritize.

I also find the proposed productivity benefits dubious. As long as the fonts
are acceptable enough that I don't find them actively distracting (and as
mentioned, with bad enough font rendering that can be the case, so perhaps
the author is simply more sensitive to that than I am), how much I can focus
is going to have a lot more impact on my productivity.

I do somewhat fear that this might re-ignite my former font pickiness,
though...

------
tyho
I have the ultrafine 5k. 27 glorious inches @ 218ppi. Not only does this
provide a colossal amount of real estate for crisp text (I run with no
scaling, which requires good eyesight), but combined with a macbook pro,
provides a near perfect computing solution. One cable provides video, audio,
usb hub, webcam and power. Two monitors might be better, but I find leaving
the laptop screen open on a stand works well. One quarter of the ultrafine
provides 2560x1440 resolution, so side by side, or corner layouts work very
well.

[https://www.lg.com/us/monitors/lg-27MD5KA-B-5k-uhd-led-monitor](https://www.lg.com/us/monitors/lg-27MD5KA-B-5k-uhd-led-monitor)

~~~
read_if_gay_
> I run with no scaling, which requires good eyesight

I have been on the fence about upgrading to a large hi-res screen specifically
with the aim of maximizing screen real estate, which effectively means little
scaling. However I couldn't find much info about running hi-res monitors with
little or no scaling. Could you share your experience and/or post a
screenshot, just to get an idea of how big UIs end up being?

~~~
tyho
Judging by the resolution of my screenshot, it appears as though there is
still some scaling going on. I can effectively reduce the scaling by zooming
out in VSCode, but I'm limited by my eyes, not by the resolution of the
monitor. In other words, the text becomes illegible far before it becomes
pixel limited.

[https://i.imgur.com/PXyOr7R.png](https://i.imgur.com/PXyOr7R.png)

~~~
read_if_gay_
Thanks, that is massive but surprisingly readable even at 27"/1440p. Nice
cowsay btw.

------
mianos
I am old old old, and I agree. This is probably something Mac users would
notice once they are used to 'retina' resolution. At work we have two monitors
on stalks and laptops. Work provides laptops to plug into these. The big
(23"?) monitors are big, that's all. They hold less text than the laptop
screen and it's much worse to read. I just use the laptop as it's easier on
the eyes. At home I always had multiple monitors but once I got a 4K monitor,
at most I have the laptop open for more screen, often CNBFed. All this is
probably more based on what you are used to. If you have big screens and
nothing of 'retina' resolution you are not going to care until you do.

------
foobarian
All my life I had an obsession with more pixels. As kids we always envied the
classmates with the highest resolution graphics cards and monitors. The
pinnacle of my life at one point was the unnamed best CRT ever made which got
up to 1280x1024, but then LCDs happened.

My new pinnacle is a beautiful 32" IPS panel on a Benq 4k monitor. I don't
care for the refresh rate jump from 60 to 120 as much as 30 to 60. But I
absolutely insist on large panel area, and lots of pixels to fill it with, so
I don't understand how the blog author can live with 27". This is basically
programmer nirvana and I don't know what could make it better, maybe some kind
of VR setup with similar PPI but I doubt it.

~~~
sneak
That's a pretty low ppi setup you describe. All of the retina iMacs are 27"
5k, so >200 ppi. A 27" 4k is 160ppi, and a bigger 4k is lower yet.
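
For reference, the arithmetic behind those density figures, as a rough sketch
(panel diagonals are nominal, so real values vary slightly):

    import math

    def ppi(width_px, height_px, diagonal_in):
        # pixels per inch along the panel diagonal
        return math.hypot(width_px, height_px) / diagonal_in

    print(ppi(5120, 2880, 27))         # 27" 5K  -> ~218 ppi
    print(ppi(3840, 2160, 27))         # 27" 4K  -> ~163 ppi
    print(ppi(3840, 2160, 32))         # 32" 4K  -> ~138 ppi
    print(25.4 / ppi(3840, 2160, 27))  # dot pitch in mm -> ~0.156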

~~~
foobarian
Indeed. It could be my eyesight, but given a 4k monitor at 27" or 32", I
prefer 32" because I can see the (equivalent) information better.

------
leokennis
One key takeaway is that using external monitors connected to a Mac sucks.

My eyes are not so good and I like big text:

* On my Windows 10 laptop connected to a run of the mill 24" 1920x1080 Dell monitor, I put the scaling at 125% or 150% and everything is rendered bigger and sharp. Maybe one or two dinosaur apps are upscaled and blurry.

* On MacOS, the serious OS for graphics people, I can either render everything sharp at 1x (with tiny menu bars) or have everything rendered as a blurry mess.

On a sidenote... I'm liking my Mac less and less. The reasons to stick with
it: no ads in my start menu or tracking in my calculator app like on Windows,
and no need to install crappy third-party drivers to get peripherals to work.

Is it just me or is that insane in 2020?

------
submeta
> notebooks are not good for development. They are great in mobility and
> convenience, and this argument might outweigh everything else for some
> people. I accept that. But still, a desktop monitor + external keyboard are
> always better than a notebook.

This!

I can't understand how some folks can make dev work with the display of their
laptop alone, without an external monitor. And then there's the trend of
trying to set up the iPad Pro as the sole working environment, with an
external keyboard and a trackpad. Watching those (youtube stars) trying hard
to make the setup work, looking down on a tiny screen, the angle alone! We
have had much better setups for years; why give them up for an inferior one?

~~~
kovek
RE notebooks vs monitor+keyboard. I have a work macbook, a personal macbook,
and a personal thinkpad. I keep work context on the work laptop, one personal
project context (ui dev) on my macbook, and one personal project context
(backend dev) on my thinkpad.

I really like that context switching is a physical experience, which feels
much more intuitive, concrete and solid as in "it works". I don't have to
open/close tabs, do cmd-tab or maybe ctrl-tab to find the right window, or
switch to another digital workspace to switch to the other context.

Additionally, those devices being laptops allows me to use them almost
anywhere on battery. Whenever I put a sweater under them, I can get them into
a comfortable position. Managing multiple tiled windows on one screen is
overhead, so I am fine focusing on one window at a time, which lets me use a
slightly larger font, which means I don't have to squint or bend forward too
much.

I do think it would be nice if I had slightly larger screens.

------
upofadown
I just got used to looking at dot matrix fonts. So my fonts are completely
crisp even at low resolutions. In the end it is just patterns of dots.

Currently doing 6x10 fonts on a 21 inch 1920x1080 monitor. A higher rez
monitor would actually end up being a nuisance.

------
ashton314
Nice article. But for a post griping about illegibility of bad character
rendering, _why_ did they decide to make the background this egg-yolk yellow?
I'd prefer a white background to that mess. Is there some reason why they
chose yellow?

------
hyperluz
I'm typing this from a Surface. And I have an iPad Air 3 too. There are people
who get eyestrain and headaches from using both devices, due to the font
rendering system and the frequency of the backlight oscillating to adjust
brightness.

I even installed something called Dithering Settings for Intel Graphics and
bought/installed a program called Iris on my Surface Pro.

I searched this article for both "eyestrain" and "fatigue" with no results.

I think engineers and nerds should resolve problems regarding ergonomics and
eyestrain.

I feel like I've been cheated by the industry since my first Samsung
SyncMaster 3, running at 60 Hz, with everyone saying it was safe and that the
eye could see no more than 60 Hz (BS!!)

------
aqme28
I wish it spoke more about framerate. I find the refresh rate affects
eyestrain for me a lot, so I got a higher framerate display.

The difference between 1440p and 4k is barely noticeable I find. I'd rather
have 1440p and high refresh over 4k and 60hz.

------
x3sphere
I wouldn’t get a 4K display since most of them are still only 60Hz. Yeah, you
don’t need a higher refresh rate to program but it makes daily desktop use SO
much more pleasant for me.

There are some high refresh rate 4K 27”s but those are pretty expensive.

~~~
TacoToni
I'm personally waiting for a good 4k HDR >60Hz monitor with USB-C. It would be
perfect for gaming and office use, plugging in my Mac & PC.

------
ranaexmachina
I just wish there were more 5K displays out there. A few months ago I bought
and returned a 4K display because I'm so used to 1440p that losing real
estate with 4K at 2x scaling bothered me.

At the moment I have to upscale the fonts on most websites (e.g. 130 % on HN).
Fonts look very nice (although not as nice as on the 4K display) but most UI
elements are very small. Fortunately that isn't a real problem since my
workflow is keyboard-driven anyway, so I rarely have to click on any buttons.

In the end it's still better than the 27" 1080p display I have in my office
where you could see every damn pixel. I really don't miss working there.

------
nennes
Based on the article, it looks like a poor man's decent option is a 1440p
monitor at 60Hz that is running at 1:1 scaling.

Benefits:

* Cheap-ish

* Reasonable resolution with crisp text

* Small difference between 60 and 120Hz

Drawbacks:

* If you want crisp text and reasonable real estate, forget any other resolution than 1440p. It just won't align with the physical pixels and will look horrible.

I'd love to get a 4k monitor at some point, at least that would give me the
scaling option in MacOS (1440p doesn't), but if I want really crisp text I
have to render the equivalent of 1080, which isn't that much real estate for
something like IntelliJ.

~~~
fenesiistvan
You forgot to mention a very important thing: what size?

------
zargon
I have been using my current home PLP setup for 10 years. Dell U3011
(2560x1600) in middle and Dell 2007FP (1200x1600) on the sides. Off and on I
have looked for high dpi replacements, but never found anything suitable.

At the beginning of the year, my U3011 started blanking out on me. As it
became more frequent, I started shopping again. But wow, the monitor situation
right now is sad. There are no high dpi monitors with enough vertical space.
And since they're all at least as short as 16:9, they're unusable in portrait
as well.

I ended up repairing the U3011. It's going to have to last me a while. There's
just no upgrade path.

------
whalesalad
I adore my HP Z27. It’s 4K, usb-C, charges my 15” MBP and serves as a
thunderbolt hub for other peripherals. The chrome or bezel around the screen
is extremely minimal, and the only light that isn’t the backlight is a tiny
little LED indicator dot that is very subtle and Apple-esque. From time to
time I use it with an old PC and even RPi’s. Real intuitive mode switching and
menus. Highly recommended along with a heavy duty monitor arm to keep it from
shaking.

As much as I miss my 2012 MacBook Air and consider it the greatest computer
I’ve ever owned... I definitely couldn’t go back to a non retina experience.

------
Aunche
I agree with other commenters that monitors are subjective, but I find it
surprising that FAANG companies are so stingy when it comes to monitors. We
get top of the line desktops and laptops, but rather mediocre monitors.

------
runawaybottle
Any suggestions that are not over 500 dollars?

~~~
josho
I have an LG ultrafine 5k, and a cheap BenQ 4k. The difference is night and
day. I hate the BenQ* and will dump it as soon as I can.

My suggestion is to hold off until you can save up and buy one of the
suggested displays when it's in your budget.

* The BenQ has a matte finish which effectively blurs everything, e.g. it's like it reduces the resolution by 10-20%. The colors are not accurate to my macbook display, despite trying to color calibrate. The brightness is dimmer as well, or it loses contrast if I make it bright. Built-in speakers sound awful and volume cannot be controlled by software. I could go on.

~~~
runawaybottle
1200 feels like a lot to spend on a monitor.

~~~
josho
Agreed. I should have also mentioned that I bought my LG as a refurbished unit
which made the purchase easier to stomach.

------
mumbisChungo
Interesting article, but I'll stick with my dual 144 hz 27" 1080p monitors for
less than the cost of a single 4k 144 hz that I wouldn't be able to use for
gaming without investing another $10k in my tower.

------
brento
I need new eyes after viewing this website. I think yellow burned out my
retinas!

------
AtlasBarfed
8K TVs are coming!!! I've been an enthusiastic user of 40" 4K TVs as monitors.
They're only $200-$250 for a decent one! So much real estate.

4k at 40" is basically what the DPI of a 30" 2560x1600 was.

But a 80" monitor for 8k is ridiculous. So with 8K we can finally just pick
the real estate you want and the DPI will be great.

It amuses me that the press always says "what will you watch on 8K!"... this
is just like 4K. The "content" on 4K isn't broadcast, streaming, or disc
based. It's all content generated by game consoles and computer applications,
plus upscaling.

------
tln
Damnit! This article fucked up my setup.

I just went in and fiddled with Display settings and my 4k TV doesn't show 1:1
scaling anymore.

I mean, it's a Catalina issue, but still. Tonsky: May your Helvetica always be
silently replaced with Arial.

------
tbirdz
Another option for low DPI users is to use bitmap fonts instead of vector
ones. I find the blurriness of truetype fonts on low-res displays gives me
greater visual fatigue than the jagged edges of misc-fixed or terminus.

~~~
mrob
Agreed. Anti-aliasing is only needed if you're trying to imitate printed text,
which does not improve programming or writing productivity.

------
mthoms
For those using JetBrains IDE's and MacOS:

Try experimenting with the font rendering settings in the prefs. There's a
"middle" option of "greyscale-only" text aliasing in addition to the regular
off or on settings.

Depending on your monitor, font choice, and personal preferences you might see
an improvement by playing around with these.

Anecdotally, when I had an older MBP with a weak integrated GPU, lowering the
anti-aliasing setting to greyscale (or off) seemed to increase the
responsiveness and framerate on my 4K display in JetBrains IDE's. It was
particularly noticeable when scrolling.

------
fireattack
I only use a hi-dpi monitor with non-100% scaling on my laptop, and to this
day I still can't get over how bad raster images look when they're not shown
1:1 pixel-to-pixel (on web pages, mainly). It's just a blurry mess, integer
scaling or not.

I'm aware this article is mainly about text rendering, just want to point out
something I hate with hi-dpi + non 100% scaling.

Also, on Windows with 125%, I don't find text as blurry as the author showed
on Mac. They look pretty crispy to me. I guess that "scaling twice" thing is a
Mac only issue (at least for text)?

------
jedberg
I have two monitors. A 27" Apple Cinema display from 2012 that I got used in
2015, and an LG Ultra HD (4K) that I picked up last year new.

Yes, the LG is newer, 4K, has a higher DPI, etc. But I often find myself
putting my most important windows on the Apple Display. The color is just so
much more beautiful and vibrant, the clarity and contrast are superior, it's
just so much better.

Sure, I can't fit as much code on it as the LG, but boy is it pretty. Apple
definitely puts quality in their monitors. It's a shame they left the consumer
market in 2016.

------
fuball63
I still use an old boxy CRT as my second monitor. It's at a low resolution,
even for it, but it's primarily for Slack and video meetings. I code and
web-browse using OSX's full-screen swipe mechanism on the main laptop screen.

I often reflect on how blurry and oddly colored the CRT is in comparison, and
how back in the day I never noticed or cared.

It works fine for slack, and I prefer bigger text for chat anyway. Because it
is still working, I don't see a reason to put it in a landfill or be
"recycled".

------
hinkley
I recently bought a Thunderbolt 3 monitor as what I have vowed is my 'Last
Thunderbolt 3 purchase (except replacing broken things)'.

I'm holding out for Thunderbolt 4, when I can (hopefully) treat my wiring
topology like USB instead of daisy chained all to hell and back. Because of
this, I looked at good qualities in a _secondary_ monitor that were survivable
on a primary. For instance sound quality and real estate are less important,
raising the priority of PPI, color gamut, and VESA mount.

------
asenna
I enjoyed the article, good suggestions.

Slightly off topic but I have the 16" Macbook Pro with maxed out graphics, and
as soon as I connect my monitor, the fans start getting ready for lift-off
(~5.6k RPM). It's a Samsung 4k 32" (DP 1.4 to usb-c).

I know this is a common issue, but has anyone been able to resolve it yet? It
drives me crazy that this powerful machine starts sounding like a jet just
from connecting 1 monitor (Apple says it can support four 4k monitors).

The CPU and GPU don't look to be under any stress as such.

~~~
nickreese
Try plugging it into the right side... seriously. Makes all of the difference
on mine.

Also try a USB-C to display port if you can.

------
martopix
"This will make everything on the screen slightly bigger, leaving you
(slightly!) less screen estate. This is expected. My opinion is, a notebook is
a constrained environment by definition. Extra 15% won’t magically turn it
into a huge comfortable desktop. But at least you can enjoy that gorgeous
screen and pixel-crisp fonts. Otherwise, why would you buy a retina screen at
all?"

LOL

Has the author considered that maybe some people prefer more screen space and
less "pixel crisp fonts"?

------
testfoobar
Anyone else get eye strain from retina macbooks? Everything looks _too_
sharp. The sharp contrast between the tiny pixels is visually perfect, but
seemingly the perfection is what is causing me eye strain. Or perhaps it is
some other effect.

It's just that when I use my retina macbook for hours, my eyes hurt. In
particular, it is a relief to go back to an older non-retina macbook that I
use for some minor projects. The non-retina macbook seems less visually
taxing on my eyes.

I would appreciate any ideas.

------
willis936
This presents itself as a treatise, but it very much is not. It totally
ignores critical aspects that should influence monitor choice.

In decreasing importance (imo):

\- cost

\- contrast ratio / black level

\- all aspects related to temporal feel that aren’t refresh rate
(response/decay time, sample and hold, backlight strobing, etc.)

\- color gamut

\- color precision (nobody wants banding from 6 bpc + FRC)

You can’t honestly make recommendations to people for $1000 monitors without
even addressing these points, even if you don’t think they’re important for
your workflow. It is very poor form.

------
jefftk
The other direction is to use a high resolution 1x monitor, without anti-
aliasing. Crisp, legible text, even at very small sizes:
[https://www.jefftk.com/xterm-vs-gnome-
terminal-4x.png](https://www.jefftk.com/xterm-vs-gnome-terminal-4x.png)

I'm currently using a 27" external monitor at 2560x1440 ("QHD") and this is
wide enough for five terminals side-by side at 83 columns each. (Monaco 10pt,
anti-aliasing off, iterm2)
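
As a rough sanity check on those numbers (the glyph and window-chrome widths
below are my assumptions, not measurements; Monaco 10pt is roughly 6 px per
character cell at 1x):

    # Approximate columns per terminal when splitting the screen five ways.
    SCREEN_W = 2560   # px of horizontal resolution
    TERMINALS = 5
    GLYPH_W = 6       # px per character cell (assumed for Monaco 10pt at 1x)
    CHROME = 14       # px of borders/padding per window (assumed)

    cols = (SCREEN_W // TERMINALS - CHROME) // GLYPH_W
    print(cols)       # -> 83 columns per terminal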

------
jonathanedwards
As a geezer with glasses, I find my laptop screen works better than a monitor.
The laptop screen is down in the reading zone of my progressive lenses, yet I
can still look up out the window. A big monitor requires me to use glasses
without any distance vision, making me feel like I'm trapped in a fishbowl.

I also love the fact that my window layout is exactly the same whether I am
working at my desk or a coffeeshop. Or at least when I used to work at
coffeeshops...

------
heybrandons
Nice writeup! I recently switched from a 32" gaming 4k to a 27" 4k on my
macbook and it's incredible. Love the single thunderbolt cable and resolution
<3

------
bilalq
It seems there's no shortage of gripes with Apple's display support.

I'm still incredibly frustrated with just how much of a mess multi-display
docks are with MacOS and how it just doesn't support DisplayPort MST at all.

[https://medium.com/@sebvance/everything-you-need-to-know-
abo...](https://medium.com/@sebvance/everything-you-need-to-know-about-
macbook-pros-and-their-lack-of-displayport-mst-multi-stream-98ce33d64af4)

------
tambourine_man
I’m sorry, this guy writes about readability on a bright yellow background
site?

I also disagree with almost all of his assertions on font antialiasing, but I
think it’s not worth refuting.

------
dusted
I'm still in love with the default sized bitmap font used by xterm, I'd never
want to look at any other font if I had a choice (which I do for terminal
programs)

------
lilyball
For all of the space given to macOS "Font Smoothing", that setting really
doesn't do much at all. I'm testing on macOS 10.15.5 and the setting seems to
do nothing at all for web pages (viewed in Safari), or Terminal.app, or
VSCode, or MacVim, or Xcode's editor. It does affect some of the UI chrome in
Safari's toolbar, and Xcode's sidebar, but not the editor. So it really seems
to have no effect at all on programming.

------
gh123man
Has anyone tried using an LG OLED TV as a monitor? I do. I love OLED - however
the LG TVs have 4 subpixels (RGBW) which messes with font rendering regardless
of what settings you play with.

On Windows, chrome (including chromium edge) seems to have the biggest issue.
Windows cleartype fixes most of the native text elements but some apps like
chrome look fuzzy no matter what.

Has anyone found a good solution for this or for similar displays with non
standard subpixel layouts?

------
lazyjones
My 5-year-old iMac 5K 27" monitor is just fine, thank you. But lately I'm
beginning to notice the effect of the brightness of modern displays on my
eyes...

------
SteveLTN
I'm not sure about the macOS scaling, and I couldn't find any info on the
Internet about my question.

When using non-integer scaling, does macOS actually render fonts at 2x, then
scale it all down?

So you have a 3840x2160 monitor, set mac OS to "look like 2560x1440", it would
render at 5120x2880, then scale it down to 3840x2160?

To me, a more logical solution would be to render the vector fonts at the
target physical resolution (3840x2160) and skip all the scaling.
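
For what it's worth, the render-at-2x-then-downsample pipeline I'm asking
about would look like this (a minimal sketch; the names are mine, not Apple's
API):

    # macOS-style non-integer ("looks like") scaling: render into a 2x
    # backing store, then downsample the frame to the panel's native pixels.
    def scaled_pipeline(logical, panel):
        backing = (logical[0] * 2, logical[1] * 2)  # 2x backing store
        factor = panel[0] / backing[0]              # downsample ratio
        return backing, factor

    backing, factor = scaled_pipeline((2560, 1440), (3840, 2160))
    print(backing)  # (5120, 2880) -- what actually gets rendered
    print(factor)   # 0.75 -- every frame is resampled by this ratio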

------
starpilot
Missing from article are any specific productivity or health benefits that
come from an upgrade to 4K. For a sane guide, check out the /r/monitors google
doc link in
[https://www.reddit.com/r/Monitors/comments/h8vb4s/rmonitors_...](https://www.reddit.com/r/Monitors/comments/h8vb4s/rmonitors_weekly_recommendation_thread_what/)

------
watt
Happy user of 34WK95U 5k 34" monitor here. "Turn OFF “Font smoothing” in
System Preferences → General:" is the best advice I have heard in a year.

------
gabssnake
I was used to HiDPI screens at home and at my previous gig. Now I'm at work
with Linux and some cheap Dell 24"s: just zoom to like 200%, until the text is
huge. If you only fit 80 cols like God intended, the type will be clear, you
can have your monitor further away, and you'll improve your posture and eye
strain. The code might even get better.

I guess coders care little about typography, like everyone else, mostly out
of laziness and lack of appreciation.

------
satvikpendem
I want to get a 240hz ultrawide personally, both for games and instead of
having 3 separate monitors (I hate the bezels between them). Samsung is making
a good one, 49 inches, HDR 1000, 240hz, and adaptive sync [0]

[https://www.anandtech.com/show/15396/sausng-
odyssey-2020-mon...](https://www.anandtech.com/show/15396/sausng-
odyssey-2020-monitors-g7-g9-up-to-240hz)

------
new_realist
I have an 8K monitor, but unfortunately it’s unusable with AMD graphics cards
(amdgpu) under Linux. The NVIDIA proprietary drivers have worked like a champ.

------
ronyfadel
I disabled font smoothing for a hot minute: text became dimmer, and contrast
was reduced. It became much harder to read text and it strained my eyes.

------
ddevault
I have three 1080p displays and one 4K display. Text looks nicer on my 4K
display, but it's just that: pretty. You don't _need_ one.

------
m0zg
If you're using a recent Ubuntu with a 4K display, the best way I have found
to make things readable without making them huge is by enabling "Large text"
under Universal Access. This sort of mimics the old Unity scaling, except you
have to set your Chrome to 150% and your console font size separately now.
Works for me on both Lenovo Carbon X1 and 32" Z32 desktop monitor.

------
mmm_grayons
Great theory. I'd definitely love to have 4k 120hz displays, but until I'm not
broke, I'll stick with my old low-res monitors I picked up for $20 apiece.
They suck, but not everyone has the money for $500 panels, let alone the
insanely-priced four-figure ones the author recommends. I'll probably just
suffer through another decade of garbage text until they get cheap.

~~~
beckingz
electronic waste recycling facility? More like christmas in my opinion.

~~~
mmm_grayons
Yep. I've got my own little fleet of machines, mostly laptops with busted
displays, that I run as servers for my own personal edification. Dirt cheap
and useful. I've even got a pentium box that still works fine.

------
rob74
As far as I'm concerned, if I am going to buy a 4K monitor, I would probably
go for a >40 inch one to be able to fit a lot of text on it rather than say a
32-inch Hi-DPI monitor. With subpixel rendering, fonts look good enough for me
even on Lo-DPI displays. I'm just afraid that with Hi-DPI becoming more and
more common, subpixel rendering will eventually disappear...

~~~
XCSme
Doesn't a 4k monitor fit the same amount of information whether it's 24" or
40", if scaling stays the same?

~~~
rob74
Yeah, but if you are going to use the same scaling for the UI at 24" as you
would at 40", your eyes might start bleeding :)

...my Android smartphone also has the same resolution (actually even slightly
higher) as the monitor I'm using to type this, but I wouldn't dream to use the
same scaling for both.

------
LordHeini
My current setup consists of three 24" 16:10 monitors.

I don't know why I should switch.

That is a total resolution of 5760x1200, which is plenty. With the added
benefit that IDE, browser, terminal, mail, Slack and whatever else neatly
snap into position and can be reached with the press of a button.

I don't see any reason to trade sharper text rendering for a worse aspect
ratio.

16:9 just feels wrong to me; I am working, not at a cinema.

------
csomar
I couldn't care less about resolution. I use three 1080p 27" Dell monitors,
and the only upgrade I did was moving to IPS, which slightly relieved my eye-
strain. Now, if you can find me a monitor which will reduce eye-strain, I'll
just dump money on that. For someone who uses the screen for 10-12 hours on a
normal day, headaches induced by the eyes are my biggest worry.

------
notatoad
This all seems like a lot of hassle to go through for some slightly smoother
font edges. I like high res and high refresh rates and all that other good
stuff, but I'd rather have slightly uglier fonts and not have to do an arcane
dance every time I plug in a screen - I'm pretty sure that erases any
negligible productivity benefits you might get from a better display.

------
habosa
I think a lot of people jumped straight to the comments here to discuss their
personal opinion on ideal monitor setup.

This is a really well-written and deep article by someone who is clearly an
expert. All points are well illustrated and I learned a lot about how my
MacBook renders graphics from reading it.

And yeah at the end it gets into what monitor you should buy. But that wasn't
the point.

------
lexicality
Alternatively, just make your text bigger?

I use 120% zoom on a 1080p monitor and it looks great.

Paying $900 so you can squint at tiny text a bit better seems silly to me...

~~~
cosmotic
It looks great to you but a lot of people think it looks blocky or blurry or
both, and are willing to pay more to have what they prefer.

------
abraxas
We still have so far to go with screen resolution on desktops and laptops.
While 200 dpi monitors are a huge improvement over the ridiculous 1080p or
1440p that most people still use, we won't hit the zone of diminishing
returns until we get near 600 dpi. Yes, even at laptop viewing distance, 200
dpi vs. 600 dpi is like night and day if you have good eyesight.

------
6c696e7578
I love smooth fonts as much as the next guy, but let's not make that an entry
requirement for code. Don't forget that code is about logic:

[https://www.kernel.org/doc/html/v4.10/process/coding-
style.h...](https://www.kernel.org/doc/html/v4.10/process/coding-
style.html#functions)

------
spv
As someone who reads PDFs all day with lots of math in them, I can't live
without my 27-inch 5k display. This post is on point.

------
sly010
I concur with everything said, but it is still a bit unclear to me who this
article is targeting. What is the takeaway? "Anyone who can afford a $1500
monitor should have one, so poor font designers don't have to break their
backs manually hinting fonts anymore"? Isn't that a bit like solving poverty
by moving to a better neighborhood?

------
codebook
I switched to a 34-inch 1440p ultrawide monitor after WFH began. My previous
monitor was a 27-inch 4k. As I put the monitor at the edge of the desk, I
don't notice significant differences in text quality.

As I use a work laptop and a personal desktop, changing to an ultrawide that
supports USB KVM is a godsend. It removes any friction in switching between
the PCs.

------
errantspark
Or you can just use pixel fonts (which imo look better than subpixel-aa fonts
on 4k anyway) and donate the difference in monitor costs to [optimal virtue
signalling charity]. Gaming on a 4k monitor sucks also, I don't know what kind
of supercomputers people are running but my meager 1080Ti can't peg any of the
relevant games to 144Hz at 4k.

~~~
mdorazio
> 144Hz

That's your problem right there. Most people, myself included, are perfectly
fine at 60Hz since we don't do competitive FPS gaming. At that refresh rate,
my 2070 is perfectly happy to render games in 4k and I really don't notice any
significant difference compared to 120Hz. If > 60Hz is really important to
you, it's going to be another generation of video cards before single-card
gaming can handle it well (probably the 3000 series cards will be capable).

------
thih9
Off-topic: I really like how the twitter post has been embedded as a
screenshot rather than using twitter's scripts.

Even if that means that I can't click any link in a tweet; I probably wouldn't
have noticed otherwise.

I prefer the UX (and especially: fast load times) of an image to the one of a
script. Also, an image means fewer requests and fewer potential privacy
issues.

------
Pmop
Currently, I work with a 720p 12.5" display and fixed point bitmap fonts. I'm
moving like crazy these days and can only rely on laptops. And Retina Macs
are quite expensive where I live; plus, we are not allowed to import used
goods from other countries, and importing a new Mac incurs a customs tax
that's very painful to pay.

------
kirstenbirgit
The "120hz dance" reminds me of my "drag my 2 external monitors around in
System Preferences every time I dock my MacBook, otherwise they switch
places"-dance.

It's amazing that all these display bugs still exist when Apple has presumably
invested so much into their $6000 Pro Display.

Dear Apple: Why not spend a little time making the software work, too?

------
fomine3
I'm waiting for a 4K/32-inch/120Hz monitor without local dimming (because
it's power hungry), but still no releases. Maybe it's because most
manufacturers want to add HDR (10-bit color), but 4K/120Hz/10-bit exceeds DP
1.4 bandwidth, so they need to reduce color to 4:2:2 or use DP 2.0 or HDMI
2.1.

I really hope 120Hz becomes standard for work monitors.
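
The bandwidth arithmetic checks out on the back of an envelope (a sketch that
ignores blanking overhead, so real requirements are somewhat higher):

    # Uncompressed video bandwidth vs. DP 1.4's effective data rate.
    def gbps(width, height, hz, bits_per_pixel):
        return width * height * hz * bits_per_pixel / 1e9

    DP_1_4 = 25.92  # Gbps payload: HBR3 x 4 lanes after 8b/10b encoding

    print(gbps(3840, 2160, 120, 30))  # 10-bit RGB:   ~29.9 -> exceeds DP 1.4
    print(gbps(3840, 2160, 120, 20))  # 10-bit 4:2:2: ~19.9 -> fits
    print(gbps(3840, 2160, 120, 24))  # 8-bit RGB:    ~23.9 -> fits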

------
pmiller2
This is all very interesting. I've recently, and very seriously, considered
getting a medium sized UHDTV (say, 40-50") to use as a display. The purpose of
this would mostly be to render very high resolution photos while keeping more
of the image visible at full resolution than is possible on a smaller screen.

Anybody actually gone and done this?

------
hrayr
Well timed article for me. I'm looking to buy an external monitor to go along
with my older (mid 2015) 15" MBP.

I was searching hacker news and elsewhere for opinions last night. I got
confused, frustrated and gave up. Today is no exception. The only thing I know
is, I want a large hi-res monitor that is at least comparable to my laptop.

~~~
tonsky
4k 60Hz should work, no problem. The last Macbook that didn't support it was
the 2012 retina.

------
walrus01
> It’s also possible to run a 4k display at native 3840×2160 pixels. It
> depends on the size of the display, of course, but in my experience, even
> 27” 4k displays are too small to run at 1×. The UI will be too tiny.

There is a really nice LG 43" monitor (not a TV, it uses displayport) which is
optimal for using 1:1 pixel scaling.

------
Aaronstotle
For me, 1440p + high refresh rate is the true sweet spot; the "affordable"
monitor listed in this post is $900.

That's not a bad deal for a 4k high refresh rate monitor, but if you play any
games you would need at least a 2080 or 2080ti, which is another $700-1200.
1440p high refresh monitors go for around $300-400.

~~~
duncanawoods
I struggle to hit 60Hz at 4k with 2080TI even with graphics settings knocked
down a few notches. Maybe next generation.

~~~
kenhwang
1440p144hz seems to be the sweet spot that real world single graphics cards
can max out on.

------
snailerz
Interesting reading, but... I can't understand why he states that a 4k
monitor at 1.5x is worse than a 1440p monitor at 1x...

Won't pixel density still be better, for example, in game textures and
movies?

I have been really interested in buying a new monitor for quite a while, but
I really don't get that part. Can somebody help me with that?

Thanks in advance HN folks!

------
jiveturkey
I'm reminded of the 90s, I landed a job using IRIX as my desktop. In the days
of 72dpi monitors, the 144dpi 21:9 SGI display was a wonder! I _think_ it also
worked on the Mac, but there was no scaling like there is today, where there
is a render pipeline and application pixels != screen pixels.

------
nonick
Too yellow, couldn't read. Changing to dark mode resulted in black fonts on
black background. End of story.

------
bjoli
I inherited an old iMac with a 21" 4k monitor. There is simply no going back.
I just ordered another one to have 2 of them. Everything looks better and even
though almost no displays in a good price range have the same cd/m2 ("nits"),
I am still looking forward to my new 2x4k life!

------
johncalvinyoung
I hate the idea of scaling and blur, but I find I really like 6K-equivalent on
my 27" 4K monitor at my desk (driven by a MBP below the monitor). And no, I
don't have budget for a Pro Display XDR, as beautiful as it is. If I could get
an IPS 6K panel for 15-2500, I'd save my pennies.

------
komali2
> Well, the split does not exist anymore. Since not that long ago (yes, I’m
> too lazy to check) you can have both! You can have a 4k monitor that runs on
> 120 Hz. In fact, that discovery was the main motivation for this article.

With... good color gamut? Without ghosting? If so I want to buy that monitor
today.

------
arturb
As a web developer, I have an Eizo EV2785 with 125% scaling and it has worked
fine so far.

From a code maintenance perspective, I've noticed that if you feel there
isn't enough space on your screen, it might be the right time to refactor and
split the code into smaller chunks: extract another view partial, class, etc.

------
thekevinscott
For the past decade I’ve been holding out hope for a competitive e-ink monitor
to use for coding, but it seems we’re still years away. Dasung and Onyx both
offer (somewhat small) monitors but from what I can tell the reviews are less
than stellar.

A great e-ink monitor - _that_ would be my dream monitor.

~~~
newman8r
I'd love a monitor like that for off-grid, low power use. I'd probably also
want a linux distro specifically tailored to an e-ink format.

------
JayGuerette
37% of programmers are using Retina displays? Since the OSX market share is
~8.5%, your numbers are way off.

------
vcsilva
The author claims that high-resolution displays are a commodity now. That may
be true, but those can be really expensive depending on where you live. In
Brazil, most laptops are still (sadly) sold with 1366x768 displays. 1080p
displays are that much more expensive, let alone 4K.

------
sbierwagen
>But even today you can peek into the future, if you have extra $4,000 to
spare. This is Dell UP3218K, world’s first and only 8k monitor:

Note that the UP3218K isn't the brightest monitor in the world. I've read some
reviews that claim you need a darkened room to see the full color gamut.

~~~
tonsky
I don’t really understand the concerns about brightness. On my current display
I work at 20% brightness, otherwise my eyes start to hurt.

------
dijit
I have been looking to upgrade my aging U2412m's for some time.

But 16:10 is dead; so why bother? It might only be a little extra vertical
space, but it's a huge difference to not have it.

Thankfully recent laptops are returning to 16:10 (Macbooks and Dell XPSs),
but such monitors still do not seem to exist.

~~~
robertoandred
MacBooks never left 16:10, thank god...

~~~
dijit
I seem to recall my macbook (2011) being 16:9, but it turns out 1280x800 is
actually 16:10!

It seems you're right for the retinas and upwards too, that's heartening[0]

[0]
[https://www.theverge.com/circuitbreaker/2018/4/19/17027286/l...](https://www.theverge.com/circuitbreaker/2018/4/19/17027286/laptop-
widescreen-aspect-ratio)

------
0-_-0
I have the same monitor as the author (Acer Nitro XV273K) and it's simply
amazing with VSCode. It makes working from home a very pleasant experience,
I'm not looking forward to going back to the office and a 1080p monitor... Not
to mention how good games look at 4K 120 fps.

------
s_y_n_t_a_x
After going to 1440p, I cannot use 1080p monitors for programming as the text
is too blurry.

Honestly I don't see the need of 2160p on a monitor, it usually just causes
display issues and it's a performance drain.

For gaming/everyday use, 2k w/ a high refresh rate is the sweet spot imo.

~~~
mgr86
It has been years since I gamed regularly, but my brother and I were probably
some of the best at a particular FPS. It had to be about 15+ years ago. When
we played in the physical presence of others, we noticed that we were some of
the only ones to turn the display settings way down while keeping the
resolution at maximum. Probably 800x600 or 1280x1024 in those days. The game
ran smoother, and the less complex textures made movement stand out.

~~~
Aaronstotle
Was it quake by any chance?

~~~
mgr86
No, but in 2002/3 we would play Quake during our CCNA classes every now and
again instead of class work. The hardware was old so it was the original
quake.

I played a lot of MOHAA[0].

[0]
-[https://en.wikipedia.org/wiki/Medal_of_Honor:_Allied_Assault](https://en.wikipedia.org/wiki/Medal_of_Honor:_Allied_Assault)

------
Finesse
How to enable subpixel antialiasing on latest macOS:
[https://apple.stackexchange.com/a/337871/188791](https://apple.stackexchange.com/a/337871/188791)

------
nuc
I have a Dell P2415Q and even though it's relatively old, its density is even
higher than the monitors mentioned there.

It doesn't have the feel of a Macbook's retina, but it's still good enough.

Wouldn't the monitors mentioned feel like they have even less clarity than
this Dell one?

------
screye
I would strongly advocate for a 1440p 34" 21:9 ultra widescreen monitor.

The 21:9 aspect ratio is perfect for having the standard IDE + chrome tab
arrangement. It doesn't have the issues brought in by having multiple monitors
and 1440p is a good middle-ground resolution.

------
mcny
I bought a 1080p 27" HP 27yh for under USD 100 last Black Friday.

I thought my monitor would be squarely middle of the pack, but it turns out
it is barely above 80 ppi.

Display size: 23.53" × 13.24" = 311.5in² (59.77cm × 33.62cm = 2009.68cm²) at
81.59 PPI, 0.3113mm dot pitch, 6657 PPI²

~~~
the_af
The most important thing here: does your new monitor look good for you? Do you
like how it renders graphics/text/whatever-you-use-it-for? If so, please
disregard articles like TFA. They are trying to convince you of something you,
by definition, don't need.

~~~
mcny
Yes, the monitor looks fine. It is connected to an old Dell Optiplex 390 with
an i3-2100 and 8 GB of memory. I don't watch movies or play video games on it
or anything. It is basically a glorified terminal for me to citrix/remote
desktop to work.

I've found my off-brand (Aukey) "blue" mechanical keyboard has been a great
improvement in my quality of life though (even though it is not very
ergonomic).

It is funny how when I started using Visual Studio around 2008 I didn't have a
1920x1080 monitor and I wanted to see all the panels and it was so painful.

Right now, my biggest pain point is my horrible Internet (Wi-Fi) connection.
It is especially painful because I move my mouse or type something and nothing
appears on the display and I don't know whether the remote computer is slow or
my network connection is crapping out again.

Almost feels like I am whining about a non-issue because even fifteen years
ago, I was on a dial-up "soft" modem and it would have been unthinkable to get
pretty much live full 1080p remote desktop.

------
tarsinge
Wow just followed the steps for my retina Macbook and I love it, my thanks to
the author.

------
skizm
Personally I can’t go back to a 60hz monitor at this point. So I really can’t
go to 4K until there is a reasonably priced 144hz monitor with low input lag
and response times. I’m okay sacrificing color for speed (TN panels are okay
with me).

------
mister_hn
Using a 4K monitor with Gnome Shell and scaling 2x. I find it very pleasant
for my eyes

------
andai
Surprised there's no mention of bitmap fonts. Solves all the problems for free
:)

------
jmull
I'm glad I made it to the part that recommended turning off font smoothing on
MacOS with retina display.

I tried it (I guess I've had it on all this time) and, indeed it looks better.
(This is on a recent 27" iMac with Catalina.)

------
euske
I turn off any font/scroll smoothing and animation whenever possible. I use a
good bitmap font for coding. It's readable and snappy. No hinting BS. There's
no need for animation in Emacs or tmux after all.

------
musicale
> Not anymore, after Apple started applying non-integer scaling by default on
> Macbooks

That is the first thing I fix on an Apple laptop - change it to 4x HiDPI
scaling!! I was very annoyed when they changed the default.

------
mindentropy
Is it just me, or were CRT monitors much more soothing and soft on the eyes?
I find LCD monitors to have a very synthetic contrast, brightness and colors,
which are harsh on my eyes.

------
dmoy
You can pry my cheap A- panel Korean no-name brand 8+ year old monitor with
one working button (thankfully the power button!) and a bunch of stuck pixels
from my cold, dead, miserly hands.

~~~
smabie
Why?

~~~
dmoy
Mostly the cheap and miserly part

It works fine, and the cost/benefit of buying a newer and/or more expensive
monitor isn't worth it to me.

I also am the kind of person who drives a <$10k car even though I could pretty
easily afford a >$50k car. Tradeoff isn't worth it imo.

------
nahtnam
I just recently went from 2 4k monitors to 3 2k 144hz monitors. I'd argue as
long as the panel quality is decent, 144hz is a much better quality of life
improvement than 2k to 4k is

------
pao
Those are the ugliest monitors I've ever seen. I thought all monitors were
basically boring rectangles. No, apparently there are monitors with weird
designs on them and shrouds.

------
genpfault
So where are y'all finding 4k, high refresh-rate 16:10 panels?

------
megous
For coding I just use bitmap fonts, and I don't have any of the blurring
issues this blog describes. I can do that even on an old IBM T41's screen,
and the text still looks just fine.

------
ChrisMarshallNY
That's a great, comprehensive article!

But I'm old, and most of the effort is wasted on me.

What I need, is real estate.

I have that. I use an ultrawide (5120 X 1440) monitor, broken into two screens
(3440 X 1440, 1680 X 1440).

Works great.

------
gautamcgoel
I use a 2k Dell monitor which I picked up for around $400 a few years ago. To
me, this strikes a nice middle ground between high resolution and
affordability.

------
40four
Opinions or preferences aside, I thought this was very well written!
Enjoyable to read, and I learned a lot about pixels, fonts, screens, etc.
that I did not know before!

------
zackees
Best screen experience is a 6-foot TV at 4k @ 60hz.

This allowed me to move the screen back 6 feet. The result was an improvement
in my eyesight (I'm nearsighted) within 6 months.

------
perryizgr8
More than resolution, it is aspect ratio that I want. I love my 1920x1200
monitor. I wouldn't trade it for a 2560×1440. The 16:9 ratio is simply too
short.

------
wdb
I think the creator of this site should revisit its dark mode feature,
because when I enable it I get black text on a black background, which makes
it hard to read.

------
shocks
I recently abandoned three 24” hd monitors for a single 43” 4k and I couldn’t
go back.

All the desk space, loads more screen space. I don’t need high dpi as a
programmer.

------
TheBlerch
2k (2560 x 1440) seems to be ideal for 27 in. monitors. Does anyone really
find 2k insufficient on a 27 in.? And I find 1080p fine on a 24 in. monitor.

~~~
andrewzah
This is what I do. 1x27" 1440p horizontal @ 240hz, 1x27" 1440p vertical @
60hz. It's nice.

------
ralmidani
I actually "downgraded" from a 43" 4K to a 30" (2560 x 1600). Trying to focus
strained my eyes, and the brightness from the big monitor meant I had to
position it as far back as I could on my desk, which made it even harder to
focus and caused even more eye strain. I also realized a 16:10 ratio is more
friendly for coding.

Frankly, 4K/5K monitors seem like a gimmick for most people. Especially
puzzling is why you would pack so many pixels into smaller (~27") monitors
and require more power and graphics muscle for imperceptibly "better" images.

------
ngcc_hk
Really good essay. It is like reading the old long-form Ars Technica reviews
of Mac OS. Many things you notice, but aren't sure what they are.

------
platz
I was thinking about a 4k, but i'm not sure I will know how to get everything
to run at 2x scale, on both windows and linux.

Is this hard to do? Caveats?

~~~
ComputerGuru
It’s the default under both Windows 10 and Gnome 3 for hi-dpi displays. In
fact, it’s near impossible to get fractional dpi working across the board on
Gnome 3.

~~~
platz
I use i3 but I'm sure there are Nvidia settings to control the x display
somehow

~~~
ComputerGuru
I'm not sure how, but GTK3 applications violate the layers and are not
affected by the nVidia setting. I think it's because nVidia sets a fake Xorg
resolution (e.g. twice your actual) but GTK3 sees through that and uses its
own internal dpi setting. I also presume this would be the case regardless of
whether Gnome 3 is your DE/WM or not.

------
shmerl
_> I don’t really care for wide gamut or even proper color reproduction._

I do. Having irritating colors makes looking at text annoying as well.

------
ipsin
I thought I was doing a champion job of leaving a trail of e-waste behind me,
but here I see I'm not doing _enough_.

------
mv4
I work with documents a lot, and my current preference is two ultra-wide
monitors, allowing me to have 4 windows side-by-side.

------
ericls
I've seen people making great code and great products on a 1280x800 monitor.

And sometimes working with limitations is the fun itself.

------
fortran77
His "research" was a Twitter poll?

But I basically agree. I use a pair of 4K monitors and it works well. I run
the UI at 150%.

------
epx
I opted to spend money on a 34" 21:9 display which is just 1080p in the
vertical axis, and it was not cheap.

------
ausjke
Except that my PC and laptop cannot do 4K/120, and yes, they both are
powerful enough for my daily use.

------
swordsmith
The entire time I'm reading the post, I'm thinking the yellow background
really hurts my eyes...

------
neilobremski
And I suppose writers must have a Moleskine notebook and use clear penmanship?
Hmm ...

------
pabs3
Why does user interface rendering care about pixels and DPI instead of field
of view?
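
To make that concrete: the same panel delivers a very different angular
resolution depending on viewing distance. A quick sketch (the viewing
distances are hypothetical):

    import math

    def pixels_per_degree(ppi, distance_in):
        # pixels subtended by one degree of visual angle at this distance
        return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

    print(pixels_per_degree(163, 24))   # 27" 4K viewed at 24":    ~68 ppd
    print(pixels_per_degree(81.6, 24))  # 27" 1080p viewed at 24": ~34 ppd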

------
alliao
interesting, this must be why animal crossing looks so damn good on my 4k
OLED....

------
yc-kraln
Checking in with 2xWQHD @ 144hz. Definitely invest in a good set of
monitors...

------
HiddenCanary
The author mentions to go for a 120hz monitor. Is a 144hz one also sufficient?

------
holtalanm
well, time to throw my brand-new BenQ 24" 1080p monitor in the garbage, I
guess.

but really....1080p is just fine for looking at text on a screen. Maybe the
reason the text is blurry for you is that _you need glasses_.

------
eqtn
i have a 4k 27" paired with a thinkpad. On ubuntu with gnome there is a huge
performance dip on x11 with fractional scaling turned on. So i use this at 1x
with font scaling of 1.25x.

------
arpa
However did we survive programming on CRTs displaying text at 80x25?

------
bfung
Longest ad I’ve read in a long time! Well done and done right! =)

------
delduca

        document.body.style.background = '#fff'

------
gowld
This person recommends 3 monitors, NONE of which appear to offer adjustments
that are critical for ergonomic health. I don't find this perspective
trustworthy.

I'm not going to sacrifice my spine for slightly fancier characters.

------
easytiger
Was most productive working on a 800x600 15" CRT.

------
pmarin
I prefer the low cost solution of using pixel fonts.

------
aizatto
I disagree with this article.

Higher resolution (to an extent) allows me to see more stuff.

4K on a 27" doesn't work for my workflow.

My workflow on a laptop is to put two windows side by side so that I can
compare things/read documents/code against documentation.

Unfortunately most websites nowadays are designed for widths of 1280px, and
anything less than that is sometimes treated as mobile or given a terrible
responsive design. It frustrates me endlessly.

On a 16" MBPr I can get 2048x1280, which allows me to put up two windows side
by side at 1024px, which does work for most sites, but sometimes there are
some sites which are just broken.

A big reason why I need larger windows is doing side-by-side PR reviews on
GitHub: 1024px can only show 56 characters, and 1280px can only show 78
characters. Personal opinion: GitHub has too much white space. I tend to have
another window open to compare against documentation/etc.

I've compared it at

[https://github.com/aizatto/character-
length](https://github.com/aizatto/character-length)

[https://github.com/aizatto/character-
length/commit/bae8f00fe...](https://github.com/aizatto/character-
length/commit/bae8f00feda5b832aa6fe162460968d8eaf040a5)

For desktop:

I'm comfortable with 27" 2560x1440. Two windows side by side at 1280px. Pixel
pitch is 0.2331mm.

Using 4k at 27" is too small for me. Pixel pitch is 0.1554mm . Without the
correct dongle/cable MacBooks can't even push at 4k 60hz via HDMI. Tip: use
DisplayPort.

I've settled on 38" 3840x1600 because it allows me to setup three windows side
by side at 1280. Which works great for my workflow.

Which sometimes tends to be:

\- 1 window for source material

\- 1 window for my main focus

\- 1 window for comparison material

38" 3840x1600 pixel pitch is 0.229mm.

I settled on the Dell U3818DW because it can charge my MBP via USB-C. My only
recommendation: if your laptop is asleep, don't leave it plugged in. The
monitor likes to wake it.

The 1600 also gives me some extra room vertically.

If you can afford it, I'd recommend giving 3840x1600 a try.

A list of such monitors
[https://pcpartpicker.com/products/monitor/#r=384001600](https://pcpartpicker.com/products/monitor/#r=384001600)

------
WrtCdEvrydy
Is it wrong that I'm at 720p 120hz still?

------
bovermyer
The author is very clearly not a gamer.

------
nothal
The article itself is interesting but I couldn't stop snickering after the
author cited a Twitter poll as their research.

------
noja
Ultrawide beats 4k for me.

------
rkagerer
TLDR: Buy a high-end gaming monitor, and plug into GPU with modern
connectivity... to make your fonts look better.

------
M5x7wI3CmbEem10
e-ink monitors, anyone? saves your eyes in the long-run

------
AnthonBerg
OCD.

------
jrockway
As an early adopter of monitor technology, I have to say that I consistently
regret it.

I got one of the first 4k mainstream monitors. I paid $3000 for it. This was
back in the day when DisplayPort didn't really "do" 4k, so it was done by
pretending it was two monitors internally. This broke EVERYTHING. For years, I
struggled with Linux trying to treat two monitors as one big one (and putting
new windows right on the border). Welp, they fixed that. But trying to treat
two monitors as though it's one was completely impossible. I eventually got it
to work by enabling a bunch of random features in the nVidia driver, that when
enabled together triggered a bug that broke Xrandr, so everything thought I
just had one monitor. (I could not, of course, add a second monitor.)
Miraculously, they never fixed that bug. It worked for half a decade at least.
(At some point in there I switched to Windows, which of course supported it
perfectly because the driver was specifically hacked to detect that model
number and do extra stuff.)

Several years later, I wanted to get a monitor that supported more colors than
sRGB. Big mistake! While inexpensive, I learned that NOTHING supports color
spaces correctly. The Adobe apps do, but that's about it. Online image sharing
services go out of their way to MODIFY the color space that you tag an image
with, so there is no hope of anything ever showing the right colors unless you
manually clip them to sRGB. Things like the Win32 API, CSS, etc. have no way
to say what color space a color is encoded in, so there is no way to make the
operating system display the right color. ("background-color: #abcdef" just
means "set the color on the user's display to #abcdef", which is a completely
meaningless thing to do unless your working colorspace is sRGB, and the user's
monitor works in sRGB. It worked for years, but was never correct.) The worst
thing is, nobody appears to care. ("It just makes colors more vibrant!"
they'll tell you) Big mistake. Do not buy unless you never want to see a color
as the author intended ever again. (I solved my photography colorspace problem
by switching to black and white film. Take that, colors! You can't display
them incorrectly if there aren't any!)

The next thing I jumped on is high refresh rate. I waited until 144Hz IPS
panels were affordable, and got one. It sure is better than 60Hz, which looks
like a slideshow, but there are of course problems. The first is... it is
pretty optimistic to think that an IPS panel will actually update at 144Hz.
They do not. The result is blur. I run mine at 120Hz with ULMB (which
basically strobes the backlight at the display update rate). That looks really
good. There are some artifacts caused by the IPS display, and 120Hz is
noticeably slow, but moving things sure are clear. You can pan a google map
and read the labels as it moves. Try that right now on your 60Hz display, you
can't do it!

But because of IPS, at 144Hz without ULMB, you get a smooth mush. At 120Hz
with ULMB, you can read moving content (like player names attached to people
in an FPS, it is trippy the first time you use it). Having said that, it's bad
for anything that doesn't render at 120Hz. Web browsers, games, CAD... great!
Videos... AWFUL, just awful. On a 30/24Hz video, frames get strobed 4 or 5
times, and this causes your brain to think "hey, a slide show". (You can
record the display with a high speed camera, and it will look completely
different from what you see in real life. Darn brain, always messing things
up.) Things like pans skip and jerk, as your brain tries to interpret the
video stream as a series of <image 1> <black screen> <image 1> <black screen>
<image 2> <black screen> ... instead of a smooth blend of <image 1> <image 2>
... You can post-process the video to "invent" frames in the middle, so your
monitor displays new image data each time it strobes the backlight. I do this
with mpv and it looks great. But if you watch video in a browser, you are out
of luck.
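
The strobe arithmetic, as a quick sketch of why low-fps video reads as a
slide show on a strobed backlight:

    # Flashes shown per video frame on a 120Hz strobed (ULMB) backlight
    STROBE_HZ = 120
    for fps in (24, 30, 60, 120):
        print(fps, "fps ->", STROBE_HZ / fps, "flashes per frame")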

My TL;DR here is that buying any sort of fancy monitor is just going to make
you very unhappy. You will learn everything in the world there is to know
about color space math, pixel transition times, using high speed cameras to
debug issues (what a time sink), how your brain processes moving images, etc.
It won't make you any happier. It won't make you better at programming.

If you play competitive games, get a TN 1080p 240Hz monitor, simply because
that's what everyone else uses. Don't use it for anything except the game,
because every second that you use it it will make you unhappy. But it's
absolutely a joy to play a game on it. (Why 1080p and not 1440p? Guess who
bought a 1440p 165Hz monitor. Not anyone that has ever contributed code to the
game or played the game at a professional level. But I did! Guess who gets to
live with the bugs.)

If you are a programmer, just buy whatever. Every single monitor ever designed
will make you unhappy.

If you are a programmer who works with color, get yourself a good therapist.
You will be meeting with them on a daily basis, and even then, you'll still be
scarred for life. It's all about damage control at this point.

------
prepperpotts
My productivity has skyrocketed after our company moved remote, I think in
large part because my home monitor is good and my office monitor was garbage.
Staring at blurry text all day messes with your brain a lot more than you
realize.

------
growlist
I dislike anti-aliasing, and far prefer a pixel font (or whatever the correct
term is) like Terminus:

[https://files.ax86.net/terminus-ttf/](https://files.ax86.net/terminus-ttf/)

------
the_cat_kittles
I got two of these for ~550. I really don't know what more you could want if
you aren't editing videos or photos or whatever:
[https://www.ebay.com/itm/Acer-KG1-28-Gaming-
Monitor-4K-3840x...](https://www.ebay.com/itm/Acer-KG1-28-Gaming-
Monitor-4K-3840x2160-1ms-GTG-60-Hz-330nit-TN-AMD-Free-
Sync/264481989120?ssPageName=STRK%3AMEBIDX%3AIT&_trksid=p2057872.m2749.l2649)

------
awtawtwat3aw
lol @ monitors, new OLED TVs have HDMI 2.1, barely any bezel, and way better
response/picture than any "monitor".

Burn-in isn't the issue it once was.

It will take years for people to realize this, though. Feels good to live on
the edge.

~~~
Tepix
They are really large, however.

I'd like to see a 32 inch 6k OLED TV.

