Time to Upgrade Your Monitor (tonsky.me)
1220 points by neonbones 26 days ago | 1012 comments



The technical details are all right (or seem right to me, anyway), but this is too opinionated for my liking. No, you do not need a 4K monitor for software development. Some people might like them, some won't. [edit/clarification: someone rightfully pointed out that nobody will actively dislike a 4K monitor. I was unclear here: I meant "some people won't need them" more than "dislike them"]

This sounds like when Jeff Atwood started that fad that if you didn't have three external monitors (yes, three) then your setup was suboptimal and you should be ashamed of yourself.

No. Just no. The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment. Now I'm typing this on one of these displays, and it's terrible and I hate it (I usually use an external 1080p monitor), but it's also no big deal.

A 1080p monitor is enough for me. I don't need a 4K monitor. I like how it renders the font. We can argue all day about clear font rendering techniques and whatnot, but if it looks good enough for me and many others, why bother?


Hello! Person who actively dislikes 4k here. In my experience:

1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS, that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display).

2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.

3. Configuring my preferred Linux environment to work with 4k is either impossible or just super time consuming. I use i3, and it adds way more productivity to my workflow than "my fonts are almost imperceptibly sharper" ever could.

My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density. I also have 20/20 vision as of the last time I was tested.

My argument in favor of 1080p is that I find text to just be... completely readable. At various sizes, in various fonts, whatever syntax highlighting colors you want to use. Can you see the pixels in the font on my 24" 1080p monitor if you put your face 3" from the screen? Absolutely. Do I notice them day to day? Absolutely not.

I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.


> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode

You lost me right here on line 1.

If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.

> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.

Things like this are exactly why I left Linux for MacOS. I absolutely get why you might want to stick with Linux, but this is a Linux + HighDPI issue (maybe a Windows + highDPI issue also), not a general case.

> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required.

You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.


> You lost me right here on line 1.

I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.

> You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.

I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
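The acuity argument can be checked with a little arithmetic: 20/20 vision resolves roughly one arcminute, so the question is how large a pixel looks at a given panel size and distance. A quick sketch (the panel sizes and the one-metre viewing distance are taken from the comments above):

```python
import math

def pixel_arcmin(diag_in, h_px, v_px, distance_in):
    """Angle in arcminutes subtended by one pixel at a given viewing distance."""
    width_in = diag_in * h_px / math.hypot(h_px, v_px)  # physical panel width
    pitch_in = width_in / h_px                          # width of a single pixel
    return math.degrees(math.atan(pitch_in / distance_in)) * 60

# 27" 1080p from ~1 m (39.4"): each pixel spans about 1.07 arcmin, right at
# the ~1 arcmin limit of 20/20 vision. The same panel at 4K halves that.
print(round(pixel_arcmin(27, 1920, 1080, 39.4), 2))  # 1.07
print(round(pixel_arcmin(27, 3840, 2160, 39.4), 2))  # 0.53
```

By this estimate a 27" 1080p panel at a metre sits right at the threshold, which is consistent with both sides of this sub-thread: visible if you look for pixels, borderline if you don't.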


> I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!

Are you sure you have 20/20 vision? I can absolutely resolve individual pixels with zero effort whatsoever on 1080p 27-inch displays.

Back when I had a 27-inch 1080p display at work, my MacBook's 13-inch Retina display effectively became my main monitor. The 27-inch monitor was relegated to displaying documentation and secondary content, because I found its low resolution really eye-straining.

Edit: I might have found it so eye-straining because MacOS does not support subpixel rendering. That means a lot of people will need a 4K or Retina monitor to have a comfortable viewing experience on the Mac.


MacOS does support subpixel rendering, and has at least since the early-to-mid 2000s. A version or two back, though, they turned it off by default, since it isn't necessary on HiDPI "Retina" displays and they only ship HiDPI displays now.

You can still turn it on although it requires the command line.
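For reference, these are the commonly cited commands; a caveat: on Mojave and later they reportedly only adjust grayscale smoothing strength rather than restoring true subpixel rendering, and exact behavior varies by macOS version (log out and back in for them to take effect):

```
# Turn font smoothing back on globally
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO

# Smoothing strength: 0 = off, 1 = light, 2 = medium, 3 = strong
defaults -currentHost write -g AppleFontSmoothing -int 2
```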


Subpixel rendering dramatically slows down rendering text. When you have a high res screen, and want everything to be 120fps, even text rendering starts to be a bottleneck.

That combined with the fairly massive software complexity of subpixel rendering is probably why mac dropped it.


Been a while since my eyesight was tested, but I think so! I can see pixels if I focus, but not when reading text at any speed. I have also checked and my display is only 24" (could've sworn it was more!) so maybe that's why. I retract my comment :)


> I think the point the parent is making is that human vision has limited resolution.

If you can't see the difference between 4k and 1080p on a 24" monitor, then you probably need reading glasses. On a 27" monitor it's even worse. It's not so much that you can "see" the pixels; subpixel rendering and anti-aliasing go a long way toward making the actual blocky pixels go away. The difference is crisp letters versus blurry ones.


Yes, I can see the difference, but I (personally) don't notice that difference while reading. I do notice a big difference when using older monitors with lower DPI compared with 1080p on a normal-sized desk monitor, however.


On Windows blurry letters depend on the font: some fonts have blurry letters, others have crisp letters.


> then you probably need reading glasses

Maybe, just maybe, one could talk about how crisp text appears on 4k _without_ being rude.


As a glasses-wearer, this does not seem rude to me.


> I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.

Haven't seen any scaling issues on Windows in years. Last time was Inkscape but they fixed that.


I see these issues all the time, with enterprise desktop apps. The scaling is only really a problem because it is enabled by default when you plug in certain displays. If the user made a conscious choice (which they would easily remember if they had trouble), it would be fine.


For many, many years, monitors were at most 120 dpi, with almost all at 96, and I imagine a lot of enterprise applications have those two values (maybe 72 as well) hard-coded and don't behave properly with anything else.

I know my company's ones do.
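That failure mode is easy to sketch with a hypothetical point-to-pixel helper (the names are illustrative, and the 163 dpi figure below is roughly a 27" 4K panel):

```python
BASELINE_DPI = 96  # the value these apps bake in

def pt_to_px_hardcoded(points):
    # Classic enterprise-app assumption: every monitor is 96 dpi.
    return round(points * BASELINE_DPI / 72)

def pt_to_px(points, dpi):
    # DPI-aware version: query the display instead of assuming.
    return round(points * dpi / 72)

# 10 pt text on a ~163 dpi 27" 4K panel: the hardcoded version draws
# 13 px glyphs where about 23 px are needed, i.e. text renders tiny.
print(pt_to_px_hardcoded(10), pt_to_px(10, 163))  # 13 23
```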


I'm currently working from home, accessing my Windows 10 desktop machine in the office via Microsoft's own Remote Desktop over a VPN connection. This works fine on my old 1920x1280 17" laptop, but connecting from my new 4k 15" laptop runs into quite a few edge cases, and plugging in an external non-4k monitor has led to at least two unworkable situations.

I've now reverted to RDP-ing from my old laptop, and using the newer one for video calls, scrum boards, Spotify and other stuff that doesn't require a VPN connection or access to my dev machine. It mostly works OK in that configuration.

I've seen other weird things happen when using other Terminal Services clients, though.


> Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.

Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering - with fonts being a blurry mess. You can only really use MacOS with high DPI monitors now for all day working. It’s a huge problem for everyone I know who wants to plug their MacBook into a normal DPI display. Not that the subpixel/hinting was ever that good - Linux has always had much better font rendering in my opinion across a wider range of displays.


> Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering

Nonsense, fonts look fine on non-Retina monitors; they were fine on my old 24" 1920x1200 monitor and are fine on my new 27" 2560x1440 one. Can I see a difference if I drag window from the external monitor to the built-in Retina display? Yes, but text is not blurry at all on the external monitor.

If it matters, "Use font smoothing when available" is checked in System Preferences (which only appears to have an effect on the Retina display, not the monitor).


That's been my experience, too. I prefer high-DPI monitors, but back when I was going into the office (remember going into the office?) and connecting my MacBook to a 1920x1200 monitor, text was perfectly readable. I suppose if I had two low-DPI Macs, one running Catalina and one running, I don't know, High Sierra, I might be able to tell the difference at smaller font sizes.

As an aside, I wonder whether the article's explanation of how font hinting works -- I confess for all these years I didn't know the point of "hinting" was to make sure that fonts lined up with a rasterized grid! -- explains why I always found fonts to look a little worse on Windows than MacOS. Not less legible -- arguably hinted fonts are less "fuzzy" than non-hinted fonts on lower-resolution screens, which (I presume) is what people who prefer hinted fonts prefer about them -- but just a little off at smaller sizes. The answer is because they literally are off at smaller sizes.


These things are fairly subjective. But it’s hard to argue that Catalina has good font rendering on regular DPI screens. I dealt with it when I had to, but it was very poor. There are also tons of bugs around it. Like the chroma issue - Apple doesn’t support EDID correctly so fonts look even more terrible on some screens. A google search will confirm these problems.


This is an interesting position. I have always thought that font and font rendering were always an especially pernicious issue with Linux and a relative joy on MacOS?


As with most things Apple, it is a joy as long as you restrict yourself to only plugging the device into official Apple peripherals, preferably ones that are available to buy right now. It’s when you start hooking your Mac up to old hardware or random commodity hardware that the problems surface.


I think that is a historical artifact. Ubuntu had a set of patches for freetype called Infinality, developed in the early 2010s, which dramatically improved font rendering. Since then, most of those improvements have been adopted and improved upstream. [1] Any relatively modern Linux desktop should have very good font rendering.

[1]: https://www.freetype.org/freetype2/docs/subpixel-hinting.htm...


Except for the hidpi inconsistency across apps and window managers


I recently started using Linux some on the same 4K monitor I usually have my Mac connected to. I was shocked at how much sharper and easier to read the text was on Linux.


It's pretty straightforward to re-enable subpixel rendering.


I actually agree that most modern Linux DEs have better rendering, especially since you can configure how much hinting and anti-aliasing you want.


I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps. I'm also surprised when my wine programs scale properly as well.

What does not work perfectly is mixing hidpi and lowdpi screens. On Wayland with Wayland-compatible apps it works fine, but on X11, or with XWayland apps like Electron, a window will not scale properly when you move it to the other screen: it scales for one screen and looks wrong when moved over. Overall I don't find this to be too much of an issue, and when Chrome gets proper Wayland support the problem will be 99% solved.


I can confirm this anecdata. Single and dual 27" 4k is fine, but mixing with a 27" 1440p is messy (tried with GNOME and KDE on Manjaro during early Corona home office).


> I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps.

It's good to hear things aren't as bad as some have suggested.


I think it was bad: looking through the issue trackers, a lot of hidpi bugs were closed less than a year ago. But I have not really noticed much beyond what I noted about multiple monitors.


Agree that I've never had DPI scaling issues with macOS. But Windows? All the damn time! They still haven't got that figured out.


> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.

PyCharm had high CPU consumption issues with a Macbook connected to a 4k display running in a scaled resolution. Native 4k was fine, using the default resolution was fine but "more space" made it use tons of CPU since it had to rescale text and UI elements on the CPU.


I think all the JetBrains tools did. I remember a few years ago there was some big switch-over (High Sierra maybe?) and the JetBrains tool fonts were janky for a while; something to do with the bundled JVM and font rendering. I think it's sorted now, but there was an 'issue' there for a while. Maybe it's still an issue in some configurations?


Still an issue - the scale factor is 2.0 or 1.0 by default depending on whether the display is HiDPI/Retina or not.

There is an option to change the scale manually (-Dsun.java2d.uiScale=<floating point value> in the JVM args), but I don’t know if it helps or not.
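If anyone wants to try it: the flag goes in the IDE's VM options file, and a fractional value is exactly what the 1.0/2.0 default heuristic won't pick on its own. The 1.5 below is only an illustrative value:

```
# idea64.vmoptions (open via Help > Edit Custom VM Options in the IDE)
-Dsun.java2d.uiScale=1.5
```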


1 is definitely an issue for me.

It's why I actively avoid monitors with small pixels. My trusty old Dell U3011 and the two rotated 1600x1200's flanking it suit me just fine.

I've no inclination to change my OS, either, just for the sake of fonts.

Kudos to those commenters still on CRT displays. One complaint I have with LCD's is reset lag time, which can make it tricky to catch early BIOS messages.


> BIOS messages

I think the problem is right there. You’re living in the past so you get the past’s problem.

I haven’t seen a BIOS message in a decade.

I’m sure those U3011 are beautiful and hard to let go (I used to have a U27) but it’s also hard to look at a low-dpi screen once you’ve seen the light


Well, my machine is about a decade old. (Many upgrades along the way, though, and holds its own just fine)

BTW what's wrong with BIOS or UEFI messages? I much prefer them to cosmetic boot graphics.


Modern computers boot so fast that you can't really see those.


Sure. The fastest I've seen lately is that ARMv7-A board submitted the other day which boots in 0.37 seconds, or with networking, 2.2 seconds. That's time to user land, and to achieve it took highly specialized firmware and a stripped down kernel compiled with unusual options. I've yet to see a PC come anywhere close to that, and personally I won't consider the boot problem solved until a cold one completes quicker than I can turn on a lightbulb.

In fact historically, some of the higher-end hardware yielding the best performance during operation (an axis along which I optimize) actually added time to the boot sequence. The storage subsystem on my workstation is backed by a mix of four Intel enterprise-grade SSD's in RAID-0 (raw speed) and 8 big spinning platters in RAID-6 (capacity), plugged into an Areca 1882ix RAID card w/ 4GB dedicated BBU cache. Unfortunately that card adds a non-bypassable 30 seconds to the boot sequence, no matter what system you plug it into. But once there, it screams. It's only just the last couple years PCI-NVMe drives have come out that can match (or finally beat) the performance metrics I've been hitting for ages.

So I actually kind of feel like I've been living in the future, and the rest of the world just caught up ;-).


> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).

Same. I haven’t run into any apps that don’t support high dpi mode. Even terminal apps look great on my Retina 4k iMac screen.

Before getting this machine nearly a year ago, I couldn't natively view high dpi graphics for web projects I’d work on, which was a problem since there are billions of high dpi devices out there.

Added bonus: wide gamut is a thing and there's increasing support for this in browsers, including new color spaces: https://lea.verou.me/2020/04/lch-colors-in-css-what-why-and-...


> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).

The version of pgAdmin not based on electron had pixelated fonts on a 5k iMac. I haven't checked recently.


> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them

Audacity has extremely low framerates and chugging when interacting with the waveform on retina screens. Even running in low DPI mode doesn't fix it. Only runs nice on non-retina displays.


Thank you for posting critical feedback. People have a tendency to defend their choices, suppressing these kinds of comments, even when valid.


Agreed. I have a retina mbp that I use as the 3rd screen plugged into a 4K monitor over usbc and a 1920x1200 over dp. Everything works fine. Windows stay in the right place when plugging/unplugging, etc... My eyes also thank me every time I look at the 4K text on my main monitor. I’m debating buying another one and dumping the 1920 monitor.


My MBP doesn't scale properly on my ultrawide 3840x1080. The whole OS makes things too big.


Hmm, maybe you haven't, but I have.


There are platforms aside from MacOS, and developers need to use them. It's great that you can, but that's not everyone.


> There are platforms aside from MacOS

The post I quoted was: "No matter what operating system you're on". I quoted that particular bit for a reason.

The ergonomics of better screen resolution don't change just because your OS isn't good at dealing with high resolution.

When you pick an operating system (if you have the choice), there are a lot of factors, it's good to know what's important to you and choose appropriately.


That was a direct reply to the “No matter what operating system you're on” quote. The fact that there are some operating systems that have issues doesn’t make it true that all do.


> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them

It's almost as if there's more operating systems than MacOS...


FWIW, my main monitor is a 43" 4k display, and it works perfectly fine on AwesomeWM, but I don't use any scaling; the 4k is purely for more screen real estate. It's literally like having 4 perfectly aligned borderless 24" monitors: I can fit 10 full A4 pages of text simultaneously.


I recently upgraded to a 43” 4k monitor and use it the way you describe. I’m not sure I am happy with it. The real estate is nice but it might be too much. UI elements end up very far away. I rarely need all that space.

I either need a bigger (deeper) desk to sit back farther or just a smaller monitor physically with the same resolution.


What I found helped when I moved to a large 4k screen is when I stopped trying to eke out every last bit of space and started using my desktop as an actual desktop analogue again. Whereas I used to full-screen everything and snap to halves or quarters of the screen, now I have a few core apps open that take up roughly a quarter, usually a browser and email, and other apps I open up and move around to organize as I feel the particular task warrants (generally some terminals that are all fully visible). I often drag the current thing I'm working on to the bottom half of the screen so it's slightly closer and easier to see directly, and leave reference items or stuff I'm planning on revisiting shortly up top.

I also was thinking I didn't particularly like the large desktop and screen at first, but now that I treat it as a combined wallboard and desk space, I can't imagine going back (and using the large but not quite 4k monitor at my desk at the office always felt like a step backwards).

I do set default scaling for Firefox and Thunderbird to be about 125% of normal though, as I don't like squinting at small text. I generally like how small all the other OS widgets are though, so I don't scale the whole desktop.


Thanks for the input. I’m about a week in and that is the realization I am reaching. My window sizes are almost back to where they were before.

It has been an iterative process but I’m getting a handle on it.

I have a sit/stand desk; I’m considering moving my recliner into the office and elevating the monitor. Working from a reclined position seems ideal, but I also thought a 43” monitor was a good idea...


If you're on Windows, consider the fancyzones power toy. Break up your screens into an arbitrary set of maximizable zones. Highly recommend if the normal 4 corners isn't enough for you.

https://github.com/microsoft/PowerToys/tree/master/src/modul...


I'm on MacOS. I basically never maximize anything. I just resize the floating windows and put them where they make sense at the time. The dream would be doing this with the keyboard using something like i3. I have been tinkering with that on weekends but M-F I focus on paying work.


Check out Moom. You can save window layouts and replay, save window positions and apply them to any app, arbitrarily move, maximize to a range with a customizable margin, etc etc. https://manytricks.com/moom/


Does anyone know if there is something similar for Linux?


Depending on how exact of a similarity you're looking for, gTile for GNOME/Cinnamon might be of interest to you. I've also found PaperWM to be very productive.


I use gTile for gnome. I had to play around to find a setup that I like. I eventually settled on ignoring most of the features that were offered out of box. Now I have configured a few simple keyboard shortcuts.

For example, Super + Up Arrow will move a window into the central third section of the display. Pressing it again will expand the window a little bit.

Nice and simple. Makes working with large displays pleasant.


I use Rectangle. I’ve configured it with a few keyboard shortcuts that let me move a window into specific regions on the display. I use it to quickly have multiple non-overlapping windows.

I cannot imagine using a large display without it!


BetterTouchTool is _excellent_ for window placement/resizing; it can also be triggered externally if you want to combine it with Alfred or Karabiner Elements (though you don't have to, you can define the triggers in BTT itself).


The 4K monitor at my office is a fair bit smaller than that, and I haven't used it since the COVID-19 epidemic sent me packing home. Since then I've just been using my laptop's built-in screen.

To be honest, I think I may stick with it. At first, the huge monitor was fun, and the initial change to having less screen real estate was definitely a drag. But now that I'm accustomed to it again, I'm finding that "I can fit less stuff on the screen at once" is just another way of saying "it's harder to distract myself with extra stuff on the screen." My productivity is possibly up, and certainly no worse.


A major pain (literally) point for me with laptop screens is posture. My neck aches after a day of looking mostly down. I suppose an external keyboard and mouse would help but I would have to get a stand and blah blah.

Also for my particular workload real estate is very handy. I totally agree with there being some virtue to constraints but several times a day I really need the space.


That's a fair point. I do place my laptop on a stand and use an external keyboard and mouse, even when I'm working from home.


I think this really depends on the work you do, also. Pure development or content creation and I'm good with just a laptop. For research, with team communication, concurrent terminal sessions, debugging, management - I really do want at least 3 screens.


This

Pretty much no mainstream UX has properly solved the large-screen problem, so instead we work around it with multiple smaller monitors.


32” 4K is perfection, right in the sweet spot of size.


This. After doing some research, as far as I can tell a 32" 4K maximizes the amount of content you can see at one time within a comfortable viewing angle and without needing scaling to make text readable.

At typical desk monitor distances you shouldn't be able to see distinct pixels anyway.


This is the correct answer.


Agreed. My work monitor is a 32 or 34" ultrawide. It works well but I would really like more vertical real estate. I'm definitely shopping for 32-34" 4k displays right now.


3840x2880 34" would be so nice.


I've a 32" @ 3840x2160, and it's a blessing. I do coding and product design; it's my best tech acquisition in the last few years: https://www.benq.eu/en-eu/monitor/designer/pd3200u.html


That looks awesome. Pretty expensive from what I see after a quick search though. Does it have USB-C charging?

My previous monitor iterations have been LGs and I really like the USB-C charging capability, KVM and the OSD/buttons.

Got one of these in my cart right now: https://smile.amazon.com/dp/B07NS7JKJH

I'm gonna sleep on it and double check the return policy on this 43".


Looks nice also, but I already have a 4k 32" Eizo Flexscan that I'm happy with - I'm after the 4:3 aspect ratio, everything just seems to go wider and wider these days.


Vertical space is the reason I haven't upgraded from dual 1920x1200 (though most of the time I use a single one anyway). Although I'm looking at 1440p at 30"+, mainly for 120Hz+.


43" 4k masterrace here - I beg to disagree.


43" seems rather large--how far away do you sit? If it were as close as a more "normal" sized monitor (~2-3 feet), wouldn't you be craning your neck all day trying to see different parts of the screen?


Nah. I have a 49" curved 1440p monitor. Things you look at less often go to the sides. You can fit 4 reasonably sized windows side by side. The code editor holds over 100 columns at a comfortable font size for my 40-year-old eyes. It's the best monitor setup I have ever had. You can spend less and get the exact same real estate with two 27" 1440p monitors. Either way, it is a fine amount of real estate and not at all cumbersome for all-day use in my case.


I am getting the same Dell 4919DW monitor, transitioning from two 25" Dell monitors. I think the built-in KVM will be a great addition as I have two workstations. Ordered the new Dell 7750 to pair with a WD19DC docking station. I hope the Intel 630 UHD built-in graphics will do, as stated in the knowledge base. The 4919DW only has a 60Hz refresh rate, but I am not concerned about that. A great alternative would be the curved Samsung 49" C49RG9 at 120Hz.


Which model is your 49"?


https://www.dell.com/en-us/work/shop/dell-ultrasharp-49-curv...

It’s quite a piece of hardware. It lets me plug a USB hub into it. It supports USB-C for its display adapter. So it’s the dream: one USB-C cable to charge the laptop, drive the display, and provide a USB hub for keyboard, mouse, etc.


What kind of workstation are you using it with?


I'm in the same boat. More real estate is the big win. I made a pandemic purchase of a TCL 43" 4k TV to use as a monitor primarily for programming. I sit a bit further from it: 30" rather than 24ish when working on the laptop. I drive it with a 2019 inexpensive Acer laptop running Ubuntu 20.04 and xfce. Every so often an update kills xWindows, but I can start it in safe mode and get things working.

I do find my head is on a swivel comparatively, but it's noticeable without being a negative. Overall I like it. A lot. The only thing that is painful is sharing the desktop over Webex/Skype. That does bog the system down and requires manually inflating the font size so that viewers on lower-resolution systems can cope with it.


I am somewhere in between. I don't go for hi-DPI but am using 28" 4K on the desktop and 14" 1080p on my laptops. So identical dot pitch and scaling settings. I just have more display area for more windows, exactly as you say like a 2x2 seamless array of screens.

I actually evolved my office setup from dual 24" 1920x1200 and went to dual 28" 4K. But with the COVID lockdown, I only have one of the same spec monitor at home for several months, and realize that I barely miss the second monitor. I was probably only using 1.25 monitors in practice as the real estate is vast.

People who complain that a monitor is too large should stop opening a single window full-screen and discover what it is like to have a windowing system...


I have found DisplayFusion and now PowerToys' FancyZones to be indispensable on Windows desktops with a ton of real estate.


In the same boat here. I use a $400 49 inch curved 4k TV as my monitor along with i3wm and while I waste a lot of the space on screen to perpetually open apps I don't touch, having the ability to look at my todo list or every app I need for a project at the same time has its benefits. I just wish I could lower the height and tilt the TV upwards a bit so I'm not breaking my neck looking at the upper windows.


I had a similar setup at a previous job -- one of the early 39" TVs. It could only drive 4k at 30Hz, but for staring at text, nothing could beat it. It takes a good tiling window manager to get the most out of this setup. By the same token, a good tiling WM also makes a tiny little netbook screen feel much bigger. So I guess what I'm really saying is, use a tiling WM!


43" 4k is approximately 100dpi, like 21" 2k. It seems like a reasonable form factor to me (at 1x), but there aren't many of them that do high refresh rate, and they're all very expensive.
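Those figures check out; pixel density follows directly from diagonal and resolution:

```python
import math

def ppi(diag_in, h_px, v_px):
    """Pixels per inch from panel diagonal and native resolution."""
    return math.hypot(h_px, v_px) / diag_in

# 43" 4K and 21" 1080p land within a few dpi of each other,
# which is why a 43" 4K panel is comfortable at 1x scaling.
print(round(ppi(43, 3840, 2160)))  # 102
print(round(ppi(21, 1920, 1080)))  # 105
```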


Haven't you effectively escaped the parent's super-hi-dpi issues? I think he is referring to the use of smaller screens at 4k.

Interesting idea though, just effectively having a massive monitor.


Which display in particular are you recommending, and what is the latency like?


I've only seriously tested the Dell P4317Q that I have in the office. Others have had good success with small 4k TVs. Can't say I've noticed anything about the latency, but I've never gamed or watched movies on it, so IDK


I use Samsung 4K TVs (55in and 43in) at work and home and the experience is absolutely fantastic. In game mode the latency is reported to be 11ms and there's no difference visible to me compared to 60Hz computer monitors.


Do you have the specific model numbers? I'm buying a new monitor soon, and considering using a 4K TV. Would be great to check out the ones you're using!


They're both RU7100. It may be called something else depending on where you are / may be replaced with newer models.

It basically works out as 4 seamless 1080p monitors, and I use i3 for a 3x2 window layout


If you are on macOS, all is good. Never had a problem with any of my 4 monitors (3x4K, 1x5K). I set the scaling to a size I like, and the text is super crisp. I don't see how any programmer can NOT like that.


How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.

In my experience MacOS multi-monitor support is effectively non-existent.

Recently I picked up a 49” ultra-ultra-wide monitor (basically 2x27” panels). It is one monitor but MacOS can't drive it. It just doesn't detect that resolution. I switched to a 43” 4K monitor (technically more pixels) and MacOS drives it fine.

My experience with MacOS is not “it just works” unless you are doing something Apple already predicted. That’s fine for me, I just wish they still sold a reasonable monitor themselves so I could be assured it would work properly.


> How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.

I finally got sick of this and wrote a Hammerspoon script to deal with this. The config looks like this:

      -- columns: {app name, window title, screen, layout, frame rect, full-frame rect}
      local dualLayout = {
          {"Firefox",       nil,         primary,   topLeft,             nil, nil},
          {"Preview",       nil,         primary,   topLeft,             nil, nil},
          {"Sublime Text",  nil,         secondary, topHalf,             nil, nil},
          {"Slack",         nil,         secondary, topRight,            nil, nil},
          {"Notes",         nil,         secondary, topRight,            nil, nil},
          {"iTerm2",        nil,         secondary, topRight,            nil, nil},
          {"Safari",        nil,         secondary, topRight,            nil, nil},
          {"IntelliJ IDEA", "cursive",   primary,   hs.layout.maximized, nil, nil},
          {"IntelliJ IDEA", "community", secondary, hs.layout.maximized, nil, nil},
      }
Now when I connect my two external monitors, a quick Ctrl-Alt-Cmd-R lays everything out. It's crap that I have to, but it saved my sanity.


This is really cool, thanks.

I tend to have one space per task and have at least a Firefox and an iTerm on every one.

Just running a single monitor is the way to go for me. The benefits of multiple monitors just doesn't make up for all the fiddling.


> every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first

Maybe we can get to the bottom of this. What is your use case?

I ask because as long as I plug them into the same ports it remembers how I arranged them previously (2018 macbook pro 15"). I haven't had to arrange them in over a year... even remembered when updating to latest operating system. Occasionally, I even plug in my LCD TV as a third external monitor and it remembers where that one should go in the arrangement too.


MacOS cannot drive one 5120x1440 display using Intel display hardware. It will happily drive two displays at 2560x1440. The monitor had multiple inputs, so by putting it in PBP mode I was able to drive one input as USB-C and another as HDMI through a dock converter. This means the wakeup was not in sync. MacOS would see one monitor, arrange everything on that, then realize there was a second one and fail to move anything back in this "new" arrangement.

The fact that it was all one physical monitor may have further confused the OS as a sibling comment mentions.

The solution was to sell the monitor to a Windows-using architect friend and buy a different panel with a resolution MacOS supports. She has a macbook too but it's the fancy one with discrete graphics which can drive 5120x1440.

The value proposition of MacOS to me is that I plug things in and they work. Any fiddling beyond that destroys the benefits of using this platform. I'm willing to iterate on hardware until I find something that works.


> MacOS cannot drive one 5120x1440 display using Intel display hardware.

For other readers, this is not technically correct. The 2020 13” MacBook Pro can drive the Pro Display XDR with its integrated Intel graphics.


I do not have a 2020 MacBook so I cannot test but the Pro Display XDR is not 5120x1440, it is 6016x3384. The problem with my current MacBooks ('14 15" RMBP and '17 13" MBP, both with Intel Iris graphics) is that while they can drive 4k displays they cannot drive the 5120x1440 resolution specifically.

This limitation is specific to the MacOS drivers. Windows in Bootcamp is able to drive 5120x1440 on these devices.


It's possible it is just that MacOS doesn't have the EDID information.

https://apple.stackexchange.com/a/221498 and https://apple.stackexchange.com/a/233854 describe how to fix this. It doesn't sound like much fun though.


Yeah I read through all those. It’s a work laptop so I’m not comfortable doing things like disabling SIP or mucking around in any system settings. That machine is my livelihood so I don’t mind finding devices that just work.


My 2018 13-inch MBP is driving a 5120x2160 LG 34WK95U-W just fine, over Thunderbolt 3.


Mine too. That’s a completely different monitor and resolution. The problem is specifically with 5120x1440.


Ah ok, ya maybe it's related to it being the same monitor.

I have two different monitors that wake up at very different speeds and it's no problem here. My 15" 2013 and 2015 macbook pros had no problem with this either, and I've had 4 different monitors in the mix through those years too. I've transitioned to a CalDigit Thunderbolt 3 dock now and still no problem with it remembering.

So there's definitely something unique about that monitor. That is sad news for me too -- I'm hoping they make a 2x4K ultra wide monitor like that someday. Hopefully they've solved this problem by then.


So my 2018 MacBook Air can drive the Apple LG 5K display (lid closed) at 5120 x 2880 at 60Hz.

https://support.apple.com/en-us/HT210205

I currently have the Apple LG 4K but plan to upgrade to the 5K.

I agree with you on the just works thing. I have the same keyboard, trackball, monitor, tbolt dock and external HD on my desk at home and work.


It helped me to check "Displays have separate spaces" in the Mission Control panel. MacOS seems to remember what went where with this checked.


That might work but it breaks my workflow in another way. Physically the display is a single panel. I organize workspaces by task so changing to a new one needs to change "both" panels because I'm actually using them as one.


>How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.

At least for apps that are dedicated to one screen + virtual desktop, right click its icon in the dock and assign it to that display and workspace.

Note that the effectiveness of window restoration also depends on the make/model of your monitors – many manufacturers incorrectly share EDID's across all units of the same model and sometimes across multiple models, making it much more difficult for operating systems to uniquely identify them.
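To illustrate why shared EDIDs are a problem: operating systems typically key saved window arrangements on the manufacturer ID, product code and serial number stored in the monitor's EDID block, and many vendors ship a zeroed or shared serial. A minimal sketch of reading those fields from a standard 128-byte EDID (the sample bytes below are fabricated for illustration):

```python
import struct

def edid_identity(edid: bytes):
    """Extract (manufacturer, product_code, serial) from a 128-byte EDID block."""
    assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "bad EDID header"
    # Manufacturer ID: 16 bits big-endian, packed as three 5-bit letters (1 = 'A')
    mfg_raw = struct.unpack(">H", edid[8:10])[0]
    mfg = "".join(chr(((mfg_raw >> s) & 0x1F) + ord("A") - 1) for s in (10, 5, 0))
    product = struct.unpack("<H", edid[10:12])[0]  # little-endian product code
    serial = struct.unpack("<I", edid[12:16])[0]   # little-endian serial number
    return mfg, product, serial

# A fabricated block: Dell-style vendor ID, some product code, serial zeroed out.
# Two units of the same model with a zeroed serial look identical to the OS.
fake = (b"\x00\xff\xff\xff\xff\xff\xff\x00"  # header
        + b"\x10\xac"                         # 0x10AC -> "DEL"
        + b"\x34\x12"                         # product code 0x1234
        + b"\x00" * 4                         # serial = 0
        + b"\x00" * 112)                      # rest of the 128-byte block
print(edid_identity(fake))  # ('DEL', 4660, 0)
```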


That used to happen occasionally to me as well in earlier macOS versions. Didn't have to do any rearranging since Mojave, I think, definitely not on Catalina.


I use a single 34" 4K monitor with an arm mount on my Mac Mini. The power button on this monitor is one of those touch sensitive ones on the bottom right that I sometimes accidentally brush past. When I switch it back on, every single window gets reduced into an incoherent size and everything gets moved to the top left. It's really annoying.

I'm thinking of flipping my monitor upside down so I'll never accidentally brush that area while picking up something on the table.


That's likely a firmware bug in the monitor. It probably reports some ridiculously small resolution during its boot process, and macOS queries it at that time and rearranges the windows accordingly. macOS could implement workarounds of course, but probably it just follows whatever process the display ID protocol prescribes...


Have you tried https://cordlessdog.com/stay/ ? Was working relatively well on my macbook before I switched to an iMac


I was able to use a 5120x1440 resolution on my MacBook Pro using SwitchResX. No idea why Mac OS doesn’t let you select that resolution.


What kind of MacBook do you have exactly? Year, size, graphics hardware and OS.

Reports I read stated that while you can select it with SwitchResX it was scaled.

I never tried installing it myself because I’m not a fan of modifying the system on a Mac, especially one I don’t own.

From my poking around I think the horizontal resolution is the problem. The system scans possible resolutions to see what works. Apple just never expected a single display that wide.

There’s some reports that newer MacBooks with discrete graphics on Catalina can indeed run this resolution. It used to not work regardless of hardware, now apparently discrete graphics MacBooks can run it. Maybe because they updated the drivers/system for their new super fancy monitors.


You probably have to update the monitor info DB on MacOS. See https://apple.stackexchange.com/a/233854


Text looks similar to Windows and Linux (both of which definitely run at native resolution).

macOS Mojave 10.14.6

MacBook Pro 15” 2019

Radeon Pro 555X

Intel UHD 630


Right, you have discrete graphics. I do not which is why I am stuck with the MacOS driver bug for Intel display hardware.

I’ve seen zero reports of anyone successfully running 5120x1440 using Intel graphics.


To clarify: I mean I have no idea why Mac OS doesn't let ME select that resolution while SwitchResX does.


Go to the Displays panel, switch to the "default for display" option, then switch back to "scaled" while holding down the option key. Do you see that resolution in the list of options?


I’m not sure what SwitchResX actually does. None of the accounts I have seen online are thorough enough to draw any conclusion.

As far as I can tell it is the same as Option-clicking on scaled in the display preferences.


> scaling

I was plagued with performance problems, mostly in IntelliJ, until I realized it didn't work well with scaling. Going to native resolution solved it.


I’d say all is bad with MacOS and external monitors... It can’t manage text scaling like Windows, so you either have to downscale resolution and get everything blurry or keep the ridiculously high native resolution and have everything tiny :(


Is it not visible for you in the displays settings? You DO need all the monitors to have the same DPI or you’d have a window rendered half in one dpi and half in another when dragging across a display boundary.


No, when it’s an external screen I don’t get any scaling options, only the choice of resolution. I have a 24” QHD, so either it’s ridiculously small 2500xSomething or it’s blurry HD :(


I have this problem as well. I actually run my 27-inch 4K screens downscaled on MacOS because the tiny font at native-4K gives me a headache.

The worst thing about it is that scaling seems to use more CPU than running natively, and the OS has some noticeable additional latency when running scaled.
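The extra work is plausible given how macOS implements fractional scaling: a "looks like 2560x1440" mode on a 4K panel is rendered to a 5120x2880 backing store and then downsampled to the native 3840x2160. A quick sketch of the pixel arithmetic:

```python
native = 3840 * 2160   # panel pixels actually displayed
backing = 5120 * 2880  # 2x backing store for the "looks like 1440p" mode
print(backing / native)  # ~1.78x the pixels rendered per frame
```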


Odd, my main setup is an external 4k monitor and I only use it with the “large text” text scaling and I have no complaints, the text is clear and large and easy to read. Perhaps you’re also using your laptop screen as well?


No I close the laptop. Maybe it has to be a 4K monitor? Or an Apple monitor


Text honestly looks like shit on any non-4K external monitor on macOS. It's kind of crazy how bad it is compared to Windows.


Huh.

At work I have a mac mini and a Windows box, and I use three crapola Asus monitors between them, and my impression has been that macOS does a better job rendering text on said crapola monitors (the Windows box does a better job at compiling C++ in a timely fashion, though, so I mostly work on that one).


It's just a different stylistic choice. A lot of font nerds prefer the OSX choices because they try to stay true to the original font spacing without regard to the pixel grid.


Missing sub-pixel antialiasing is plain technical deficiency, not a stylistic choice. I agree arguments can be had about hinting and aligning the glyphs to the pixel grid, but not much beyond that.


It's still there, you just have to go turn it on.

But yes, I didn't know that they ripped out subpixel rendering in late 2018 by default.


Yeah they didn't completely remove it, but they did a good job of hiding it by not making it an option to turn on in the GUI. Have to use a terminal command to enable it: https://apple.stackexchange.com/a/337871

In general with a HiDPI screen I don't find any need for it. But on a low-res display like the typical 24" 1080P models it certainly helps.


Starting with Catalina, it does not fix the issue. I had to send back a Macbook Pro 16 2019 because of that and go back to Linux.


I've been looking for this for so long. Can't wait to try it out tomorrow


Completely agree! Went from a mediocre 2x1440p to high quality 2 x 4K, then back to a pair of equal quality 2x1440p.

I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations just must match up for everything to work in proper 60fps, and it gets worse if you have two external displays.

As for my home setup, I also stayed at 25" 1440p. Nice balance for work, hobby and occasional gaming without breaking the bank for a top-tier GPU.


>I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations just must match up for everything to work in proper 60fps, and it gets worse if you have two external displays.

I agree it's a bit of a mess, but USB-C monitors solve all those issues. I just plug my MacBook in with USB-C, and instantly my 4K (60 Hz) display is connected, along with external sound and any USB peripherals. No fussing with a million different cables and adapters. It's the docking workstation setup I've dreamed of for a decade.


It doesn’t solve all of the issues. The USB-C port can support DisplayPort 1.2 or 1.4 bandwidths and you have to make sure it matches up for some high-resolution monitors to work.


By equal quality 2x1440p monitors are we talking Eizo quality or Dell UltraSharp quality?


Dell UltraSharp were the 4K and now 1440p. The previous ones were Asus.


> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming

This seems like a long-standing problem with Linux, rather than a reason to dislike high-res screens.


Why not both? If I'm on linux, with no interest in changing and perfectly happy with my display, and 4k doesn't work easily on my system, why would I be interested in a 4k screen?


Strange. I'm not seeing any issues with Linux and 4K. I'm running plain Debian 10 with OpenBox on 4x 4K monitors (3x 28" in a row and one 11" 4K under the right-most), though, granted, I normally only have one web browser that follows me around across workspaces, pinned to the right monitor, a mostly maximized Sublime on the middle monitor and a pile of alacritty/xterm windows on the left-most monitor. The small monitor, whose content also follows me around, contains clipboard, clocks, Slack and monitoring.

What is the software that people are using that creates problems?


So far, I've never had an issue with KDE Plasma and 4K@60Hz on linux, once I realized that you can't just use any old HDMI cable: you need DisplayPort or HDMI2


It's a problem if you have an Nvidia GPU (forcing you onto X11) and 2 screens with different DPI. On AMD/Intel/Nvidia open-source drivers it works fine.


Yeah, I’ve always picked my GPU to work well on Linux, so I have an AMD one.


FWIW, switching between resolutions in my favorite desktop environment, Xfce, is two steps:

  # This affects every GTK app. 
  xfconf-query -c xsettings -p /Xft/DPI -s 144
The second step is going to about:config in Firefox, and setting layout.css.devPixelsPerPx to a higher value than 1.0. I really need to write an extension to do that in one click.

What is really tricky, though, is having two monitors with different DPI. Win 10 does an acceptable job with it; no Linux tools I'm aware of can handle it reasonably well. Some xrandr incantations can offer partial solutions.


Even Win10 struggles when you move windows between different DPI domains. Apps will slide in HUGE or tiny until you get past the midway point. And when the system goes to sleep, everything can go to hell. You can come back to small message windows blown up to huge sizes or windows crushed down to a tiny square. You can forget about laying out your icons perfectly on your desktop too; they'll get rearranged all the time. Even more fun when you remote into a high DPI display from a low DPI display. It actually works pretty well, but stuff will get shrunk or blown up randomly when you go back to the high DPI display.


>Apps will slide in HUGE or tiny until you get past the midway point.

This is a non-issue to me. I don't use apps halfway across 2 screens. It's just a minor bit of weirdness while dragging them.


Logically having your OS maintain a consistent UI size makes sense until you try it without.

I'm running a couple medium high density monitors alongside one of the highest density ones available. I don't scale the HIDPI monitor at all, which means when I drag windows to it they are tiny. Instead it works in two ways, as a status screen for activity monitors/etc and as a text/document editing screen. AKA putting adobe acrobat, firefox or sublime/emacs/etc on the high DPI screen and then zooming in gives all the font smoothing/etc advantages of high DPI without needing OS support.

So the TLDR is, turn off dpi scaling, and leave the hidpi screen as a dedicated text editor/etc with the font sizes bumped to a comfortable size. Bonus here is that the additional effort of clicking the menu/etc will encourage learning keyboard shortcuts.


Check out user.js, it's a built-in feature. You can declare which config values you want in a user.js file under your Firefox profile folder.
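For reference, a fragment like the one below goes in the profile folder; note that layout.css.devPixelsPerPx is a string-valued pref (the 1.5 here is just an example value):

```js
// ~/.mozilla/firefox/<profile>/user.js
user_pref("layout.css.devPixelsPerPx", "1.5");
```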


> Person who actively dislikes 4k here

I don't think the reasons you illustrated support that conclusion. You don't actively dislike the extra pixel density of a 4K display. You seem to only dislike the compatibility issues relevant to your use case.

>No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.

FWIW, I can't recall the last time I had a problem with apps not rendering correctly in hidpi mode on MacOS. Unless you've got a very specific legacy app that you rely on for regular use, it's a non-issue.

>Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming

Ah, I think I found the real issue ;-) If your linux desktop rendered 4K beautifully, seamlessly, and without any scaling issues right out of the box, I could all but guarantee that your opinion would be different.


Thanks for sharing your experience!

> My argument in favor of 1080p is that I find text to just be... completely readable.

Yes, me too. To me it's more than just readable, I find the text crisp and comfortable. I don't need anything else.


You know, I was in complete agreement with the article and I was considering a 1440p monitor or something until I saw your comment and reflected on it. The most productive periods/jobs I can remember were on i3wm with a Goodwill 900p/19" monitor and a 20" iMac 10 years old at the time. But it's because I had access to good tools then like Neovim/Atom respectively. My work now requires an RDP Notepad.exe session so there's no monitor that will help me there. I guess software tools are way more important.


Is there no way to sync files with the remote environment so that you don't have to actually use notepad.exe?


I have a 49" curved monitor. It is effectively two 27" 1440p monitors stapled together (5120x1440). It is the best monitor I have ever had. 1440p has a very decent [higher than typical] pixel density but is not "retina". Fonts look pretty smooth, but you can still see pixels if you try really hard. Overall, I do think high density screens look amazing, but the software has not quite caught up to them. The benefits are on the softer side, and if I could just have magical mega-high-DPI displays with no side effects, sure why not? As it stands, 49" curved monitor is pretty fine. It fits four windows side by side at reasonable resolutions.

Primary apps go in the middle, such as code editor, etc.. Tertiary windows, such as documentation go on the outer edges. Still quite usable, but a little out of the way for extended reading.


Hey, do you mind sharing more info on how to get the monitor? I'm looking to invest in a curved one since it's an experience I've never had. And are there retina models out there, or is it not worth it, in your view?


https://www.dell.com/en-us/work/shop/dell-ultrasharp-49-curv...

I like Retina DPI, but this is good enough. It’s not very curved, but it has some curvature. I don’t think it’s worth it. 1440p is good enough for me.


I couldn’t agree more with this. A 49” 5120x1440 curved monitor is brilliant for productivity. It’s better than two or three separate monitors. I do miss high DPI but I wouldn’t trade this type of monitor for the current batch of smaller high DPI ones.

There’s only two or three things that would make this better. A high DPI variant, more vertical space and a greater refresh rate. Given those two things, I think that’s the endgame for monitors (in a productivity context).

(I think that’s 8x the bandwidth so it’s a while away!)


I have a 49” curved too! But it's a UHD TV, so it's four 24” 1080p screens in one! Three/two portrait windows is great for reading, reducing scrolling.


OS X handles hidpi perfectly. Never had an app that didn’t display as it should.

I do agree with you that 4K under 27” isn’t necessary.


I have a 4K 27" monitor and I have had to run it at 1440p recently because that's all my Dell XPS can manage; it's usable but a noticeable downgrade. My 24" 1440p monitor at work looks perfectly fine though.


For #3, the creator of the i3 window manager is using the 8K monitor mentioned in the article, so I am pretty sure this is possible. https://michael.stapelberg.ch/posts/2017-12-11-dell-up3218k/


> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could

What issues, actually? Xmonad on Arch user here, and I find the sweet spot for me is 27-32" 4K, 1440p on the laptop (I guess 4K would be nice there too, but not sure if it justifies the increased power draw). After getting used to the increased real estate, I do feel limited on my older 1080p laptop screen; fonts smaller than 8pt (which is still all good) noticeably impact readability to the point I can feel my eyes strain faster. It did take a bit of playing around with DPI settings to get it right, though; out of the box it's not great. The Arch wiki has some great material.

The only frustration I do have (which IS super-frustrating, specifically for web browsing) is with multi-monitor setups with different pixel density - your point 2, I guess. Even plugging anything larger than 19" with 1080p into my 1080p Thinkpad is annoying.

I think it should be possible to configure it correctly, but I just gave up and end up zooming in/out whenever I do this and send windows between screens. Haven't looked into it, but maybe a mature DE like KDE or GNOME (which, if you don't know, you can still use with i3) would be able to take care of this.

Also, this is all on X11, have no idea if and how wayland differs.


4K on a desktop is just sorta silly. Now.... 1440p is a very useful bump over 1080.


My day job involves untangling SQL that was written under fire. I consume more spaghetti than a pre-covid Olive Garden. Every vertical pixel is precious for grokking what some sub query is doing in the context of the full statement.


Have you rotated your monitor 90 degrees to portrait already?


I used to when I ran dual 24" monitors! We had really sweet old IBM monitor stands that could tilt, raise and rotate. You had to be quick to grab them from the copy room before the electronics pickup. I swiped one from the still warm desk of a colleague on the way to their farewell happy hour.

So I had a '14 13"(? Might have been 15" but probably not) RMBP on the left with email and chat stuff, 24" main monitor in landscape and then another 24" monitor in portrait on the right. I think at some point I put a newer panel on the IBM mount. It was sweet.

Back then we still had desktop PCs at our actual desks so there was a KVM on the main monitor! What a time to be alive!

These days I do prefer a single-monitor workflow if possible. It's just cleaner and more convenient.


Being used to MacBooks with retina screens, 4K at 30” is perfect to me as “retina”. Anything larger needs to be 5K or 6K. 1440p is passable on <24”.


You still need to scale the resolution to make it easily readable on most common monitor sizes like 27". The end result is really good and sharp.


27" x 1440p has been my go-to for a while now. Works well without scaling between win/mac/linux, does not dominate the desk completely, high quality monitors are readily available in this resolution etc etc.


I think my dream setup would be a 27" 1440p (what I currently have at home), with a pair of smaller (19" maybe?) 1080p screens on either side set up in portrait. Basically a similar screen area to 2x27", but without a bezel right in the center of my field of view, and the 1080x1920 screens will be a good size for displaying a full page (e.g. PDF) at more or less full screen.


At that point why not just run an ultrawide at 1440p?


I actually like having multiple screens. I know I'm weird in this but I actually like running certain apps maximized.. but I wouldn't want it maximized across a whole ultrawide.

Plus, if I take a working vacation somewhere it's a lot more practical to schlep around one 24" or 27" than an ultrawide.

Also, just as an ergonomic thing, I could angle in the two outer screens a bit while not having one of those icky curved screens.


The price is probably double.


>My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density.

I'm working on an old 24"16/10 display (the venerable ProLite B2403WS) and an OK 32" 4K display with a VA panel. Both are properly calibrated.

There is no amount of tinkering that can make fonts on the 24" look good. It looks like dog shit in comparison to the 4K screen. It might not be obvious when all you got in front of your eyes is the 24" display, but it's blatant side to side.

On top of it, the real life vertical real estate of the 4K display is also quite larger.

I've never been a big 16/9 fan, but frankly, at the sizes monitors come in today and at current market prices, I don't see a reason not to pick a few of these for developing.


> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.

I haven't had this experience (MacOS, 4K monitor for 2.5 years)

> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.

shrug - 4K and 1080p seem to work together just fine for me. I've currently got a 27" 4K monitor and a 24" 1080p monitor both running off my 2015 13" MacBook Pro; the 4K is on DisplayPort (60Hz @ 3840x2160) and the 1080p is on HDMI (and it happens to be in portrait mode). I use all three screens (including the laptop's), and while the 1080p is noticeably crappier than the other two, it's still usable, and the combination of all three together works well for me. A couple of extra tools (e.g. BetterTouchTool) really help with throwing things between monitors, resizing them to take up some particular chunk of the screen, etc. - my setup's quite keyboard-heavy with emphasis on making full use of the space inspired by years of running i3 (and before that xmonad, ratpoison and others) on linux and freebsd.

> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.

That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)


> > 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.

> That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)

It is also wrong. I am a long time i3 user. Never had a problem with it, never done anything special. Most of the time I'm running Debian stable, so I even use software versions that most people consider 'old'.


> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS that can mean it renders tiny, or that the whole things is super ugly and pixelated (WAY worse than on a native 1080p display)

Never happened to me in 4 years, see below. That said, I barely use any graphical programs besides kitty, firefox, thunderbird and spotify.

> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could

This is just not true. I have used the same 32" 4K monitor for 4 years running NixOS with bspwm (a tiling window manager, which does even less than i3) on 3 different laptops - Thinkpad x230 (at 30 Hz), x260 and x395 - and it all worked completely fine.

I used a script like this to setup monitors, I would run it every time I would change my monitor setup (e.g. on the go): https://github.com/rvolosatovs/infrastructure/blob/0e17a1421...

It depends on a very simple tool I wrote because I was sick of `xrandr`: https://github.com/rvolosatovs/gorandr , but `xrandr` could easily be used as an alternative.

Recently I switched to Sway on Wayland and it could not be smoother - everything just works with no scripting, including hot-plug.

> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.

Indeed, screen size is way more important than resolution. In fact, even 4K at 27" seemed too small for me when I had to use that in the office - I would either have to deal with super small font sizes and strain my eyes, or sacrifice screen space by zooming in.


> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.

I have been running two 1440p displays on a 4K retina MBP and the experience has been impressively seamless. Both with Catalina (the latest) and High Sierra.

The biggest problem is with Apple port splitters; they are crap, and sometimes monitors wake from sleep with a garbled picture.


So I'm a little nuts in that I run 2 x 27" 4K monitors side by side with no scaling. 27" is about the smallest I can tolerate 1:1 pixel sizes.

Since aging has forced me into wearing reading glasses, I wear single vision computer glasses that are optimized for the distance range of my monitors' closest and furthest points.

Because I don't have scaling enabled, I don't get any of the HiDPI issues that I've gotten on my laptops with Windows.

I have found that I am still wanting for even more screen real estate, and for a time I had a pair of ultrawide 23" monitors underneath my main monitors, but it created more problems than it solved and I recently went back to only two monitors.


I do the same with three but all in portrait mode.


That's an interesting idea. I should look into that when I eventually upgrade. A stubborn part of me left the monitors in landscape because I occasionally play games, but I end up never doing that on my desktop.

I prefer to use the same model of monitor when doing a grid, so I don't think I want to add to my existing setup: my monitors are discontinued, and they're DisplayPort-only for 4K60.

I think they have more than a few years of life left in them, but I'll definitely look into a configuration like yours at upgrade time.


I find it easier to look up and down than side to side. Too many ski and skateboard crashes with whiplash when younger.


> Good luck ever having a decent experience plugging in a 1080p monitor.

A 4k monitor is now $300 (new). Used are even cheaper.

> Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow

I use Gnome 3.36 and many HiDPI issues I was having before are now gone, without any extra configuration.

> My argument in favor of 1080p is that I find text to just be... completely readable.

It is readable but fonts are pixelated, unlike 4k.

My only problem is that macOS has some artificial limitations when it comes to using non-Apple monitors. Like a lower refresh rate. My solution? Use Linux.


$300 is a lot of money where I'm from. And they aren't available at that price here, anyway.

What do you mean, fonts are pixelated at 1080p? Whether you can see the pixels probably depends on pixel size. I certainly can't see them on my 23" LG monitor unless I try really hard.


> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.

I am using Emacs and tmux under Linux, in the Gnome desktop. Gnome has HiDPI scaling. For me, that works fine with a 4K 43-inch display. The thing you need to watch out for is to get a graphics card with proper open source (FOSS) driver support. Some cards are crap and don't come with FOSS drivers. You can get them to run, but it is a PITA on every kernel update. Don't do that to yourself; get a decent card.
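In case it's useful to anyone, the Gnome side of this is configurable from the command line too. A sketch, assuming Gnome 3 (the factors 2 and 1.5 are example values, not recommendations):

```shell
# Render the whole desktop at 2x (Gnome's scaling-factor is integer-only)
gsettings set org.gnome.desktop.interface scaling-factor 2

# Or scale just the text, which accepts fractional factors
gsettings set org.gnome.desktop.interface text-scaling-factor 1.5
```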


I miss i3 so much. But I've succumbed to laziness and have been using my various MacBooks. Agree that it's a huge productivity gain, more so than any font improvements.


Another linux+i3 user here, I've not tried 4k yet but you confirmed my suspicions.

I did a lot of research before buying an xps-13 and went with the 1080p version due to basically all the reasons you just stated + poor battery life and video performance.

I have hope for the future though... what would really make transitioning easier is a way to automatically upscale incompatible programs. Even if it means nearest-neighbor scaling, at least it will make them usable on super-hi-DPI monitors.


I have a 4K monitor on my laptop and also as an external monitor, but I have no problem with Linux (using Debian testing with Gnome 3). I can easily combine it with 1080p monitors. Everything works out of the box.

I still switched to 1440p, as 4K is just better-looking 1080p. You cannot fit more information on the screen with scaling, and without scaling everything is too small. I work as a backend developer, so space is more important to me than visual quality.


Why not scale, for example, 50%? The same holds true for 1440p (or are you just able to suck up things only getting a bit smaller?)


> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could

Happy i3 + Arch Linux 4k monitor user here for over 2 years. I only set an appropriate Xft.dpi for my monitor size/resolution in ~/.Xresources once, and that was it.

Can't imagine it being easier than this, tbh.
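For anyone wanting to replicate this, the whole configuration is on the order of one line (the value 192 below is an assumption, roughly 2x scaling for a 27" 4K panel; pick your own):

```shell
# Append the DPI hint to ~/.Xresources; Xft-aware toolkits read it at startup
echo 'Xft.dpi: 192' >> "$HOME/.Xresources"

# then apply it without re-logging in:
#   xrdb -merge ~/.Xresources
```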


I’ve had dual 4k monitors under Ubuntu Mate for years without problems so I’m gonna have to disagree. You can pry them out of my cold dead hands.


I'm not going to argue your preferences, but then why don't you get one 50 inch 4k display? That's about four of your current displays at similar density on a single cable. And probably at a similar price point, too.

Or, if you are using decent graphics hardware, you could even get two of them and have four times more display space than you have now.


I agree with all your points; however, I've found the Mac is extremely variable-DPI friendly. I think games with custom UIs (Europa Universalis IV comes to mind) are the only things that haven't adapted, and it's hardly a problem if you set the scaling to "large text" or whatever - just a little pixelated, like you would see on a 1080p screen.


> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.

It might depend on the program, too. Some might only work in pixels. Fortunately, it is usually not a problem if you are trying to run a program designed for the Game Boy; the emulator should be able to scale it automatically, subject to the user setting. I don't know if any X server has a setting to magnify mouse cursor shapes, but it seems like it should be possible to implement in the X server. Also, it seems SDL 1.x has no environment variable to magnify the image.

My own program Free Hero Mesh (which I have not worked on in a while, because I am working on other stuff) allows icons of any square size up to 255x255 pixels (a puzzle set may contain multiple sizes, and it will try to find the best one based on the user setting, using integer scaling to grow them if necessary), but it is currently limited to a single built-in 8x8 font for text. If someone ports it to a library that allows zooming, that might help, too. However, it is not really designed for high-DPI displays, and it might not change unless someone with a high-DPI display wants to use it and modifies the program to support a user option for scaling text too (and possibly also scaling icons beyond 255x255); then I might merge their changes.

Still, I don't need 4K. The size I have is fine, but unfortunately too many things use big text; some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.


> some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.

Not that it excuses bad UX, but you might consider keeping your browser window at something below full width. I find this more comfortable anyway.

Total aside: I've noticed Windows and Linux users tend to keep their windows fully maximized, whereas Mac users don't. Doesn't apply to everyone of course, but enough to be noticed. This was true even before Apple changed the behavior of the green Zoom button, and I've always wondered why.


I guess macOS' window management is bad enough that treating it as a pile of paper is the only way to manage.


I find Spaces/Expose/Mission Control (or whatever they call it these days) way more comfortable than dealing with Windows. I especially like that if I hide a window, it doesn't pop up when using Mission Control. Opening the Windows equivalent shows me every window, even stuff I minimized/hid. It feels cluttered.


I don't see how alt-tabbing through maximized windows on macOS is different from Windows and Linux like the OP is suggesting. Though I do keep my browser at half-width on my ultrawide monitor because it's somewhat of an exotic/untested aspect ratio for websites.

Also any power user that cares will use a tool like Divvy on macOS for arranging windows with hotkeys.


For one, if you have multiple max-size windows of a single application in Mac OS X, alt-tab doesn't go between those windows. command-tab does.

Alt-tab goes between applications


> I've noticed Windows and Linux users tend to keep their windows fully maximized

Interesting, I've noticed the exact opposite. Mac devs, especially younger ones, tend to have full-screen IDEs and browsers and constantly flick back and forth between apps. My theory was always that Windows and Linux users had gotten comfortable with the desktop metaphor while a large percentage of newer Mac users grew up using iPads which were all full-screen, all the time.


Quick note I perhaps should have clarified, I wasn't thinking about the Mac's "full screen mode". This was something I noticed about other students in my high school a decade ago (why it's coming to mind now, I have no idea), before full screen mode existed on Mac.

It used to be that if you clicked the green button on Mac, most apps (not all apps, for weird aqua-UI reasons, but certainly web browsers) would grow to fill the screen without outright hiding the menu bar and dock, just like the maximize button on Windows.


My experience pre-full screen on Macs was that the green button would do just about any random thing except make the window fill the screen. It would certainly change something; usually it would fill vertically (but not always), but almost never horizontally.

To this day I still rarely press that button because of years of it doing nothing but unpredictable nonsense.


What X server? You should be using Wayland. X does not intrinsically support automatic scaling. Wayland does.


The protocol is irrelevant. It isn't an extension to the protocol, and would not be seen by clients. It is an implementation detail.


Technically, that's true, but "Wayland is inherently better at security/DPI scaling/other" is one of those cultural myths that eventually come true because of the people who believe in it. It would be possible to add these improvements to the X server, but no one wants to maintain or improve the X server anymore. All the developer effort is behind Wayland. So to get those benefits, you have to use Wayland.


I'm on Gnome and use fractional scaling. At 2x everything got too big, but 1.6 looks OK. It's actually not at the app layer; it's the screen that is scaled up. Although some low-level programs can have issues with mouse pointer position if they don't take the scaling into account.


i3 user on a 4K screen here; it has worked fine for me since 2014 (with the exception of the random super-old Tcl/Tk app): https://i.imgur.com/b8jVooO.png


Since nobody else has mentioned it, if you like i3 you should give Sway a test drive. Wayland still has some rough edges (screen sharing, for example) but it supports high DPI monitors with fractional scaling almost out-of-the-box.


> I genuinely think 4k provides no real benefit to me

Could you clarify when in your life you've had a 4k screen with good OS support, so that we know what experience you are speaking from when you say that?


One can definitely still see the pixels in a 4K 24'' monitor. That is not the point.

But I do agree with points 1 and 2 (they tend to work better on windows, though).

On the other hand, what about 3? I would find it ridiculous if it took you more than 5 seconds to enlarge the DPI (without multi-monitor) even on the weirdest of X11 WMs. X11 is designed for this...


Same here.


I feel like HN as a whole wants blogs to come back, and part of that is well-written exposition (the technical details) followed by near-outlandish opinions that tend to generate discussion. It's more fun, it leads to more conversation (here we are!), and it clarifies our own values.

I think three monitors is distracting, but that whole community discussion gave me the opportunity to compare how I use a multi-monitor setup compared to others.

I generally dislike how fonts are rendered at floating-point scaling or lower resolutions, but I too just learned to live with it over years of ignoring it. Unfortunately, now I can't unsee it!


> I think three monitors is distracting

Could you elaborate on this? Specifically, do you feel that having three/multiple monitors is necessarily distracting (no matter what you do with them) or just encourages distraction?

My experience is the latter, and that if I am disciplined and only put windows from one task at a time on all three, then I have no more temptation to be distracted than I usually have at my internet-enabled glowbox, but maybe I'm an outlier.


For me, I can't use three monitors. It just takes too many brain cycles to process and remember "where that window is".

I can process two fine: one "the workspace" the other "the reference/product/outcome/tests".

I tried really hard to teach myself structure, but with three monitors I find myself always losing windows, losing the cursor, etc. Hacks like "find my cursor" (not default on Linux) help, but I'd rather have a clean mental map of where my cursor, focus, and windows are at all times.

What stuck best was having the third monitor hold the API/documentation/reference, but still, the mental power needed to keep all my whereabouts mapped in my head was just too much.

Also note that I went from one to three monitors, so this is not "just too used to two monitors to ever change" it was the other way around.

Ubuntu has nice tiling features without requiring tiling for everything ([meta]-[→] and [meta]-[←]) that allow my now-ingrained use of two "monitors" on my laptop screen when I'm somewhere without a second monitor. Another thing I disliked about my three-monitor setup: it does not map in any way to a single monitor. Using virtual desktops worked, somewhat, but was still too different.


My experience is that my attention span sucks enough that there's not a functional difference between encouraging distraction and being necessarily distracting.

I do basically all my work on my laptop without an external display. I use two displays for some specific tasks where I need to look at two things at once.


As for me, three monitors is just too much information. There's no thing I do that requires me to see so many things at once. I do make extensive use of virtual desktops instead.

I like one large 4k monitor right in front of me, and the window manager is set to allow splitting windows in thirds. Laptop off to the side with music, terminal, and other auxiliary stuff.


So it needs more discipline to use them, and I need to apply discipline for so many things already. I'm much better off with just the one screen.


>I feel like HN as a whole want blogs to come back

Blogs never left— they may have withered to near-nothingness after the same exact people who claim to want them abandoned them, but they’re still here.

I may be wrong but I believe the people who claim to want blogs abandoned them for sites like Hacker News.


HN is a link aggregator that gives a lot of blogs more exposure than they would otherwise get. I would say something like Twitter (micro-blogging and social networking) has contributed to decline of blogs.


If you are going to blame anyone, it's easiest to blame Google's failed EEE attempt on the blogging world with sunset of Reader and attempt to push everyone to Google+. There were so many blogs and meta-blogs I saw directly lost in that fumbled transition. In trying to build their own Facebook, Google did irreparable harm to the blogging world in the process.


I'm certainly no fan of Google, to say the very least. That said, they've kept Blogger relatively unchanged, which is nice.

At least count your blessings that making blogging platforms must not impress promotion committees nearly as much as writing new chat apps.


"Relatively unchanged" is such an interesting POV. It's been in such a stasis that I pretty much assume it is as dead as LiveJournal. Maybe not in literally the same way that LiveJournal got shuffled around in a shell game to some strange Russian owners, but in a very metaphorical sense.

At one point Blogger in the early oughts was ahead of the pack in leading mainstream acceptance and usage of blogs. At one point my feed list was over half Blogger sites, but today I can think of only a few Blogger-hosted blogs left at all in my feed list, none of them have been updated recently, and those that have updated recently were claimed by spammers and (sadly) dropped from my list.

I can't imagine there's much more than a skeleton crew at Blogger keeping the lights on, and I would be unsurprised, if in pushing the metaphor to the LiveJournal thing, to learn that they were being kept in the Bay Area equivalent of Siberia by some half-mad state actors that need Blogger's zombie to keep its current undead behavior in some strange psy ops nightmare.


I just can’t function optimally without 3 monitors:

1 for my full screen IDE: 3 files side by side, multiple terminals

1 with 2 browser windows: my product (a web app), and another with specs, stack overflow, google

And my macbook’s screen with slack or email.

Whenever I have 2 monitors or less, I find myself switching windows around all the time! With 3 I just move my eyes and mouse.


I love my 3 monitor setup. In 20 years of coding, I finally reached peak productivity in my workspace. No switching between windows, ease of throwing a window aside for monitoring, laying out information and tools side to side. It’s incredible once you readjust your habits and learn to use that new space. I compare it to having a large, wide desk when you’re working on something. Who wouldn’t want that?

I was one of the people who worked on 15” retina MBP everywhere, even in the office where I had 30” on my desk, to not have to readjust and keep optimal habits for the screen size. Now I simply refuse to work on a laptop at all, it feels like being trapped into a tiny box and I get literally claustrophobic :)


> I compare it to having a large, wide desk when you’re working on something. Who wouldn’t want that?

I have used between 1-3 monitors over the last decade, and there sure are advantages to having 3 for certain tasks. However, I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen). This decreased my productivity.

Perhaps not everyone has this weak spot, but for me using multiple monitors has a downside from an attention/focus perspective.


Coronavirus has robbed me of one of my favorite productivity hacks, which is coding on old Thinkpad with 4:3 ratio display, at a coffeeshop or library with Internet turned off. No distractions, no multitasking, just pure focus on a problem.

I miss it.


My friend, when I moved into my new apartment I went without WiFi for 2 years and did nothing but code on my T43. In that time I managed to rewrite (a variant of) Age of Empires 2.


> (a variant of) Age of Empires 2.

What's this?


https://github.com/glouw/openempires

I chip away at it now every other night. It's mainly a simple multiplayer client. Britons is the only civ.


My home internet went out yesterday (I WFH full-time).

“No big deal”, I thought. I’ll just go to Sta... Dammit COVID!


Which old thinkpad?


With a 4:3 ratio, probably a T60 or a T61.


Right on. T61, with one of the best thinkpad keyboards ever.


Those keyboards: you could throw anything at them and they would still give you that travel, click, and satisfaction.


IMHO, two monitors is an amazing upgrade. One screen for code, another for reference material or for the app being debugged. Better than one huge screen in many cases, as it’s two 16:10 spaces that you can maximize things to.

But with a 3rd monitor, you’re well into diminishing returns, it may even end up being a distraction if it becomes a Slack/Mail/web screen.


I think most of the need for more monitors can be solved with a tiling window manager.


> I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen)

I'm currently using two external monitors, with my laptop docked and closed. I find the "dedicated screen for <distraction>" was a problem for me when I had my laptop screen open, because it's a different size/resolution/position than my actual monitors. On the other hand, I never have that problem with my dedicated monitors - in my mind they're a part of the same "workspace" because they're the same size, resolution, and positioned together - so I could see myself going to 3 desktop monitors one day.


I have two monitors right now and wish I had a third. One for code, one for documentation, and one for running whatever I'm working on (website, android emulator, etc). Currently I have the code monitor vertical and swap between workspaces on the horizontal monitor for the running thing and documentation.


I've solved this problem by having my email client (actually it's Slack in my case, but the same principle) and terminal share a screen. This works pretty well because I rarely want to use my terminal and chat at the same time.


When I need to concentrate I just turn off my two side ones and focus on the middle one.


I understand this and I don't want to argue against anyone's preferences. You know what's best for you.

I'm just pushing back against these out of touch fads. Most developers worldwide don't have a three-monitor setup. There is no proven correlation between quality software and 4K displays or mechanical keyboards (to name other fads). More importantly, the best devs I've known -- people I admire -- used tiny laptops with tiny displays, and shrugged when offered even a single external monitor; it just wasn't a big deal for them.


Mechanical keyboards don't make you a better coder.

But the only thing that will is writing lots of code -- over years and decades. And about 12 years ago, I started running into this anti-feature of human physiology known as "aging". And whereas in my think-I'm-so-l33t 20s I could bang out code on crappy desktop and laptop keyboards, by my 30s they were turning my hands into gnarled claws.

The remedy for this, for me, was a keyboard with Cherry MX switches. The crisp feedback let me know when a stroke was registered, so I unconsciously pressed each key less hard and was able to type faster with less pain.


Yeah, I wouldn't say having a mechanical keyboard makes your code any higher quality - that'd be pretty silly.

I think in general the thought is, if you care enough about your craft that you seek out refined tools, that care will be reflected in higher-quality development. Whether that's true or not, I don't know, but I'm inclined to believe there's a correlation.

I mean, it would be weird to visit a professional carpenter's house and see Harbor Freight tools, right?


Thanks for the reply. I think there is little to no correlation, but like the opposite opinion, I've no proof other than the anecdotal: the best hackers I've known didn't care about these things.

Other bizarre opinions I've read from Atwood and his followers: that you should be an excellent typist (this is also related to owning a mechanical keyboard). No. Just no. Typing speed is not the bottleneck when writing software. The bottleneck is my brain. I've never seen a project fail because people typed too slowly.


I do think there's a "CrossFit" mentality among the typer-coders who swear by mechanicals and end up with wrist braces - a kind of "more is more" approach that drives them to write lots of code, put in lots of hours, memorize innumerable details, and min-max their output in Taylorist fashion. It's optimizing for reps, versus mobility, stability, flexibility.

I have let my WPM drop a fair bit over time. I'm still relatively young yet, but I see no reason to go fast when I realize that most of the typing amounts to disposable bullshit. It's better to spend time thinking and developing thought patterns, and then just type a little bit to jog your mind and clarify. I allow myself to write some cheap code, but the point of that is to sketch, and the sketch should be light and quick, for the same reason that artists will say to favor long, confident strokes instead of chicken-scratch markings.


My main gripe is that as time has gone on and I've racked up the RSIs, the brain-to-text latency has gone up notably.

This scares the shit out of me. I'm not in the older subset of programmers (<30 atm), and this has gotten to the point where the latency actually affects my workflow.


I think the Python ethos applies directly to typing speed: "code is more often read than written".

I agree: if your typing speed is the bottleneck in getting code written, perhaps you should be coding smarter, not harder.

I think there is some wisdom that you should try to be a "good" typist, in that better typing skills reduce the risk of injury (RSI), but that's self-care/ergonomics, and while still very important, there are plenty of good software developers that hunt-and-pecked their way to an early retirement (and/or multiple carpal tunnel surgeries).


I've had a phase of getting mechanical keyboards, but I always found myself typing slower on them. The added travel time, even on the "low profile" mech keyboards, was slowing me down. I am back to scissor switches and I couldn't be happier, although I prefer low-profile keyboards in general. One of my favourite keyboards is the butterfly MacBook keyboard, but I know it has mixed opinions.


> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.

I agree except with a caveat: the mechanical action of typing, formatting, refactoring, fixing typos and missing semicolons, and moving code around actually distracts the brain from the higher level task at hand. And when the brain is already the bottleneck, I don't want to make it worse by wasting brain cycles on mechanical drudgery.

As one might expect, I feel far more productive when I'm using languages and tools that require me to type less and refactor & re-edit code less. I think the language would matter less if I could just wish code onto the screen. Until then, learning to touch type (with as few errors as possible! not necessarily as fast as possible) and use editor features to make it more effortless is the next best thing.


It’s the opposite for me, having fewer monitors makes it harder to find a window. I have to go through alt-tabbing slowly to get to the one I’m after.

With three monitors, I know exactly where my windows are. If I have more than three windows that’s annoying, but I keep the extra ones on the middle monitor to simplify things.


Did you reply to the wrong person? They were talking about typing speed, not monitors.


Yes - this was the comment I was replying to: https://news.ycombinator.com/item?id=23561120


> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.

I see typing like the ability to do mental arithmetic: being able to do it well isn't the thing that's directly valuable, but it removes distraction from the activity that is valuable, and that ends up making it valuable as well.

Another way to look at it: the faster you think, the faster you need to type in order for it not to become the bottleneck (during the times where you're actually implementing an algorithm as opposed to designing it). Of course, that's not just a function of raw typing skill, but also of the tools you use and the verbosity of your programming language. (An amusing corollary of this is that for society at large, it's useful to have great hackers who are bad typists: they will be tempted to develop better tools from which the good typists can also benefit!)

I've never known a great developer who did hunt-and-peck typing though. I do know great developers who have their own typing system. They simply never bothered to learn the "proper" way to do ten finger typing, and that's fine (unless those typing systems are worse for RSI, which was the case for me personally).


I understand what you're saying, but that's simply not my experience (either with myself or observing others).

Note Atwood claims you must be an excellent typist, training yourself to become one. I find this fetishization of a mechanical skill bizarre. I'm not advocating clumsily struggling with the keyboard like an elderly person, but past the point of "I'm a decent typist", I find that's enough.

I find there's no correlation between the problem-solving ability needed to be a great software developer and being a fast typist of the sort Atwood et al advocate.

I file this under "weird things some programmers believe without evidence" ;)


Yeah, I think we agree on the excellent typist point. It needs to be fast enough, but I suspect what happens is that pretty much everybody using a computer sufficiently to become a great developer reaches "fast enough" naturally through implicit practice.


I agree, but would say that in real life most of the people I know who like mechanical keyboards like them for hand strain reasons. They find them more comfortable to work with. While the code written is the same on both, that things like 4k monitors (eye strain) and mechanical keyboards (hand strain) are better for the long term health of the programmer. I've not gotten on the 4K train, but do like having my keyboard for that reason.

Multiple monitors though is purely personal preference I think. While having the documentation on another screen is something I personally find useful, if anything it probably makes me lazier about trying to remember things.


> that you should be an excellent typist

I think it's valuable to be able to type effortlessly without having to think about it too hard. Typing is a distraction that takes brain power away from the important things.


Typing speed is probably not the bottleneck but I found that since I started touch typing I get a lot less headaches because I don't have to look up and down all the time.


Unlike carpentry, the quality of our tools (keyboard and monitor, specifically) doesn't affect the quality of our output.

I think in many cases, people hide the fact that they're not competent behind high-cost professional tools, because laymen use them as a proxy for talent they cannot otherwise evaluate.

I think that's also why many exceptional programmers just use a 5-year old laptop -- they don't need to compensate.

A day-trader having 12 monitors mounted on the wall doesn't make him profitable.


I still use my 11-year-old ThinkPad together with my newer ThinkPad. I use both laptops alternately. My productivity does not increase when I use the newer one.

That said, my extra 20-inch monitor helps me visualize.


I've seen plenty of professionals using Harbor Freight tools, usually not carpenters but the tile saws and wrenches seem popular for professional use.

The correlation seems more likely to me that if you can afford the fancy tools then you've already had some level of success. Though there are those new mechanics who bury themselves in a mountain of debt buying a whole chest full of Snap-On stuff...


A professional will eventually wear out even the best tools. And since they use those tools every day, they can keep an eye on the wear. So it's not that crazy for them to use relatively cheap stuff.

Plus, a tradesman will also sometimes lose tools, drop them in places they can't recover them from and so on.

On the other hand, the day I want to fix some issue at home, the last thing I want is the tool I use perhaps once a year to be an additional source of issue, because it involves a round trip to the store.


Well the old Harbor Freight stuff was REALLY good, then they sold out and it turned to crap.


In the last few years they've addressed that. Their Chicago Electric tools should generally be avoided but the Vulcan line of welders and Bauer/Hercules hand tools are all perfectly serviceable for light/occasional use.

The issue with heavy use is not that they don't work but that they're heavier, less ergonomic, and less robust/repairable than the name brands; if you can afford the name brands and will be using the tool until it breaks, fixing it, and then using it more then you'll want to go with the name brands.


20 years ago, when the GeForce 2 MX appeared, I switched to 2 monitors; within the next 6 months the software department (that was the name) switched to 2 monitors by contagion. I was in the infrastructure department; they just saw the benefits. Since then, I have never worked with fewer than 2 monitors. I can productively use 3 if I have them; otherwise (and most of the time) I use 2.

I am not a good developer, it is not my job, but I started coding on a Commodore 64 in text mode, then I did Cobol and FoxPro for DOS on an 80x25 screen with no problem. But when larger monitors appeared, I used them; when the possibility to use more than one monitor appeared, I used it. It is a case of technology helping you - not making you better, but helping: I am more productive using 2 monitors than limiting myself to just one. Because of this, I use the laptop (1366x768 screen) only as a portable email tool; everything else is on a pair of 24" monitors, in the office or at home. Sometimes I pull a monitor from another desk (in the office) or another computer (at home) when I do specific work that benefits from 3 monitors, but it is not a matter of preference, just specific use cases where 3 is better than 2.


My favourite dev environment was a 7" Android 4 running Debian. I got plenty done with an external keyboard.

I bang away at my 13" 2013 MBA these days and the only real gripe I have is the lack of a delete & backspace key combo: I've never gotten comfortable without it.

That said, the only reason I could possibly use more screen real estate is web debugging. But to me that's more of an indictment of the environment I'm "coding" in.

The only time I ever needed two monitors was back when I was writing 3D games on a 3dfx (before Nvidia head-hunted their engineers) and needed to debug something while running full screen.

While I understand this argumentation, to me, monitors, their size & their number have always been pretty much... meh. Instead it's the quality of the monitor itself (refresh rate, contrast, brightness) that matters.


> My favourite dev environment was a 7" Android 4 running Debian. I got plenty done with an external keyboard.

Wow, that's hardcore! I don't think I could do it though. Thanks for sharing your experience!

Which 3dfx games did you work on, if you don't mind sharing? (I never owned one; I was a TNT fanboy myself.)


> ... the only real gripe i have is the lack of delete & backspace keys combo

I’m not sure I follow. Are you complaining about the lack of a dedicated Delete key on Macs, having to use only one key for both Backspace (Delete key) and Delete (Fn + Delete keys)?


That personally bothers me a lot, as does the lack of home and end keys. Yes I know there are key chords to accomplish the same function, but having a dedicated key as part of your workflow makes a big difference. Maybe if I worked on my Mac 100% of the time I wouldn’t mind, but I only use a Mac about 20% of the time and it is incredibly infuriating.


Agree with other commenter who said you know what's best for you. Good job on iterating toward an optimal setup!

But I will tell you why multi monitor setups aren't the best for me. It doesn't feel like having a nice big desk to work at; rather, it's like having a separate desk for each monitor, and I have to move from one to the other to use it. With more than one centered monitor, I have to move my head to look between them, or my entire body, so that I can face straight towards whichever monitor I'm currently looking at. I've tried going back to multi monitor setups many times and every time I get tired of it faster due to straining my neck, eyes, elbows and shoulders with all that turning-to.

For me, it's one very nice monitor, with my laptop plugged in in clamshell mode (although now I leave my rMBP cracked so I can use TouchID).

I've also been using a window manager (Moom) with hotkeys to be able to set up three vertical windows on my screen. That seems to be the sweet spot for me: I can have multiple different code editors, or editor+terminal+web, or throw in email/slack/whatever into the mix. (I can also split a vertical column to two windows to achieve a 2 row x 3 column layout, and lots of other layouts, 1x1 vert/hor, 2x2, centered small/large...) I feel like I've arrived where you're at, my perfect setup!

I also still enjoy the 13" rMBP screen, although I can't get to 3 columns, and lately the keyboard hurts my wrists after extended usage. I use a Kinesis Freestyle 2 with the monitor+rMBP which has been absolutely fantastic for typing ergonomics.


> Who wouldn’t want that?

I don't want that. There was a time when I used multiple monitors, but I've found that just working on a laptop works better for me. It's less distracting, and I find switching between windows to be both faster and less disorienting than turning my head to look at another monitor.

I can definitely understand other people preferring multiple monitors, but not everyone has the same preferences.


I appreciate your favour. I am probably an exception, but I like to code on just a single 15-inch MacBook. I switch screens by pressing key combinations. I believe it's faster, but also more convenient than moving your head around constantly. For me, everything must be accessible by various key combos; once I have that working I hardly need to use the trackpad or mouse anymore. The truth is that most people on my team work with dual or triple screens.


I have always used a single monitor and I am productive like crazy. I mainly need a code editor and terminal. Sometimes switch to a browser and back. And that's enough. More monitors doesn't automatically imply more productivity IMO. Maybe in some specific cases. You can't focus on all monitors simultaneously anyway.


I desperately want to use the new iPad Magic Keyboard with my iMac.

I LOVE the trackpad & keyboard combo on my MacBookPro. For a while, I was using Teleport to use my laptop as the input device for my iMac.

I do occasionally use the Magic Mouse for drawing (direct manipulation) tasks.


I got a 6 monitor stand for my home office, filled it with 22-24" screens, and never looked back. It is phenomenal.


How's the electric bill?


Offset by less furnace run time I'd bet.


Have you used a 4K display, for at least a few days? If you have, I still disagree with you, but if not, I’m going to completely ignore your opinion, because I find the difference in how pleasant it is to use a good screen just so vast. Sure, you can do things on a lousy monitor, but it’s terrible and you’ll hate it. :)

(My first laptop was a second-hand HP 6710b at 1680×1050 for 15″, and that set me to never accepting 1366×768. So my next laptop was 15″ 1920×1080, and now I use a 13″ (though I’d rather have had 15″) 3000×2000 Surface Book, and it’s great. Not everyone will be able to justify the expense of 4K or similar (though I do think that anyone that’s getting their living by it should very strongly consider it worthwhile), but I honestly believe that it would be better if laptop makers all agreed to manufacture no more 15″ laptops with 1366×768 displays, and take 1920×1080 as a minimum acceptable quality. As it is, some people understandably want a cheap laptop and although the 1920×1080 panel is not much dearer than the 1366×768 panel, you commonly just can’t buy properly cheap laptops with 1920×1080 panels.)


> Have you used a 4K display?

I'll go one step further: I used the LG 27" 5K Display for two whole years before returning to a 34" Ultrawide with a more typical DPI.

Obviously I preferred the pixel density and image quality of the high-DPI screen, but I find myself more productive on the 34" Ultrawide with a regular DPI. (FWIW, LG now has a newer pseudo-5K Ultrawide that strikes a balance between the two).

I look forward to the day that monitors of all sizes are available with high DPI, but I don't consider it a must-have upgrade just yet.

Also note that Apple made font rendering worse starting in macOS 10.14 by disabling subpixel AA. Using a regular-DPI monitor on Windows or Linux is a better experience than using the same monitor on macOS right now. If you're only comparing monitors on macOS, you're not getting the full story.


Just want to second the use of an ultrawide (3440x1440). It's such a better experience all around. I can't go back to a non-ultrawide monitor.


I also have 3440x1440 on my Dell monitor at home, and I love it.

My work monitor is a really nice 27" 4k LG monitor, which a coworker picked out. He's a real monitor specs nerd and made a lot of assertions like the OP. The scaling issues are endless and really bother me, and I don't notice the higher PPI at all. I much prefer the ultrawide Dell - it gives me a feeling that I don't even need to maximize my windows and I can still have lots of space.


I upgraded from dual 34" ultrawides to one 49" super ultrawide and won't look back. 5120x1440 on a single monitor at 120Hz.


That's just two monitors in one, except you cannot rotate them into a two-pane portrait configuration (effectively 2880x2560). Nice that there's no division down the middle, of course.


It takes up zero desk space (one monitor arm that lets me adjust it anytime I want), and I don't need to rotate it. I've never found that a useful thing to do.

On the other hand, when you're done working it's amazing for flight simulators or other games that support the aspect ratio properly.


But the DPI is low, about half of the retina DPI the article author is on about.


I used a 4K display at work for about 18 months. I still won't trade my 1680x1050 monitors for anything else until they die.

I don't mind the pixels, as long as the font rendering system is good.

Of course I understand the need for a Retina display if you're working on macOS...


Not sure I follow. Are you saying that font rendering on macOS is bad?


Not parent poster but, as somebody who works on macOS, my problem is that the switch between the high-density laptop screen and a “pixelated” external one is extremely jarring. At one point I had an additional, average Dell monitor and every time I moved my eyes I cringed. After a couple of days I just removed it - better to have fewer screens than forcing my eyes to readjust every few minutes.

So yeah, if your primary device is a modern Mac, you really want a high-res, high-density screen.


Other commenters on this thread pointed out that Apple reduced the subpixel rendering quality when Retina displays were introduced.


Yes, Apple never completely embraced the somewhat hacky techniques of font rendering (hinting and subpixel AA), thus their fonts always appeared less crisp.

Desktop Linux had the same problem for a while.


Not necessarily "bad".

Font rendering on macOS is more "authentic" to the original shape at the expense of clarity from hinting/pixel-snapping.


A 4K display isn't bigger, though. It only has a higher resolution; the size of all graphic elements in centimeters stays the same.


That’s the whole point in question: that higher resolution for a given size is an extremely good thing.


Not really. There is an end point. Also, resolution is not free; it's a tradeoff: pushing more pixels on screen means more work for the GPU (or CPU in some cases), plus more loading time and space for all those high-DPI resources...

At this point I clearly prefer lower latency and higher framerate over more pixels.


Sure, to a point. But the step-up in question here (1080p to 4K, at sizes like 15–27″) has clearly visible benefits.

And sure, resolution isn’t free, but at these levels that was an argument for the hardware of eight years ago, not the hardware of today. All modern hardware can cope with at least one 4K display with perfect equanimity. Excluding games (which you can continue to run at the lower resolutions if necessary), almost no software will be measurably affected in latency or frame rate by being bumped from 1080p to 4K. Graphics memory requirements will be increased to as much as 4×, but that’s typically not a problem.
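To put the "as much as 4×" graphics memory figure in perspective, here's a rough back-of-the-envelope sketch (my own, not from the thread), assuming an uncompressed 4-bytes-per-pixel RGBA framebuffer:

```python
# Rough framebuffer size: width * height * bytes per pixel (RGBA8 assumed).
def framebuffer_mb(width_px, height_px, bytes_per_pixel=4):
    return width_px * height_px * bytes_per_pixel / 2**20  # MiB

print(f"1080p: {framebuffer_mb(1920, 1080):.1f} MiB")  # ~7.9 MiB
print(f"4K:    {framebuffer_mb(3840, 2160):.1f} MiB")  # ~31.6 MiB, exactly 4x 1080p
```

Even with double or triple buffering and per-window compositor surfaces, that's tens of MiB on GPUs that ship with gigabytes, which is why the cost is typically not a problem on modern hardware.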


When my monitors die (they are already 12 years old) I'll switch to 120Hz or more, but I am still not convinced by the high-dpi monitors I've tried.

Also, I don't like using non-integer resolution scaling, the results are always a bit blurry/unpleasant, even on high-dpi monitors.


That's true, undeniably. But I think the better tradeoff still is to go up in size: IMO the optimal monitor size is 38". Big enough, but not too much head turning. Would I get a sharper 38" if possible? Sure. But I wouldn't compromise on size to gain higher DPI.


I have a laptop with a 13" 3200x1800 monitor; not quite 4k, but higher resolution than my eyes' ability to discern fine detail. My other laptop is 1366x768. The other laptop is a lot better for a wide variety of reasons (and also an order of magnitude more expensive) but the display resolution is genuinely something I don't give a crap about. It isn't terrible and I don't hate it. There's plenty of stuff I don't like about the display; narrow viewing angle, poor contrast, prone to glare- but the resolution is fine.


Having used CRT monitors, 1920x1080 displays, 4K displays and 5K displays, as well as various Retina Macbooks over many years, mostly for coding, here's my opinion:

The only good solution today is the iMac 5K. Yes, 5K makes all the difference — it lets me comfortably fit three columns of code instead of two in my full-screen Emacs, and that's a huge improvement.

4K monitors are usable, but annoying, the scaling is just never right and fonts are blurry.

Built-in retina screens on macbooks are great, but they are small. And also, only two columns of code, not three.

One thing I noticed is that as I a) become older, b) work on progressively more complex software, I do need to hold more information on my screen(s). Those three columns of code? I often wish for four: ClojureScript code on the frontend, API event processing, domain code, database code. Being older does matter, too, because short-term memory becomes worse and it's better to have things on screen at the same time rather than switch contexts. I'm having hopes for 6K and 8K monitors, once they cost less than an arm and a leg.

So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.


> So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.

This opinion seems bizarre to me. You start by offering personal (and valid) anecdote, then end up saying "I don't think you can develop [...]". But this flies in the face of evidence. Most people by far do not use your preferred monitor setup (iMac 5K) and in my country a vast number of developers use 1366x768 to develop all sorts of high quality software.

It's one thing to say "as I grow older, I find I prefer $SETUP". No-one can argue with that, it's your opinion (and it might very well become mine as I grow... um, older than I already am!). It's an entirely different thing to claim, as you do here and I think TFA does in similar terms, "you cannot prefer lower tech setups", "you cannot develop software this way", "it's very difficult to develop software without $SETUP". The latter is demonstrably false! I've seen it done, again and again, by people who were masters at their craft.


I don't think they were doubting that somebody does develop in those random setups, they were disagreeing with the people that say it doesn't matter and you can code anywhere. In your quote, a royal you.


But they are not random setups. They are extremely common setups in my part of the world. People -- who are pretty good at what they do -- can and do develop using these tiny screens. In this regard, "it doesn't matter". Or taking less literally, they wouldn't complain if they got a better monitor, but it's not the primary concern for them. So taking a cue from TFA's title: "no, it's not time to upgrade your monitor".


Random as in any one of a variety of somewhat common setups, I didn't feel like typing a resolution. It backfired on me here.


Following your logic, I can't claim anything, because there is always someone, somewhere, who will come up with a contrary opinion.

I do respect your opinion, but I still hold to mine. I also think this discussion won't lead anywhere, because we are glossing over the terms. Not every "developer" is the same, not every "software" is of the same complexity. I can fix CSS in a single 80x25 terminal, I can't do that when thinking about changes in an ERP system.


Note I do not dispute your opinion. You're entitled to it and you know what works for you.

Regrettably, following (my) logic does mean you cannot say "you [the generic you] cannot develop like this", because this is indeed easily disproven. People can and do (and sometimes even prefer to). That's the problem with generalizing from your personal opinion ("I don't like this") to the general ("people cannot like this").


Yeah, the iMac 27" 5K is ideal. I got an LG 28" 4K just to interface with a MBP... it was a painful compromise, but standalone 5K monitors are just too expensive right now.

When we are talking pixel count, we have to talk about the size of the display too. A 28" 4K is acceptable; a 40"+ 4K is best used as a TV.

The best display I use right now is a 11" iPad Pro with a refresh rate at 120 Hz. You really can feel it, especially for stylus work.


Do you know of any standalone monitor that is close to the iMac 5K?


Atwood is a bit of a prima donna, and part of being a blogger is to make big statements.

One of my roles for a long time was speccing/procuring computer equipment (later overseeing the same) for a huge diverse organization. I took feedback, complaints from users and vendors, etc. People are passionate about this stuff... I was physically threatened once, and had people bake cookies and cakes multiple times as a thank you for different things.

The only monitor complaints I recall getting in quantity were: brightness, "I need 2", and "I need a tilt/swivel mount". Never heard about resolution, etc. Print and graphic artists would ask for color-calibrated displays. Nobody ever asked for 3, and when we started phasing in 1920x1080 displays, we literally had zero upgrade requests from the older panels.

Developers wanted 2 displays and SSDs.


You don't know you need a good display until you have used one. It's like glasses: you think you have perfect vision, and then you get glasses and it's night and day in comparison. You don't know a good display until you have seen one. The thing with high-res monitors is that you should upscale, or everything will look tiny.


I think it's a complicated topic.

IMO, there are two key criteria for monitors. Real estate and pixel density, and in some cases, you can't get both affordably.

I have had 15" laptops with 4K displays for some time now. I love the pixel density. But I can't do certain types of tasks on them because I end up getting real estate anxiety. I feel so constricted on a laptop screen, even when I add a second monitor.

My desktop has 2 x 27" 4K monitors running without scaling. So I have plenty of real estate, but the text could look nicer. Having said that, I don't miss sharp text in the same way that I miss real estate when using my laptops, at least from a productivity perspective.

I don't think a 5K screen is an answer for me, because my first urge would be to try to use it without scaling for more real estate.

On the other hand, a pair of 27" 8K screens (does such a beast even exist, and is it affordable?) would be ideal, because 1:1 scaling on such a beast is impossible at that monitor size, but 200% scaling would basically give me the same workspace real estate that I have now but with super sharp text.


A long time ago, when resolutions got better and we got more pixels, we made the characters smaller rather than just crisper. What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?


> What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?

Well that is a user setting, and I tend to think the first retina iPhone is what started the trend.

Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.

My 27" 4K screen is pretty much at my personal limit in terms of what I can handle at 1:1 scaling for my normal viewing distance. Had I gotten a 24" 4K screen, I'd have to set the text scaling to ~125% which would also translate into some lost screen real estate.
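For what it's worth, the PPI figures being compared in threads like this fall straight out of the Pythagorean theorem. A quick illustrative sketch (my own, not from the comment):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

print(f'27" 4K:        {ppi(3840, 2160, 27):.0f} PPI')    # ~163 PPI
print(f'24" 4K:        {ppi(3840, 2160, 24):.0f} PPI')    # ~184 PPI
print(f'15.6" 4K:      {ppi(3840, 2160, 15.6):.0f} PPI')  # ~282 PPI
```

The same panel resolution at a smaller diagonal yields a higher density, which is why a 24" 4K screen needs more scaling than a 27" one at the same viewing distance.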


> Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.

Scaling isn't even relevant. It makes sense to talk about scaling pixel artifacts, but not vector ones. The font is 8pt or 20pt; you can configure that on your own. 1:1 scaling is just an arbitrary, artificial fixed pixels-per-letter choice.

A 24" 4K screen (basically a 24" iMac) has text the same size as a pre-Retina 24" iMac; it just uses more pixels to render that text.


That's not the way Windows works though. Can't speak for Linux desktops.


Windows 8+ does; Windows 10 has no problem with resolution independence. Some legacy apps that run on Windows break, but not anything modern like Visual Studio, Chrome, etc.


Windows has issues with HiDPI once you have multiple screens with different densities, and it's not just with legacy apps.

In any case, Windows is just using a scaling feature. But when you scale, you are losing workspace in exchange for sharpness.

When 4 physical pixels translates to 1 logical pixel, you've actually lost 3 pixels of potential real estate on a screen size where your eyes can actually discern those pixels.

On a 15" 4K monitor, losing that real estate is not a big deal, because >90% of people can't easily read or see anything scaled at 1:1 on a screen that size. When you scale that screen at 200%, you're basically using a 1080p screen with sharper text. You don't gain any real estate at all from the higher resolution.

On a 40" 4K screen, it's a whole other story. The text may not be sharp, but you can have way more windows open on that screen, which makes it easier to multitask. It's like having a grid of 2x2 20" 1080p screens.

Addendum: my visual limit is 1080p @ 1:1 on a ~14" screen, which is why I am fine with no scaling on my 4K 27" screens.
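The physical-vs-logical pixel tradeoff described above can be expressed as a tiny (hypothetical) calculation: at scale factor s, a panel behaves like a desktop of (width/s) × (height/s) logical pixels:

```python
def logical_resolution(width_px, height_px, scale):
    # At scale factor `scale`, the OS lays out UI as if the screen had
    # this many logical pixels; higher scale = sharper text, less room.
    return round(width_px / scale), round(height_px / scale)

print(logical_resolution(3840, 2160, 2.0))   # (1920, 1080): 4K at 200% = a sharp 1080p workspace
print(logical_resolution(3840, 2160, 1.25))  # (3072, 1728): ~125% scaling
print(logical_resolution(3840, 2160, 1.0))   # (3840, 2160): full real estate, tiny UI on small panels
```

At 200% every logical pixel consumes four physical ones, which is exactly the "sharper 1080p, no extra real estate" situation the comment describes.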


I agree and I'm also a monitor prima-donna! That said, most people have not been tainted. :)


If you use an OS that handles high DPI very well, and where 98% of all apps handle it just as well (including resolution scaling), it is an absolute joy to my eyes to be able to use 4K at the same effective resolution (so basically 1080p, but double the resolution, double the sharpness).

Every time I use a 'non-retina' type of display (like an old laptop I use for testing or my 19" 1080p monitor), it feels like I'm looking through some dirty glass because of the blotchy effect.

I tried 4K on one Linux environment (and documented the experience[1]), and according to numerous responses, my situation was not unique: if you try 4K on any Linux environment and don't enjoy everything being tiny, it's not a fun time trying to get everything to behave like you can with Apple's built-in resolution scaling options.

[1] https://www.jeffgeerling.com/blog/2020/i-replaced-my-macbook...


I certainly agree that some like them, while others don't.

And some grow to like them - I used to scoff... before I tried it for myself.

I think as well, it depends on your eyesight, and also your workflows and apps. I use Rider (and previously Visual Studio), and these have a lot of panes/windows - it's really nice to be able to have the code front and centre, with debug output, logs and a command prompt in another screen, for example. Another example would be to have zoom/webex in one screen, while I'm taking markdown notes in another.

A good while back, I moved to a dual monitor setup at home (1x 3k in landscape, 1x 1200p in portrait), and a triple monitor setup at the office (1x 1080p in landscape, 2x 1200p in portrait). I also use a Dell screen manager, so I can easily setup regions that windows can snap to - for example, the portrait displays are usually split in 2 horizontally.

The triple monitor setup is admittedly gratuitous, but I'm never going back from a dual setup - it's just so convenient to have everything I need always there, instead of constantly flipping back and forth through the many windows I invariably have open. It feels like I'm context switching much less.


Oh, I do use an external 1080p monitor because I can't stand my work laptop's 1366x768 display, which usually gets relegated to email.


May I ask what specific laptop you're using? It's been a while since I've happened on that resolution myself.


No problem. Bear in mind I'm from Latin America, so we're usually several years behind the US (i.e. older tech, sold at excessive prices).

My work laptop is a Dell Latitude 7480. Its display resolution is 1366x768. I normally use it with an external monitor, because the screen is not only small but also low quality. This... fine... piece of hardware was bought by my current employer, I think 2 years ago.

In all workplaces I've seen, either you get a macbook (with their high quality display) or an entry level laptop from Dell/HP/Lenovo (or some no-name brand) with a low quality 1366x768 display, which is what passes for entry level where I live. In many companies, macbooks are usually given to people who can explain why they need them, or to reward good performance. However, newer startups seem to default to macbooks.


A few years ago it was almost impossible to find a laptop that wasn't 1366x768. I'm really glad to see the makers come to their senses.


> ... too opinionated for my liking.

Put yourself in the mindset of a typography geek. Then, by default, you will care about almost all of these things. I'm not saying you should care about all of these things all or most of the time, but that's the correct mindset to put you in sync with the author's conclusions (mostly -- I don't really think 120Hz is that big a deal).


120 Hz is a huge deal. It makes every mouse move better, and that alone is worth it. It reduces the pain of using poorly designed software that adds extra frames of latency to everything. It makes it easier to track and read moving/scrolling content, and it reduces judder when playing low frame rate content (e.g. 24 Hz Hollywood movies).


I'm surprised the author doesn't mention backlight strobing, which is another huge upgrade. Kind of hard to describe to people who have never seen it though, as is the case with most of these improvements.

The second and third monitors recommended in the article have 4ms latency, so there's essentially no reason to buy them.

There are good 4k 120hz monitors with strobing for way less than the $2000 mentioned. E.g. the zisworks kit for Samsung U28H750 (first available already in 2017): http://www.zisworks.com/shop.html


Fonts are optimized for the resolution they were designed for. Courier New is optimized for 800x600, and you can't really use it on anything beyond 1024x768.


> I don't really think 120Hz is that big a deal

IMO 120 Hz monitors are overrated for pretty much everything except for FPS games (and similar fast-paced interactive titles).

I have a 144 Hz display and I would not want to go back to a 60 Hz display for gaming. At some point, my monitor changed to a 60 Hz refresh rate without telling me after some driver update. The first time I played Overwatch after that happened I could immediately tell that something was wrong, since the game just didn't feel as responsive and smooth as it usually does.


Another OW player on HN? :) I feel the same about the 144Hz monitor I have: it's only really worth it for playing OW. The rest of the time it's nice, but I feel like I'm not getting enough out of it. Maybe my next monitor will be 4K, but I'm not sure for now.


After starting to use a 144 Hz monitor, the difference from 60 Hz is so clear even in desktop mouse movements that it's immediately obvious if the refresh rate changes.


I get where he's coming from, but then again, that's not me. The cheapest monitor he recommends costs as much as my entire setup would cost in its current, used state. I'm not gonna buy a single monitor for that price - and if I did, I wouldn't downscale it to the resolution [edit: real estate] I had before!


I've worked on a range of setups from dual screens, 120hz to little CRTs and laptop displays, and even a bit of graphing calculator and smartphone coding. My conclusion is that the difference depends mostly on the workflow, and what the workflow changes is mostly how much information you're scanning.

If you're scanning tons of text, all the time, across multiple windows, you need a lot of real estate, and in times of yore you would turn to printouts and spread them over your desk. A lot of modern dev goes in this direction because you're constantly looking up docs and the syntax is shaped towards "vertical with occasional wide lines and nesting" - an inefficient use of the screen area. Or you have multiple terminals running and want to see a lot of output from each. Or you have a mix of a graphical app, text views and docs. A second monitor absolutely does boost productivity for all these scenarios since it gives you more spatial spread and reduces access to a small headturn or a glance.

If you're writing dense code, with few external dependencies, you can have a single screen up and see pretty much everything you need to see. Embedded dev, short scripts and algorithms are more along these lines. Coding with only TTS(e.g. total blindness) reduces the amount you can scan to the rate of speech, and I believe that consideration should be favored in the interest of an inclusive coding environment. But I'm digressing a bit.

For a more objective ranking of concerns: pixel density ranks lower than refresh rates, brightness and color reproduction in my book. If the screen is lowish res with a decent pixel font that presents cleanly at a scale comfortable to the eye, and there's no "jank" in the interaction, it's lower stress overall than having smooth characters rendered in a choppy way with TN panel inverted colors, CRT flicker, or inappropriate scaling.

Book quality typography is mostly interesting for generating a complete aesthetic, while when doing computing tasks you are mostly concerned about the symbolic clarity, which for English text is reasonably achieved at the 9x14 monospace of CP437, and comfortably so if you double that. There's a reason why developers have voted en-masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.

And yet for some reason we have ended up with tools that blithely ignore the use-case and apply a generic text rendering method that supports all kinds of styling, which of course makes it slow.


> There's a reason why developers have voted en masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.

This is sad in itself. Proportional fonts are vastly superior to typewriter fonts for presenting code. The kerning makes for better readability, even if it comes at the expense of the ability to do ASCII art.


I use a 55" curved 4k TV as my monitor. It's not pixel density that's important for reading text, but contrast. The best thing for coding is a high contrast bitmapped font with no smoothing.

My monitor is still available at Walmart for $500. It's like an actual desktop when you can spread out all your stuff.


My personal requirement though is 4:4:4 color. Too many TVs will only accept 4:2:2 or 4:2:0 chroma, and colored text on black will look horrid.
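For anyone unfamiliar with the notation: the three numbers describe how many chroma samples survive per block of luma samples. A rough sketch of what each scheme carries per frame (the function is mine, purely illustrative):

```python
# Approximate samples per frame for common chroma subsampling schemes.
# Luma (brightness) is always full resolution; the two chroma (color)
# planes together are stored at full, half, or quarter resolution.
def samples_per_frame(width, height, scheme):
    chroma_fraction = {"4:4:4": 2.0, "4:2:2": 1.0, "4:2:0": 0.5}[scheme]
    return width * height + int(width * height * chroma_fraction)

full = samples_per_frame(3840, 2160, "4:4:4")  # 3 samples per pixel
tv = samples_per_frame(3840, 2160, "4:2:0")    # 1.5 samples per pixel
print(full / tv)  # 2.0 -- 4:2:0 halves the data, which smears colored text
```

Fine for movies, where color detail is soft anyway; terrible for one-pixel-wide colored glyph edges on black.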


Just stick to big brands. Even entry-level models support 4:4:4 UHD @60Hz.

The main issue with TVs is that HDMI sucks compared to DP. You'll need a half-decent GPU from 2015 or later for good HDMI 2.x support.
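A back-of-the-envelope check on why the HDMI version matters (these numbers count pixel data only and ignore blanking intervals and link-layer encoding, so real links need extra headroom on top):

```python
# Raw uncompressed video bandwidth in Gbit/s, pixel data only.
def raw_gbps(width, height, fps, bits_per_pixel=24):
    return width * height * fps * bits_per_pixel / 1e9

uhd60 = raw_gbps(3840, 2160, 60)
print(round(uhd60, 1))  # 11.9 -- already past HDMI 1.4's ~10.2 Gbit/s link,
                        # but comfortably inside HDMI 2.0's 18 Gbit/s
```

Which is why pre-HDMI-2.0 hardware can only do 4K at 30 Hz, or 60 Hz with chroma subsampled down to 4:2:0.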


Some of LG's cheap 4K TVs have RGBW panels that aren't suitable for reading text. You should check reviews and specs even if it comes from a big brand.


It's a Samsung TV and supports 4K at 4:4:4 @ 60 Hz, if I recall correctly.


> It's like an actual desktop when you can spread out all your stuff.

But you can't spread out your stuff any more than you'd do on a 24" 4k monitor...


> Suboptimal...No. Just no. The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.

I've realized that a stable environment does wonders for productivity. What you got used to (the stable part) is path-dependent and personality-specific, but it doesn't matter. What matters is to train yourself to sit down and work. If you're opening that laptop lid and thinking about visiting Hacker News first, shut it down, get up, walk away from the desk and then retry. At this point, I feel like doing a "productivity, productivity" chant / rant a la Steve Ballmer's "developers, developers" chant:

original: https://www.youtube.com/watch?v=Vhh_GeBPOhs

remix: https://www.youtube.com/watch?v=rRm0NDo1CiY


> nobody will actively dislike a 4K monitor

In addition to what thomaslord said [ https://news.ycombinator.com/item?id=23554382 ], 4K monitors tend to be widescreen. 1280x1024 user here; you'll take my aspect ratio when you pry it from my cold, dead hands.


I've also found that 16:10 is a better ratio than 16:9 when working with code.

I wish there were more 1920x1200 or 2560x1600 displays...

Lacking those options at reasonable prices leaves me with the option of using rotated 2K monitors, though 27" ones are uncomfortably tall and 24-25" ones are a little too high in pixel density.

Huawei's MateBook's 3K (3000x2000) display, with its 3:2 (1.5) ratio at almost 14", is an amazing sweet spot though. I wish they would make it in a 28" size for desktop use :)

In the end I've been working on 27" 2K and 5K iMacs for about a decade now and I see it as a good compromise, despite the aspect ratio, but I quite like my 30" HP ZR30w with its 2560x1600 resolution. Unfortunately its backlight is getting old and uneven :(


Speaking of 3:2 ratio, we have to mention the Microsoft Surface line of tablets, laptops and convertibles.

Definitely overpriced for a developer machine, but there is also the Surface Studio 2, with its 4500 x 3000, 28 inch, 192 DPI display, 192 being exactly 2x the "standard" 96 DPI.

https://en.wikipedia.org/wiki/Surface_Studio_2

https://en.wikipedia.org/wiki/Microsoft_Surface


Ah, and one solution to the font sharpness issue is to use bitmap fonts, indeed. I haven't found good ones, though, that are available at 16px or above sizes, which are necessary for 2K or 4K displays. Also, they are typically quite wide and don't have narrow options.

I found an interesting middle ground though. The old AmigaOS Topaz font as TTF :D http://www.geekometry.com/2013/11/amiga-topaz-truetype-font-...

Joking aside, the narrower variants of the Input font look pretty good with anti-aliasing: https://input.fontbureau.com/preview/


I have a 1920x1200 at home, 16:10 instead of the usual 16:9. I feel the same way, I won't be upgrading that monitor until it dies. That little bit of extra vertical space is so welcome.


What if I gave you more vertical pixels? Would that be okay? 1280x1440 maybe? Just a few more.


Nope, bad aspect ratio. Let me know if you find a 1920x1536, though. (1600x1280 would also be nice, and a 2560x2048 would be a 4K monitor that's actually useful, although thomaslord's objections still apply.)

Edit: actually, wouldn't 1440x1280 (aspect ratio 1.125, which is even less than my usual 1.25) be a better use of such an LCD panel, rather than an obviously useless 1280x1440 (with an aspect ratio less than one)?


Maybe. I did use 1280x1024 for a year or two back in the day. Mostly I have found vertical pixels to be more prized for reading, software dev, etc.


Why do you care about aspect ratio so much?


OCD


Yeah, pretty much. Also, standard video resolutions like 1280x720 fit the full width of the monitor with space above and below for toolbars and video-player UI. It's basically impossible to convey how frustrating it is to watch video on a widescreen monitor once you've tried a proper one for comparison. Any kind of fullscreen (i.e. maximized window) games or multimedia is similar.


I understand the bit about video players, but you're not going to want to play a game in anything except exclusive fullscreen, due to the massive input latency incurred when using windowed or borderless windowed.


Depends on the games you're playing, honestly. That "massive" latency is about 30ms from what I can find, which you're not really going to notice when playing a game like civilization or hearthstone.


We could still be programming at 640x480 like we did back in the early 90s. Thankfully, technology marched on and we went to 1024x768 and then even better resolutions. One would think that technology would march forward until we had ink on paper-resolution displays, because why not? But of course, we could still make do with 640x480 like we did way back when.

The bigger issue is that, as has happened over the last 40 years, a cutting-edge 2X display should eventually become the new 1X display, while the former 1Xs become obsolete. This makes building software a bit easier; even with resolution independence, it is difficult to support more than 2 generations of display technologies (e.g. right now we have 100 PPI, 150 PPI, 220 PPI, even 300+ PPI at the bleeding edge).
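Those PPI classes fall straight out of resolution and diagonal size; a quick sketch:

```python
import math

# Pixels per inch from resolution and diagonal screen size in inches.
def ppi(width_px, height_px, diagonal_in):
    return math.hypot(width_px, height_px) / diagonal_in

print(round(ppi(2560, 1440, 27)))  # 109 -- classic "1x" desktop density
print(round(ppi(3840, 2160, 27)))  # 163 -- the awkward in-between class
print(round(ppi(5120, 2880, 27)))  # 218 -- the "2x" retina class
```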


I get what you're saying. In general, though, I disagree with the notion that newer tech makes old tech automatically -- always -- not good enough. It's the mentality that makes a lot of people rush to buy the latest Kindle (or mobile phone, or whatever) when the one they own does everything they need.

More importantly, it's one thing to say "I prefer this new tech/resolution/gadget" and another to claim "how can you work like this?" (Where "this" can be "without a mac", "without a mechanical keyboard", "without a 4K monitor", "without three 4K monitors", etc).

Software is developed successfully without any of those. It's not only not an impediment, it's not like a sort of martyrdom either. So no, it's not time to upgrade that monitor.

Someone else in this thread commented that for many devs, their gadgets are a proxy for actual skill. It's easier to show you have a 5K monitor "like all hackers should" than to actually be a good developer.


I just like 200+ PPI displays (24" 4K is the same pixel density as 27" 5K) because I have to spend a lot of time reading text and like not seeing pixels and/or anti-aliasing artifacts. Once you get used to it, going back is like going back from an iPhone 4 to some Windows Mobile 6.1 phone.

It should also be cheap enough now, and at some point fabbing a 200+ PPI display will cost the same as an older one; then it won't make much sense to keep making the lower-res displays anymore (just as it doesn't make sense to fab 32MB DIMMs anymore).


Don't get me wrong, I agree when the tech gets better, with none or few of the downsides others mentioned, going forward is the only direction. I won't actively shop for an older tech once the current one gives up the ghost.

I just disagree that it's "time to upgrade your monitor", or that the current tech is bad or makes it hard to be productive.


After trying virtually every display setup imaginable over the years, I've settled on a single, large, high-resolution display as the optimal setup. While I have written a lot of code on small, low-resolution displays, there are definitely disadvantages to that.

I don't understand the value of multiple displays for code work. Just about everything that I could do with multiple displays can be more ergonomically achieved with multiple logical desktops. I can swap a desktop with a keystroke rather than pivoting, and that workflow translates well from laptops to large workstations.


I use an over-the-top number of monitors to... monitor. The purpose is to be able to look at dynamically updating information without touching my mouse or keyboard. I have found myself making an omelette while looking at a large array of displays in the distance, ready to panic-leap into action.

When I code I notice a few things about the multi-monitor setup. My cursor can travel 10 physical feet of screen space, and it's literally inefficient to point. The ergonomics are terrible. The entire setup basically feeds inattention and an ever-sprawling amount of crap being opened. More than that, I think having multiple large rectangles shooting light into your eyeballs kind of screws with your circadian rhythms.

When I code I use a laptop and enjoy how briefly my hands leave the keyboard before returning, because what little pointing I'm doing can be done extremely quickly thanks to the small size of my workspace. "Bigger is better" is oversold: a workspace should match the work. Too small is just more common than too big.


Yes, I find multiple monitors very engaging and also way too distracting.


>The best developers I've known wrote code with tiny laptops with poor 1366x768 displays.

I'm sorry to say this but that is not a real argument.


It's not meant as an argument, but as an illustration: people who I respect more than the author of TFA, and whose skill and accomplishments I've witnessed first hand, weren't eager to upgrade from tiny screens. It's not that they would've refused a better screen if given one, it's just that they didn't ask for it and it wasn't a pressing issue. Indeed, for them it was not "time to upgrade the monitor".

It taught me the valuable lesson that creating software, to them, was mostly about thinking, not about the gadgets you use to write it down.

Like another commenter said, it seems gadgets work as proxies for talent for some developers -- "you should use a mechanical keyboard and a 5K three-monitor setup, like real hackers do" -- because actual talent is harder to gain and demonstrate. A sort of cargo culting by accumulating tech gadgets.


Why are you talking about screen size when the article is talking about pixel density? You can have a lot of pixels but low pixel density because your screen is huge, while my first 200+ PPI display was on an iPhone 4, my watch has a 200+ PPI display as well. I don't think many would accept phones or watches with lower pixel densities these days, but somehow the display we use for work is different?

You are right that there is a cargo culting going on, but it is going on both sides. There are a lot of strawmen out there as well.


I'm talking about tiny screens and resolutions, so it's both. I don't see the strawmen.

Phones and watches are a different use case, so I'm not sure what you mean. I don't type code in either.


You are still conflating display size with pixel density, so I’m not sure what you are arguing against. You talked in another post about how you had to scale your displays because your OS didn’t support resolution independence; is that what you mean by tiny?


I'm really baffled by your comment. I never said that about scaling or my OS not supporting resolution independence. I think you're confusing me with someone else, which might explain the disconnect between your posts and mine. (It's understandable because my initial post seems to have gathered a ton of responses, so it's easy to get confused about who said what. Probably my most successful (?) comment in all my years here.)

> I’m not sure what you are arguing against

Indeed.


I have a hard time using three monitors effectively, so that in the end they are distracting for me. Probably you need some particular personality tics to make proper use of them.


My personal way of organizing 3 monitors:

1. Main screen. For the code editor, intense web browsing, etc.

2. Secondary screen. For debugging visuals (since I work on web stuff, it usually hosts a Chromium window; for a mobile dev, I imagine it would be an emulator/simulator), documentation referencing (with the code editor open on the main screen), etc.

3. Third screen. For all comms-related things: MS Teams/Outlook/Discord/etc.

I didn't mention terminal, because I prefer a quake-style sliding terminal. For a lot of devs, I imagine that having a terminal on their secondary screen permanently would work great as well.

P.S. Not that long ago, I realized that the physical positioning of monitors matters a lot (to me, at least) as well. I used to have 2 of them in landscape orientation side-by-side and one in portrait orientation to the side. It was fine, but didn't feel cohesive, and I definitely felt some unease. Finally got a tall mounting pole, and now I have the landscape-oriented monitors one on top of the other instead of side-by-side (with the rest of the setup being the same). That was a noticeable improvement to me, as it felt like things finally clicked perfectly in my head.


To clarify, my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup, otherwise you're doing something wrong. Of course I understand for many devs this setup works, in which case more power to them!

I just dislike being told unless I follow these fads I'm a subpar developer. I don't own a mechanical keyboard or a three monitor setup. I don't own a 4K monitor (I suppose I eventually will, when they become the norm). When Apple came up with retina displays, I didn't feel I had magically become unable to write code because my display at the time was 1440x900.


> my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup

It's weird to me to specify the number of monitors given how they come in a vast range of shapes and sizes.

For example, my dream setup used to be a single 55" curved 8K monitor. That's the rough equivalent of a 2x2 grid of 27" 4K monitors (I currently have two 27" 4K side by side in landscape @ 1:1 scaling).

The only problem with my so-called dream set up though is I don't think my computer glasses, which are sharp only for surfaces between a range of 21" to 27" would allow me to see everything sharp from corner to corner on that monitor, which sucks.


As a general rule I like to have a computer that is sort of the average crap that a person who does not care that much about computers might have around; then, if the stuff I make works on that, I know it's going to work on the upscale stuff as well.

Also then when one of my disastrous kids destroys it I don't feel bad.

on edit: fixed typo


I used to use this exact setup, but specifically eliminated monitor #3 as I felt it was counterproductive to have an "always on" comms monitor. These days my main monitor has one workspace, while my secondary has the normal secondary stuff in one workspace, and comms in another.

I found it to be less distracting, and the two screens are more physically manageable and easier to replicate if I change settings (cheaper too!). The only thing I will change is whether I'm in landscape/landscape or landscape/portrait. I can never make up my mind about what I prefer.


This is my exact layout too! Though the screen with the code editor is ultra wide, so with window tiling I have the editor and the terminal side by side


Personally I use web browser on the left (eg docs), editor in the middle and output of whatever I'm doing on the right.

Email hides behind the web browser and slack behind the output.


My laptop has a 4K screen, but I switch to an external 27" QHD (2560x1440) display when I'm at my desk and boy ... I used to like that external monitor. Now all I can see, all I notice, is how much worse and more pixelated text looks on it than when I'm reading on the laptop's screen. It practically looks blurry!

So yeah, nobody needs a 4K display. But I would think most people don't know what they're missing. (:


I think lots of fonts actually look worse on hidpi screens, especially on Windows, probably because these fonts were specifically designed for low resolution screens. And most applications use hard-coded fonts that you can't change. Text is maybe a little easier to read on hidpi screens, but it is less aesthetically pleasing.

Linux can be made to look good in that respect, but the situation for scaling legacy applications is pretty bad.


Yes, this. Even if you have only one hidpi screen, Windows still looks bad, because the fonts look terrible when rendered on a hidpi screen. This is one thing keeping me on "lodpi" screens for now.


Yup, this kind of stuff is very much based on preference. I think some people have tighter comfort deadbands and others are tolerant of anything. These can be the same person, on different days / differing amounts of sleep / different tasks / different lunches / different music.

i.e. I have a three monitor setup, two 1080 screens and a central 1440p screen. Mechanical cherry blue keyboard and a nice gaming mouse. Big desk. One of those anti-RSI boob mousemats. Nice headphones. Beefy computer. Ergonomic office chair. Everything at the right height. Quiet room. Next to an open window, but which is in shade.

Some days I will happily chug away writing reams of code.

Other days I will get nothing done and every minor annoyance will bring me out of focus; background sounds, glitches in my desktop environment, almost inconceivable hitches in computer performance, my chair feels wrong, the contrast of all text is too high, it's too dark in here, it's too bright, this is all bullshit, I'm so done with computers...

And on other days I'll get even more done, hunched like a Goblin in my living room armchair with my legs tucked beneath me, working through to the early morning, on my tiny 13 inch dell laptop, vaguely aware that I have searing pain* through one wrist, having not eaten for 12 hours, but completely and utterly content and in flow writing code or a paper.

Humans are crazy fickle.

There's no Platonic Ideals when it comes to Crushing It(tm) as a software developer.

I think learning the art of mindfulness and meditation does more to help you focus than expensive equipment fads and micro-optimizing your environment.

* Before anyone mentions, yeah I'm aware RSI is Not Good. I've got a bunch of various remedies for it. Ironically I've found that bouldering helps with this it the most. Just not able to do that during lockdown :(


Acclimation is a huge factor: if you’re used to writing on a small laptop, that will seem normal, but few people won’t see a benefit moving to a larger display, even if they don’t expect it. That’s one of the few durable research findings over the decades.

Multiple monitors are slightly different: the physical gap means it’s not a seamless switch and not every task benefits.


> Multiple monitors are slightly different: the physical gap means it’s not a seamless switch and not every task benefits.

Probably why ultrawides are gaining popularity.


I agree, and not. I like having many windows side-by-side (code, docs, thing I'm developing), but I also like having more vertical space. I keep one of my monitors vertical for coding, although that's a little too tall so there's some lost space at the top and bottom (9:12 would be better for me).

Really I want a single plus-shaped monitor that can act in several display modes:

- mimic 3 monitors. Left and right are 4:3 (ideal for single application), center is 9:12 and larger. Shape: -|-

- Single vertical monitor. Same as above but with left and right "virtual monitors" turned off for reduced eye strain. Shape: |

- Single ultrawide horizontal monitor (connect the left and right parts with the strip in the middle, turn off unused pixels at the top and bottom). Shape: ---


Vertical space is necessary; that's why I chose the Dell U3818DW[1]. It has 1600 pixels vertically.

[1]: https://www.dell.com/si/business/p/dell-u3818dw-monitor/pd


One advantage to ultrawides that I think goes understated is their ability to display large amounts of tabular data easily.


Yes — I notice this a lot while doing anything where I have an app and debugger / log windows open simultaneously. The extra width is an enormous win.


> Some people might like them, [some] won't.

By 'like them' are you talking about the full set of tradeoffs, or the raw visuals? Because I'd be very surprised if someone actually disliked how an all-else-equal 4K screen looks, compared to 1080.


Yes, the full set of tradeoffs. My comment is unclear. If someone gave me a 4K monitor for free I'd use it, but not having one is not an impediment for me (or most devs I know).


At least for me - I had a cheap 4K monitor. Managing the size of fonts/windows on it vs the connected laptop was a PITA. Not to mention the display/graphics card overloading and glitching once in a while trying to power everything.

So I am still stuck with my regular HD monitor + laptop monitor - which is pretty good.


Well for the hypothetical I'd just have all your screens double in resolution.


I started coding in Windows Visual Studio again recently after decades of absence, and the enormous length of their variables, function names, and function parameter lists, has me >120 columns again. All my non windows code is 80 (or 100 in very rare cases).

So now I need a wider monitor. I was on an NEC 6FG until 2011 and then went to a small Dell flat panel at 1280x1024. I suppose it is time to join the 2020s.

I have multiple VNC and RemoteDesktops up at any moment, so maybe I'll really splurge and go for one of those fancy curved ones.

This article is written in a style i can really appreciate. That's a lot of research.


X230 at 1366x768 checking in, can confirm, no problem at all.

I'm primarily a C programmer. 1366x768 limits my functions to about 35 lines at 100 columns wide. My font is Terminus, sized at 14.

I thank my X230 for forcing me into a paradigm where functions are short and to the point. I watch my co-workers flip their monitors vertically and write code that goes on for days. While it's not my style, I'm thankful I've been forced to make do with limited space, and it has trained me well.


I agree with you, 1080p on a 23-24" screen is great for programming. I have an Acer Predator X34P at home (34" ultrawide 1440p @ 120Hz), and for games it is AMAZING. But for programming? I prefer 2-3x 1080p at around 23-24".

In the last 3 months I went from 1 screen to 2 to 3 as I progressed through particular stages of development. When we 'go live' I'll easily be using all 3, but I don't _really_ need them, it's a nice to have.


Yes, you're completely right. I remember when the Atwood article came out, and I was convinced I needed multiple monitors to be productive (I was quite junior) - it even caused me stress when I worked at places that didn't have multiple monitors. These days I'm pretty happy with a single screen - including feeling very productive on just a laptop screen.

It turns out the whole idea of needing tons of desktop space was vastly exaggerated. You can't focus on 3 screens at once, it is impossible. Even on a large screen you can only look at one place at a time. Programmers are focused on one thing for a long period of time anyways - your code editor. Maybe a terminal as well that you need to swap to occasionally. Having multiple monitors has limited benefits.

Where multiple monitors is helpful is if you're an operator - you are watching many dashboards at once to observe if things break or change. Traders for example will have workstations with 9 screens at once. But this makes sense - they are not focusing on one thing at a time, they need to quickly look at multiple things for anything unusual.

I've personally found that using a tiling WM gives me vastly more benefits than multiple monitors ever would. Although I was also pretty happy using Gnome's super-tab/super functionality during my non-tiling days.

4K is sort of a nice to have, but hardly a requirement. I'm as happy on a normal DPI screen as I am on a retina screen. Our brains adapt very quickly and unless the text is overly blurry (because of poor hinting implementations), a regular DPI screen is perfectly fine for coding on. My own personal preference is high refresh rates over anything else - I can't stand latency.


I actually like monospace pixel fonts. Doubt it would make a difference. I have a (cheap) 4k display at home. Some things might look better, but I would recommend to use one with a high refresh rate and good contrast.

Since I used ebook readers with their displays, reading on a monitor is always a compromise in comparison.

Wouldn't want to go lower than 1080 though. It can work of course, but it just doesn't have to be that way.


We used to write great code on the original Mac’s 512 by 342 screen; that doesn’t mean we were very productive at it.

A 5K monitor allows you to view an enormous set of workspaces and code, and sharply. It will cost you less than $300 a year, which for any decent developer is less than 1/2 of 1% of your annual revenues (and it’s tax deductible for contractors!).

There is no way the increase on productivity doesn’t far outweigh that cost.
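Sketching the arithmetic behind those figures (the monitor price and service life below are my assumptions, not numbers from the parent comment):

```python
# Hypothetical amortization of a 5K display over its useful life.
price_usd = 1300      # assumed purchase price
service_years = 5     # assumed useful life
per_year = price_usd / service_years
print(per_year)       # 260.0 -- under the $300/year figure

# "$300 is less than 1/2 of 1% of annual revenue" implies revenue >= $60k:
implied_revenue = 300 / 0.005
print(implied_revenue)  # 60000.0
```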


What do you mean "we weren't very productive"? People were plenty productive with the original Mac.

I was plenty productive on PC XT clone with a tiny monochrome CRT monitor and an Hercules video card. (I wanted to say I was plenty productive with my C64, but that'd be stretching it :P )

What I want to fight is this notion that once tech N+1 arrives, then tech N must have been unproductive. That's demonstrably false, a form of magical thinking, and not a healthy mindset in my opinion.


But it's true. Developing modern code on an Apple II is unproductive. It's suboptimal, and worse. Maybe you don't want to use the word "unproductive," and that's fine. But larger screens, faster computers, and more expressive programming languages all contribute to increased productivity over the older alternatives.


> Developing modern code on an Apple II is unproductive

Are we talking about developing modern code on an Apple 2 now? If so I misunderstood the parent post, because they were using the past tense.

In any case, consider this: how do you feel about your current suboptimal, unproductive hardware & software of choice right now? Your current programming language and monitor resolution cannot be used to code. It's simply not possible to be productive with them.

Think I'm exaggerating? Ok. Fast forward a few years and let's talk about this again. Maybe you'll understand my frustration...


No, I mean, it can. It's just going to (probably) be less productive than what exists in the future. All I'm saying is that the productive/unproductive dichotomy isn't useful and we should instead talk about productivity as a continuum.


> All I'm saying is that the productive/unproductive dichotomy isn't useful

Then we agree! I was pushing back against the notion that one cannot be productive unless using a { 5K monitor; three-monitor setup; mechanical keyboard; $YOUR_FAVORITE_TECH_GADGET HERE }. In other words: it's not "time to upgrade your monitor" ;)


Productivity = work/time

It’s indisputable that modern screens, tools, etc make developers more productive than smaller screens, slower CPUs did.

I used to have 30-minute compile times. Was I “productive”? Sure, I guess, but 20-second compile times make me more productive.

And being able to see over one hundred lines of code side by side with documentation makes me even more productive.


I agree better tools have the potential to make us more productive. I also think some nerd/hacker types tend to overrate the importance of their tools and gadgets, because this stuff serves as a proxy for actual talent and it's easier to show off and compare (These are not my words, someone else said it elsewhere in this thread, but once I read it, it really clicked with me).

It's one thing to say "I'm more productive with $GADGET". It's another, different thing to say "we weren't productive before $GADGET". I assure you some of the old hackers, back in the days when they coded on a PDP-11 (or an Apple II, or whatever), were more productive than me and -- without knowing you -- I'm willing to bet they could be more productive than you. In my opinion, the mindset that old tech wasn't "good enough", which leads people to claim "we weren't productive before" in absolute terms, is really pernicious and unhealthy.

Mind you, I'm not saying all gadgets are equal. Faster CPUs to me are obviously more useful than a larger screen. Both are nice, but one is more important than the other.

However, the most important peripheral, the one that makes the most impact on productivity: my brain. And I can't show it off as easily as a brand new macbook or whatever.


> It will cost you less than $300 a year

Don't forget the cost of buying a new computer that can drive that display.

None of our laptops even have DisplayPort outputs let alone sufficient onboard graphics for that resolution.


The cost of buying a new high end laptop is trivial in relation to your work value.

I used to run a dev group of 40 people. Every developer got fastest possible pc with big screen monitor back when both were expensive, and their own office with a door. Added about 1% to the cost of each developer, but the benefit to productivity was at least 10-20 times that.

And Not even counting benefits of reduced turnover.


An Intel iGPU can run dual 4K displays just fine, if it has dual channel RAM.

I'm still upset at Lenovo thinking 16 GB should come in one RAM stick.

If you have a Thunderbolt USB-C port then you have a DisplayPort. Some monitors accept it directly and will act as a charger and docking station. Others need a fairly inexpensive USB-C to DP adapter.


All Macs drive it.


I too use a single 1080p monitor. I find that, for me, the benefits of multiple monitors aren't really there.

I tried it for a while but I find virtual desktops cover my multi-tasking needs just fine.

When I had multiple displays the context switching involved in moving my head and focusing on the other displays wasn't great.

I guess if I was a designer it would make more sense but for coding what I have now works fine.


For me it all comes down to display size and density, and therefore out of all the monitors I've tried, I'd go 27" at either 5k or 2.5k resolution.

For me 27" is the perfect display size:

- Any bigger and I find I have to move my head too much to be comfortable in the long term, and I get a stiff neck.

- Any smaller and I can't fit enough on the screen at once and my productivity decreases.

2.5k (2560x1440) on a 27" monitor has pixels at a good size that most apps scale beautifully on by default.

5k is exactly double the density of the already-perfect 2.5k, so (at least on macOS) everything displays exactly as it does at 2.5k, except really crisp!

Therefore, I now run the 27" LG 5k as my primary screen, and my old 27" 2.5k vertically on the side for all those status-like things that are good to see but don't require my focus. Also, since the physical size and "looks like" resolution are the same, I can drag windows between them without any change in size/scale. It just works!
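The density claim here checks out with a little arithmetic (a quick sketch; 27" is the nominal diagonal):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a display of the given resolution and diagonal."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

# 27" at 2560x1440 ("2.5k") vs 27" at 5120x2880 ("5k")
print(round(ppi(2560, 1440, 27)))  # 109 ppi
print(round(ppi(5120, 2880, 27)))  # 218 ppi, exactly double
```

Since 5120x2880 is exactly 2x the linear resolution at the same diagonal, macOS can map one "2.5k point" to a 2x2 pixel block with no fractional scaling.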


> Any smaller and I can't fit enough on the screen at once and my productivity decreases.

But that's nothing to do with physical size and everything to do with resolution.


I don’t mean to pick on you personally, but what a stupid argument that is.

How is more pixels and more legible text not strictly better than lower resolution monitors?

It’s like preferring 10 DPI mice to modern 1000+ DPI ones, saying that PCIe x4 was better than x8, or actively disliking a modern DSL connection compared to 56k dialup.

Yours is just an idiotic stance to feel superior than everybody else.


I also don’t mean to pick on you personally, but do you think that 8x as stupid is better than 4x as stupid?

Not everything is a numbers game.


> The technical details are all right (or seem right to me, anyway), but this is too opinionated for my liking.

The author, Nikita, mentioned exactly that point, midway through the article under the heading "Get a monitor":

   Let me express an opinion. This is my blog, after all. 
So yeah, an opinionated piece, no big deal. And I happen to agree with him.


I built the majority of the software for my first startup on a tiny under-powered netbook, since I couldn't afford a more expensive laptop. Today I have a very wide monitor for work. Agree that the display size doesn't seem to matter that much, but you do have to learn how to take advantage of features like multiple desktops, etc.


My personal setup:

1. Top-tier MacBook Pro
2. Portable MacBook (Air)

I used to have #1 with two large-ish monitors and I hated it. Two monitors are effectively a single 2xW monitor and that means moving your head from side to side constantly. Having them X inches above your keyboard means moving your head up and down (I touch type but need to look at the keyboard occasionally).

A laptop means a hi-res screen and a keyboard in close proximity. Instead of moving your head around you can use Spaces.

The second laptop can act as a second screen if you need it — as well as a coffee shop runner and a backup if your main machine is down or getting serviced.

It’s the same cost or less than a powerful laptop and two external monitors, assuming you are buying hi-res.

I’ll never go back to a 2-3 monitor setup.


Another conclusion I disagree with is the author's insistence on only using integer scaling multipliers. The article talks about subpixel rendering and then mostly ignores it for the rest of the piece, instead assuming pixels are all small black squares (which they aren't; "pixel density" isn't a particularly meaningful term when you're dealing with clarity on modern screens). IIRC, macOS renders text into a higher-resolution internal framebuffer and scales down anyway; in practice I don't notice any difference in clarity on my MacBook screen whether running at an integer or non-integer scale of the native resolution.


Funny you said that about Jeff. I read his article just recently and also happened to use three monitors in my office for a while.

I started developing neck pain, so I had to go back to one.

At least buying a 4k/120hz monitor is not painful to your body, only your wallet.


> The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.

Every time I read something like this I cringe. Ergonomics are terrible with laptops. There are no OSHA-approved laptops.

Those developers should sit in a good chair, with a properly positioned keyboard, monitor, and body, and head off the problems their bodies will encounter (or have encountered but ignored).


Had this debate when we bought monitors for the firm. Some in management said high DPI, good color, etc. are not worth the money (which you clarify is what you really mean). But after splurging, the effect on productivity was measurable. If you pay someone to stare at a box for 8h, a nicer box pays for itself pretty quickly.

Personally I don’t mind a small UI, so I currently just run macOS at native 3840 × 2160 on two 24” LGs. With size 11 Fira this gives me 8 80-char columns.
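That column count implies a pretty tight glyph cell, which you can sanity-check (a back-of-the-envelope sketch, assuming the 8 columns of 80 chars are split evenly across both 3840px-wide panels):

```python
# 4 columns of 80 characters per 3840px panel
chars_per_panel = 4 * 80
advance_px = 3840 / chars_per_panel
print(advance_px)  # 12.0 px per character cell
```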


> The technical details are all right (or seem right to me, anyway)

No, not universally. If I follow the advice and disable font smoothing, it looks worse than any font display I've ever used, including 8 bit home computers. https://i.imgur.com/Y0xkhTb.png


My mastery as a software engineer doesn't stop at my tools.

I look at my screen most of the time. I'm not buying cheap screens anymore. It is just not worth it.


This is a perfectly fine point of view.

I just refuse to be told that I must use a { three-monitor setup, mechanical keyboard, 4K monitor, macbook, $LATEST_FAD } or otherwise I'm using a subpar development environment.

I still remember when Full HD monitors were all the rage. Nobody considered them cheap tools back then, and somehow code got written and developers were happy. There will come a time when someone on the internet will tell you that a 4K monitor is a cheap tool and that nobody can properly code using one.


I have a 2560x1440 monitor at home and 2x 4K monitors at work. I like my home setup better because the tools I spend most time with (VS Code and Xcode) lock all their panels to one window, making the screen real estate more important than the pixel density. I still have not found a way to get a 4K display to run at anything other than "retina" 1080p without it getting too small, blurry, or laggy.


I believe that like the internet (insert ironic emoji here), Al Gore started the 3 monitor trend: https://macdailynews.com/2007/05/20/al_gores_three_30_inch_a...


Agreed. I was supplied two 4Ks at my last job and just ended up using them so zoomed in that it wasn't even like I was using 4k monitors anymore.

My ideal setup is two 1440p monitors, but even then I tend to zoom in quite a bit.

I also had a coworker who was fine with their laptop screen. People who are finicky with these kinds of things seem to have their focus there rather than on the actual work done.


Yep - I care a lot about my monitor and spared no expense but got a 1440p 60Hz because I care more about color.


I write code on 1366x768 and 1080p and see no difference. 1080p is slightly bigger, but that's all.


Aside from the added screen real estate, I find my external 1080p LG monitor simply looks better than my Dell Latitude's. The laptop display just looks... cheaper. It's hard to explain, it just looks worse.


Well, maybe it's because it's worse :D


Definitely. To be honest I've never seen a quality 1366x768 laptop screen, because the low res [1] usually goes hand in hand with cheap laptops.

[1] it's hilarious that 1366x768 is called "HD", because marketing. It wasn't high definition even on the day it was introduced.


The whole "HD ready" thing was an utterly sad part of the story of cinema/television that, let's hope, will never repeat.

As for 1366x768, for some reason the vast majority of monitors stopped there for many, many years, with "Full HD" coming to the masses (but not all) only a few years ago.

"1366x768" meant "half-baked HD" from the day it was introduced, so it's understandable that few quality monitors (and especially laptop screens) were made with that resolution.


The author is biased because of ligatures. These special glyphs are a little denser than normal text, e.g. the triple-line equals, and need a higher-res display to look good. I personally dislike ligatures; I have no problem chunking two characters together in my head.


> e.g the triple line equals and need a higher res display to look good.

Do you mean '≡' (U+2261 identical to)? Because no, it very much doesn't; in fact I'm not sure I've ever seen a font where it looked bad short of vertical misalignment with surrounding text.


It looked bad in the article.


Okay, tracked down this[0], which is probably what you were talking about(?), and what the actual fuck is wrong with that rendering engine? You render an Apx horizontal line, Bpx of space, a second Apx line, Bpx of space, and the third Apx line, where A=B=1. Maybe you use 2px of space. Maybe, if you're rendering obnoxiously huge, you use a 2px line and however many pixels of space. You don't fucking use real-valued vector coordinates and then blindly snap them to integers, or smear bits of gray anti-aliasing around the screen like some kind of monkey with a hand full of its own feces.

If OSX is actually doing stupid shit like that, it's no wonder sane-resolution monitors look like crap. Stupid-absurdres monitors will look just as bad, because the problem is the operating system driving them, not the monitor.

0: https://tonsky.me/blog/monitors/hinting_failures.png
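The failure mode being complained about is easy to reproduce in miniature. This is a toy model of naive grid snapping, not how any particular rasterizer actually works; the positions and spacing are illustrative numbers only:

```python
# Three strokes of a '≡'-like glyph laid out at fractional "vector"
# y-coordinates, then naively rounded to integer pixel rows.
def snap_strokes(top: float, spacing: float, count: int = 3):
    """Round each stroke's ideal y-position to the nearest pixel row."""
    return [round(top + i * spacing) for i in range(count)]

# Fractional layout: strokes every 1.4px starting at y=10.3
print(snap_strokes(10.3, 1.4))  # [10, 12, 13] -> gaps of 2px then 1px, visibly uneven
# Grid-aware layout: whole-pixel start and spacing
print(snap_strokes(10.0, 2.0))  # [10, 12, 14] -> even gaps
```

The uneven gaps (or, with anti-aliasing instead of rounding, strokes smeared into gray) are exactly what the linked screenshot shows; proper hinting adjusts the outline to the pixel grid before rasterizing instead of snapping afterwards.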


> The best developers I've known wrote code with tiny laptops with poor 1366x768 displays.

And the first operating systems and Unix core utils were most likely written using much simpler tools than what developers have today at their disposal.


I'd take a 2560x1440 monitor (or 2 or 3!) over the 1080p displays the author seems to prefer, even if those 1080p displays are actually 4k pixel-wise...

Being able to see more things >>>>>>> sharper text, once you're past 100dpi or so at least.


> No. Just no. The best developers I've known wrote code with tiny laptops […] it's no big deal.

I coded on a MacBook Air 11" on a daily basis for a year. I thought it was no big deal too, until I started developing myopia.


Are you talking about physical screen size, DPI or what? I've no idea what display a MacBook Air 11" has.

Did your eye doctor tell you your myopia developed because of staring at your laptop's screen? (Mine developed when I was a teenager and hadn't yet spent a lifetime of staring at CRT monitors, and my eye doctor told me it was likely unrelated to computer screens. But maybe medicine has changed its opinion, this was decades ago).


I love the old CRT text mode. I don't need finely anti-aliased fonts for my text; I find those fine fonts are too much for me. What's wrong with some imperfect edges?


You as a coder also don't normally need a 120 Hz display like the OP, only to be limited to the three(!) models currently available on the market. It doesn't make sense. But hey, economics, whatever...


Amazing how many militant replies you are getting. I don’t care for 4k & 5k displays. I find them more tiring to work on all day somehow.

But based on the replies, I am wrong and probably retarded ;)


In my experience 1366x768 is tight, but 1440x900 is already quite good


Yes, you can code well on tiny laptops with poor 1366x768 displays, but double or triple setups are more efficient.

It's more about maximising productivity, in my opinion.


Oh, no argument from me. I said I knew plenty of people I admire, and who are way better than me professionally speaking, who can code with 1366x768 displays and dismiss the urgency of getting a better screen. In this sense I meant that it's not time to upgrade your monitor: it's simply not that urgent to own the highest-DPI screen you can buy.

Me? I can certainly code with a poor quality 1366x768 display (doing it right now with this Dell Inspiron!) but I hate it. I'm not a 4K snob, but I do like better screens!


You'll dislike a HiDPI display in a bunch of Linux distros; 100% assurance on that.


A 1440p, 75 Hz display is perfectly acceptable. If you game, the quality difference between 1440p and 4k is pretty much indistinguishable, but 1440p is much less performance-intensive to render.

I upgraded from a 32" 1080p display to a 32qk500. A significant improvement in visual quality for a reasonable price (~USD$270)
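The "less performance intensive" claim is just pixel counting (a back-of-the-envelope sketch: GPU fill cost roughly scales with pixels pushed per frame, everything else held equal):

```python
# Pixels per frame at each resolution
px_4k   = 3840 * 2160  # 8,294,400
px_1440 = 2560 * 1440  # 3,686,400
print(px_4k / px_1440)  # 2.25 -> 4k pushes 2.25x the pixels of 1440p
```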

