The technical details are all right (or seem right to me, anyway), but this is too opinionated for my liking. No, you do not need a 4K monitor for software development. Some people might like them, some won't. [edit/clarification: someone rightfully pointed out that nobody will actively dislike a 4K monitor. I was unclear here: I meant "some people won't need them" more than "dislike them"]
This sounds like when Jeff Atwood started that fad that if you didn't have three external monitors (yes, three) then your setup was suboptimal and you should be ashamed of yourself.
No. Just no. The best developers I've known wrote code on tiny laptops with poor 1366x768 displays. They didn't think it was an impediment. Now I'm typing this on one of these displays, and it's terrible and I hate it (I usually use an external 1080p monitor), but it's also no big deal.
A 1080p monitor is enough for me. I don't need a 4K monitor. I like how it renders the font. We can argue all day about clear font rendering techniques and whatnot, but if it looks good enough for me and many others, why bother?
Hello! Person who actively dislikes 4k here. In my experience:
1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display)
2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density. I also have 20/20 vision as of the last time I was tested.
My argument in favor of 1080p is that I find text to just be... completely readable. At various sizes, in various fonts, whatever syntax highlighting colors you want to use. Can you see the pixels in the font on my 24" 1080p monitor if you put your face 3" from the screen? Absolutely. Do I notice them day to day? Absolutely not.
I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode
You lost me right here on line 1.
If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
Things like this are exactly why I left Linux for MacOS. I absolutely get why you might want to stick with Linux, but this is a Linux + HighDPI issue (maybe a Windows + highDPI issue also), not a general case.
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required.
You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.
> You could say the same for any arbitrary DPI; 96dpi isn't "Required", we got by fine with 72dpi. It's all about ergonomics as far as I'm concerned.
I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
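A quick back-of-the-envelope check (rough numbers, so take them as such): 20/20 vision is usually quoted as about one arcminute of resolution, which at a metre works out to roughly 0.29 mm. A 27" 1080p panel has a pixel pitch of about 0.31 mm, so at that distance you really are right at the threshold; sit at a more typical 60-70 cm and the threshold drops to roughly 0.18-0.20 mm, and the pixels become resolvable again.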
>I think the point the parent is making is that human vision has limited resolution. I.e. for a given screen size & distance from the screen, you cannot notice any difference in DPI past a point. The parent is suggesting that 1080p & 27" with a typical viewing distance is already higher resolution than the eye can resolve. Looking at my 1080p 27" screen from a metre away with 20/20 vision I am inclined to agree!
Are you sure you have 20/20 vision? I can absolutely resolve individual pixels with zero effort whatsoever on 1080p 27-inch displays.
Back when I had a 27-inch 1080p display at work, my MacBook's 13-inch Retina Display effectively became my main monitor. The 27-inch monitor was relegated to displaying documentation and secondary content, because I found its low resolution totally eye-straining.
Edit: I might have found it so eye straining because MacOS does not support sub pixel rendering. That means a lot of people will need a 4K or Retina monitor to have a comfortable viewing experience on the Mac.
MacOS does support subpixel rendering, and has at least since the early-to-mid 2000s. One or two versions back, though, they turned it off by default since it isn't necessary on HiDPI "Retina" displays and they only ship HiDPI displays now.
You can still turn it on although it requires the command line.
Subpixel rendering dramatically slows down rendering text. When you have a high res screen, and want everything to be 120fps, even text rendering starts to be a bottleneck.
That combined with the fairly massive software complexity of subpixel rendering is probably why mac dropped it.
Been a while since my eyesight tested, but I think so! I can see pixels if I focus but not when reading text at any speed. I have also checked and my display is only 24" (could've sworn it was more!) so maybe that's why. I retract my comment :)
> I think the point the parent is making is that human vision has limited resolution.
If you can't see the difference between 4k and 1080p on a 24" monitor, then you probably need reading glasses. On a 27" monitor it's even worse. It's not so much that you can "see" the pixels; subpixel rendering and anti-aliasing go a long way toward making the actual blocky pixels go away. The difference is crisp letters versus blurry ones.
Yes, I can see the difference, but I (personally) don't notice that difference while reading. I do notice a big difference when using older monitors with lower DPI compared with 1080p on a normal-sized desk monitor, however.
> I'm on Windows and can confirm 1. is an issue. Windows lets users scale UIs to be readable at high DPI on small screens. Doesn't work on all UIs (e.g. QGIS). So maybe not all OS's, but two important ones.
Haven't seen any scaling issues on Windows in years. Last time was Inkscape but they fixed that.
I see these issues all the time, with enterprise desktop apps. The scaling is only really a problem because it is enabled by default when you plug in certain displays. If the user made a conscious choice (which they would easily remember if they had trouble), it would be fine.
For many, many years there were at the very most 120 dpi monitors, with almost all being 96, and I imagine a lot of enterprise applications have those two values (maybe 72 as well) hard-coded and don't behave properly with anything else.
I'm currently working from home, accessing my Windows 10 desktop machine in the office via Microsoft's own Remote Desktop over a VPN connection. This works fine on my old 1920x1280 17" laptop, but connecting from my new 4k 15" laptop runs into quite a few edge cases, and plugging in an external non-4k monitor has led to at least two unworkable situations.
I've now reverted to RDP-ing from my old laptop, and using the newer one for video calls, scrum boards, Spotify and other stuff that doesn't require a VPN connection or access to my dev machine. It mostly works OK in that configuration.
I've seen other weird things happen when using other Terminal Services clients, though.
> Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering - with fonts being a blurry mess. You can only really use MacOS with high DPI monitors now for all day working. It’s a huge problem for everyone I know who wants to plug their MacBook into a normal DPI display. Not that the subpixel/hinting was ever that good - Linux has always had much better font rendering in my opinion across a wider range of displays.
> Low DPI monitors are pretty much unusable since MacOS dropped subpixel rendering
Nonsense, fonts look fine on non-Retina monitors; they were fine on my old 24" 1920x1200 monitor and are fine on my new 27" 2560x1440 one. Can I see a difference if I drag a window from the external monitor to the built-in Retina display? Yes, but text is not blurry at all on the external monitor.
If it matters, "Use font smoothing when available" is checked in System Preferences (which only appears to have an effect on the Retina display, not the monitor).
That's been my experience, too. I prefer high-DPI monitors, but back when I was going into the office (remember going into the office?) and connecting my MacBook to a 1920x1200 monitor, text was perfectly readable. I suppose if I had two low-DPI Macs, one running Catalina and one running, I don't know, High Sierra, I might be able to tell the difference at smaller font sizes.
As an aside, I wonder whether the article's explanation of how font hinting works -- I confess for all these years I didn't know the point of "hinting" was to make sure that fonts lined up with a rasterized grid! -- explains why I always found fonts to look a little worse on Windows than MacOS. Not less legible -- arguably hinted fonts are less "fuzzy" than non-hinted fonts on lower-resolution screens, which (I presume) is what people who prefer hinted fonts prefer about them -- but just a little off at smaller sizes. The answer is because they literally are off at smaller sizes.
These things are fairly subjective. But it’s hard to argue that Catalina has good font rendering on regular DPI screens. I dealt with it when I had to, but it was very poor. There are also tons of bugs around it. Like the chroma issue - Apple doesn’t support EDID correctly so fonts look even more terrible on some screens. A google search will confirm these problems.
This is an interesting position. I have always thought that fonts and font rendering were an especially pernicious issue with Linux and a relative joy on MacOS?
I think that is a historical artifact. Ubuntu had a set of patches for freetype called infinality, developed around mid 2010, which dramatically improved font rendering. Since then, most of those improvements have been adopted and improved in upstream. [1] Any relatively modern Linux desktop should have very good font rendering.
As with most things Apple, it is a joy as long as you restrict yourself to only plugging the device into official Apple peripherals, preferably ones that are available to buy right now. It’s when you start hooking your Mac up to old hardware or random commodity hardware that the problems surface.
I recently started using Linux some on the same 4K monitor I usually have my Mac connected to. I was shocked at how much sharper and easier to read the text was on Linux.
I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps. I'm also surprised when my wine programs scale properly as well.
What does not work perfectly is mixing hidpi and lowdpi screens. On wayland with wayland compatible apps it works fine, but on X11 or with xwayland apps like electron it will not scale properly when you move the window to the other screen; it will scale to one screen and be wrong when moved over. Overall I don't find this to be too much of an issue, and when chrome gets proper wayland support the problem will be 99% solved.
I can confirm this anecdata. Single and dual 27" 4k is fine, but mixing with a 27" 1440p is messy (tried with GNOME and KDE on Manjaro during early Corona home office).
> I have been using a 4k monitor and 2 1080p monitors on linux for a while now. The current state of things is that hidpi works correctly on everything I have run including proprietary apps.
It's good to hear things aren't as bad as some have suggested.
I think it was bad, since when I look through the issue trackers a lot of hidpi bugs were closed less than a year ago, but I have not really noticed much other than what I noted about multiple monitors.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do). Also, I don't have any trouble with plugging my highDPI MacBook into a crappy 1080p display at work.
PyCharm had high CPU consumption issues with a Macbook connected to a 4k display running in a scaled resolution. Native 4k was fine, using the default resolution was fine but "more space" made it use tons of CPU since it had to rescale text and UI elements on the CPU.
I think all the JetBrains tools did. I remember a few years ago there was some big switchover (High Sierra maybe?) and the JetBrains tool fonts were janky for a while - something to do with the bundled JVM and font rendering. I think it's sorted now, but there was an 'issue' there for a while. Maybe it's still an issue in some configurations?
It's why I actively avoid monitors with small pixels. My trusty old Dell U3011 and the two rotated 1600x1200's flanking it suit me just fine.
I've no inclination to change my OS, either, just for the sake of fonts.
Kudos to those commenters still on CRT displays. One complaint I have with LCD's is reset lag time, which can make it tricky to catch early BIOS messages.
Sure. The fastest I've seen lately is that ARMv7-A board submitted the other day which boots in 0.37 seconds, or with networking, 2.2 seconds. That's time to user land, and to achieve it took highly specialized firmware and a stripped down kernel compiled with unusual options. I've yet to see a PC come anywhere close to that, and personally I won't consider the boot problem solved until a cold one completes quicker than I can turn on a lightbulb.
In fact historically, some of the higher-end hardware yielding the best performance during operation (an axis along which I optimize) actually added time to the boot sequence. The storage subsystem on my workstation is backed by a mix of four Intel enterprise-grade SSD's in RAID-0 (raw speed) and 8 big spinning platters in RAID-6 (capacity), plugged into an Areca 1882ix RAID card w/ 4GB dedicated BBU cache. Unfortunately that card adds a non-bypassable 30 seconds to the boot sequence, no matter what system you plug it into. But once there, it screams. It's only just the last couple years PCI-NVMe drives have come out that can match (or finally beat) the performance metrics I've been hitting for ages.
So I actually kind of feel like I've been living in the future, and the rest of the world just caught up ;-).
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
Same. I haven’t run into any apps that don’t support high dpi mode. Even terminal apps look great on my Retina 4k iMac screen.
Before getting this machine nearly a year ago, I couldn't natively view high dpi graphics for web projects I’d work on, which was a problem since there are billions of high dpi devices out there.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them as a developer (or doing photo editing, video editing, plus whatever other hobbies I do).
The version of pgAdmin not based on electron had pixelated fonts on a 5k iMac. I haven't checked recently.
> If there are apps on MacOS that can't handle high dpi mode, I haven't run into them
Audacity has extremely low framerates and chugging when interacting with the waveform on retina screens. Even running in low DPI mode doesn't fix it. Only runs nice on non-retina displays.
Agreed. I have a retina mbp that I use as the 3rd screen plugged into a 4K monitor over usbc and a 1920x1200 over dp. Everything works fine. Windows stay in the right place when plugging/unplugging, etc... My eyes also thank me every time I look at the 4K text on my main monitor. I’m debating buying another one and dumping the 1920 monitor.
The post I quoted was: "No matter what operating system you're on". I quoted that particular bit for a reason.
The ergonomics of better screen resolution don't change just because your OS isn't good at dealing with high resolution.
When you pick an operating system (if you have the choice), there are a lot of factors; it's good to know what's important to you and choose appropriately.
That was a direct reply to the “No matter what operating system you're on” quote. The fact that there are some operating systems that have issues doesn’t make it true that all do.
FWIW, my main monitor is a 43" 4k display, and it works perfectly fine on AwesomeWM - but I don't use any scaling; the 4k is purely for more screen real estate - literally like having 4 perfectly aligned borderless 24" monitors. I can fit 10 full A4 pages of text simultaneously.
I recently upgraded to a 43” 4k monitor and use it the way you describe. I’m not sure I am happy with it. The real estate is nice but it might be too much. UI elements end up very far away. I rarely need all that space.
I either need a bigger (deeper) desk to sit back farther or just a smaller monitor physically with the same resolution.
What I found helped when I moved to a large 4k screen is when I stopped trying to eke out every last bit of space and started using my desktop as an actual desktop analogue again. Whereas I used to full screen, and snap to half or quarters of the screen, now I have a few core apps open that take up roughly a quarter, usually a browser and email, and other apps I open up and move around to organize as I feel the particular task warrants (generally some terminals that are all fully visible). I often drag the current thing I'm working on to the bottom half of the screen so it's slightly closer and easier to see directly, and leave reference items or stuff I'm planning on revisiting shortly up top.
I also was thinking I didn't particularly like the large desktop and screen at first, but now that I treat it as a combined wallboard and desk space, I can't imagine going back (and using the large but not quite 4k monitor at my desk at the office always felt like a step backwards).
I do set default scaling for Firefox and Thunderbird to be about 125% of normal though, as I don't like squinting at small text. I generally like how small all the other OS widgets are though, so I don't scale the whole desktop.
Thanks for the input. I’m about a week in and that is the realization I am reaching. My window sizes are almost back to where they were before.
It has been an iterative process but I’m getting a handle on it.
I have a sit/stand desk, I’m considering moving my recliner in to the office and elevating the monitor. Working from a reclined position seems ideal but I also thought a 43” monitor was a good idea...
If you're on Windows, consider the fancyzones power toy. Break up your screens into an arbitrary set of maximizable zones. Highly recommend if the normal 4 corners isn't enough for you.
I'm on MacOS. I basically never maximize anything. I just resize the floating windows and put them where they make sense at the time. The dream would be doing this with the keyboard using something like i3. I have been tinkering with that on weekends but M-F I focus on paying work.
Check out Moom. You can save window layouts and replay, save window positions and apply them to any app, arbitrarily move, maximize to a range with a customizable margin, etc etc.
https://manytricks.com/moom/
Depending on how exact of a similarity you're looking for, gTile for GNOME/Cinnamon might be of interest to you. I've also found PaperWM to be very productive.
I use gTile for gnome. I had to play around to find a setup that I like. I eventually settled on ignoring most of the features that were offered out of box. Now I have configured a few simple keyboard shortcuts.
For example, Super + Up Arrow will move a window into the central third section of the display. Pressing it again will expand the window a little bit.
Nice and simple. Makes working with large displays pleasant.
I use Rectangle. I’ve configured it with a few keyboard shortcuts that let me move a window into specific regions on the display. I use it to quickly have multiple non-overlapping windows.
I cannot imagine using a large display without it!
BetterTouchTool is _excellent_ for window placement/resizing; also can be triggered externally if you want to combine it with Alfred for Karabiner Elements (though you don't have to, you can define the triggers in BTT itself).
The 4K monitor at my office is a fair bit smaller than that, and I haven't used it since the COVID-19 epidemic sent me packing home. Since then I've just been using my laptop's built-in screen.
To be honest, I think I may stick to it. At first, the huge monitor was fun, and the initial change to having less screen real estate was definitely a drag. But, now that I'm accustomed to it again, I'm finding that "I can fit less stuff on the screen at once" is just another way of saying, "it's harder to distract myself with extra stuff on the screen." My productivity is possibly up, and certainly no worse.
A major pain point (literally) for me with laptop screens is posture. My neck aches after a day of looking mostly down. I suppose an external keyboard and mouse would help, but I would have to get a stand and blah blah.
Also for my particular workload real estate is very handy. I totally agree with there being some virtue to constraints but several times a day I really need the space.
I think this really depends on the work you do, also. Pure development or content creation and I'm good with just a laptop. For research, with team communication, concurrent terminal sessions, debugging, management - I really do want at least 3 screens.
This. After doing some research, as far as I can tell a 32" 4K maximizes the amount of content you can see at one time within a comfortable viewing angle and without needing scaling to make text readable.
At typical desk monitor distances you shouldn't be able to see distinct pixels anyway.
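Rough numbers, in case anyone wants to compare: a 32" 4K panel is sqrt(3840^2 + 2160^2) / 32 ≈ 138 ppi (about a 0.18 mm pixel pitch), versus roughly 109 ppi for 27" 1440p and 82 ppi for 27" 1080p.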
Agreed. My work monitor is a 32 or 34" ultrawide. It works well but I would really like more vertical real estate. I'm definitely shopping for 32-34" 4k displays right now.
Looks nice also, but I already have a 4k 32" Eizo Flexscan that I'm happy with. I'm after the 4:3 aspect ratio; everything just seems to go wider and wider these days.
Vertical is the reason I haven't upgraded from dual 1920x1200 (though most of the time I use a single one anyway). Although I'm looking at 1440p mainly for 120Hz+ (1440p at 30+").
43" seems rather large--how far away do you sit? If it were as close as a more "normal" sized monitor (~2-3 feet), wouldn't you be craning your neck all day trying to see different parts of the screen?
Nah. I have a 49" curved 1440p monitor. Things you look at less often go to the sides. You can fit 4 reasonable sized windows side by side. Code editor holds over 100 columns at a comfortable font size for me 40 year old eyes. It's the best monitor setup I have ever had. You can spend less and get the exact same real estate with two 27" 1440p monitors. Either way, it is a fine amount of real estate and not at all cumbersome for all day use in my case.
I am getting the same Dell 4919DW monitor, transitioning from two 25" Dell monitors. I think the built in KVM will be great addition as I have two workstations. Ordered the new Dell 7750 to pair with a WD19DC docking station. I hope the Intel 630 UHD built in graphics will do, as stated in the knowledge base.
The 4919DW only has a 60Hz refresh rate, but I am not concerned about that. A great alternative would be the curved Samsung 49" C49RG9 at 120Hz.
It’s quite a piece of hardware. It lets me plug a USB hub into it. It supports USB-C for its display adapter. So it’s the dream: one USB-C cable to charge the laptop, drive the display, and provide a USB hub for keyboard, mouse, etc.
I'm in the same boat. More real estate is the big win. I made a pandemic purchase of a TCL 43" 4k TV to use as a monitor primarily for programming. I sit a bit further from it: 30" rather than 24ish when working on the laptop. I drive it with a 2019 inexpensive Acer laptop running Ubuntu 20.04 and xfce. Every so often an update kills xWindows, but I can start it in safe mode and get things working.
I do find my head is on a swivel comparatively, but while that's noticeable it isn't a negative. Overall I like it. A lot. The only thing that is painful is sharing the desktop over Webex/Skype. That does bog the system down and requires manually inflating the font size so that viewers on lower resolution systems can cope with it.
I am somewhere in between. I don't go for hi-DPI but am using 28" 4K on the desktop and 14" 1080p on my laptops. So identical dot pitch and scaling settings. I just have more display area for more windows, exactly as you say like a 2x2 seamless array of screens.
I actually evolved my office setup from dual 24" 1920x1200 and went to dual 28" 4K. But with the COVID lockdown, I only have one of the same spec monitor at home for several months, and realize that I barely miss the second monitor. I was probably only using 1.25 monitors in practice as the real estate is vast.
People who complain that a monitor is too large should stop opening a single window full-screen and discover what it is like to have a windowing system...
In the same boat here. I use a $400 49 inch curved 4k TV as my monitor along with i3wm, and while I waste a lot of the space on screen on perpetually open apps I don't touch, having the ability to look at my todo list or every app I need for a project at the same time has its benefits. I just wish I could lower the height and tilt the TV upwards a bit so I'm not breaking my neck looking at the upper windows.
I had a similar setup at a previous job -- one of the early 39" TVs. It could only drive 4k at 30Hz, but for staring at text, nothing could beat it. It takes a good tiling window manager to get the most out of this setup. By the same token, a good tiling WM also makes a tiny little netbook screen feel much bigger. So I guess what I'm really saying is, use a tiling WM!
43" 4k is approximately 100dpi, like 21" 2k. It seems like a reasonable form factor to me (at 1x), but there aren't many of them that do high refresh rate, and they're all very expensive.
I've only seriously tested the Dell P4317Q that I have in the office. Others have had good success with small 4k TVs. Can't say I've noticed anything about the latency, but I've never gamed or watched movies on it, so IDK
I use Samsung 4k TV's (55in and 43in) at work and home and the experience is absolutely fantastic. In game mode the latency is reported to be 11ms and there's no difference visible to me compared to 60hz computer monitors.
Do you have the specific model numbers? I'm buying a new monitor soon, and considering using a 4K TV. Would be great to check out the ones you're using!
If you are on macOS, all is good. Never had a problem with any of my 4 monitors (3x4K, 1x5K). I set the scaling to a size I like, and the text is super crisp. I don't see how any programmer can NOT like that.
How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
In my experience MacOS multi-monitor support is effectively non-existent.
Recently I picked up a 49” ultra-ultra wide monitor (basically 2x27” panels). It is one monitor but MacOS can’t drive it. They just don’t detect that resolution. I switched to a 43” 4k monitor (technically more pixels) and MacOS drives it fine.
My experience with MacOS is not “it just works” unless you are doing something Apple already predicted. That’s fine for me, I just wish they still sold a reasonable monitor themselves so I could be assured it would work properly.
> How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
I finally got sick of this and wrote a Hammerspoon script to deal with this. The config looks like this:
> every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first
Maybe we can get to the bottom of this. What is your use case?
I ask because as long as I plug them into the same ports it remembers how I arranged them previously (2018 macbook pro 15"). I haven't had to arrange them in over a year... even remembered when updating to latest operating system. Occasionally, I even plug in my LCD TV as a third external monitor and it remembers where that one should go in the arrangement too.
MacOS cannot drive one 5120x1440 display using Intel display hardware. It will happily drive two displays at 2560x1440. The monitor had multiple inputs so by putting it in PBP mode I was able to drive one input as USB-C and another as HDMI through a dock converter. This means the wakeup was not in sync. MacOS would see one monitor, arrange everything on that then realize there was a second one and fail to move anything back in this "new" arrangement.
The fact that it was all one physical monitor may have further confused the OS as a sibling comment mentions.
The solution was to sell the monitor to a Windows-using architect friend and buy a different panel with a resolution MacOS supports. She has a macbook too but it's the fancy one with discrete graphics which can drive 5120x1440.
The value proposition of MacOS to me is that I plug things in and they work. Any fiddling beyond that destroys the benefits of using this platform. I'm willing to iterate on hardware until I find something that works.
I do not have a 2020 MacBook so I cannot test but the Pro Display XDR is not 5120x1440, it is 6016x3384. The problem with my current MacBooks ('14 15" RMBP and '17 13" MBP, both with Intel Iris graphics) is that while they can drive 4k displays they cannot drive the 5120x1440 resolution specifically.
This limitation is specific to the MacOS drivers. Windows in Bootcamp is able to drive 5120x1440 on these devices.
Yeah I read through all those. It’s a work laptop so I’m not comfortable doing things like disabling SIP or mucking around in any system settings. That machine is my livelihood so I don’t mind finding devices that just work.
Ah ok, ya maybe it's related to it being the same monitor.
I have two different monitors that wake up at very different speeds and it's no problem here. My 15" 2013 and 2015 macbook pros had no problem with this either, and I've had 4 different monitors in the mix through those years too. I've transitioned to a CalDigit Thunderbolt 3 dock now and still no problem with it remembering.
So there's definitely something unique about that monitor. That is sad news for me too -- I'm hoping they make a 2x4K ultra wide monitor like that someday. Hopefully they've solved this problem by then.
That might work but it breaks my workflow in another way. Physically the display is a single panel. I organize workspaces by task so changing to a new one needs to change "both" panels because I'm actually using them as one.
>How do you manage multiple monitors with MacOS? I was doing this until recently and every single login involved rearranging my windows because MacOS moves them all to whatever display woke up first.
At least for apps that are dedicated to one screen + virtual desktop, right click its icon in the dock and assign it to that display and workspace.
Note that the effectiveness of window restoration also depends on the make/model of your monitors – many manufacturers incorrectly share EDID's across all units of the same model and sometimes across multiple models, making it much more difficult for operating systems to uniquely identify them.
That used to happen occasionally to me as well in earlier macOS versions. Didn't have to do any rearranging since Mojave, I think, definitely not on Catalina.
I use a single 34" 4K monitor with an arm mount on my Mac Mini. The power button on this monitor is one of those touch sensitive ones on the bottom right that I sometimes accidentally brush past. When I switch it back on, every single window gets reduced into an incoherent size and everything gets moved to the top left. It's really annoying.
I'm thinking of flipping my monitor upside down so I'll never accidentally brush that area while picking up something on the table.
That's likely a firmware bug in the monitor. It probably reports some ridiculously small resolution during its boot process and macOS queries it that time and rearranges the windows accordingly.
macOS could implement workarounds of course, but it probably just follows whatever process the display ID protocol prescribes...
What kind of MacBook do you have exactly? Year, size, graphics hardware and OS.
Reports I read stated that while you can select it with SwitchResX it was scaled.
I never tried installing it myself because I’m not a fan of modifying the system on a Mac, especially one I don’t own.
From my poking around I think the horizontal resolution is the problem. The system scans possible resolutions to see what works. Apple just never expected a single display that wide.
There’s some reports that newer MacBooks with discrete graphics on Catalina can indeed run this resolution. It used to not work regardless of hardware, now apparently discrete graphics MacBooks can run it. Maybe because they updated the drivers/system for their new super fancy monitors.
Go to the Displays panel, switch to the "default for display" option, then switch back to "scaled" while holding down the option key. Do you see that resolution in the list of options?
I’d say all is bad with MacOS and external monitors... It can’t manage text scaling like Windows, so you either have to downscale resolution and get everything blurry or keep the ridiculously high native resolution and have everything tiny :(
Is it not visible for you in the displays settings? You DO need all the monitors to have the same DPI or you’d have a window rendered half in one dpi and half in another when dragging across a display boundary.
No, when it’s an external screen I don’t get any scaling options, only the choice of resolution. I have a 24” QHD, so either it’s ridiculously small 2500xSomething or it’s blurry HD :(
I have this problem as well. I actually run my 27-inch 4K screens downscaled on MacOS because the tiny font at native-4K gives me a headache.
The worst thing about it is that scaling seems to use more CPU than running natively and the OS has some noticeably additional latency when running scaled.
Odd, my main setup is an external 4k monitor and I only use it with the “large text” text scaling and I have no complaints, the text is clear and large and easy to read. Perhaps you’re also using your laptop screen as well?
At work I have a mac mini and a Windows box, and I use three crapola Asus monitors between them, and my impression has been that macOS does a better job rendering text on said crapola monitors (the Windows box does a better job at compiling C++ in a timely fashion, though, so I mostly work on that one).
It's just a different stylistic choice. A lot of font nerds prefer the OSX choices because they try to stay true to the original font spacing without regard to the pixel grid.
Missing sub-pixel antialiasing is plain technical deficiency, not a stylistic choice. I agree arguments can be had about hinting and aligning the glyphs to the pixel grid, but not much beyond that.
Yeah they didn't completely remove it, but they did a good job of hiding it by not making it an option to turn on in the GUI. Have to use a terminal command to enable it:
https://apple.stackexchange.com/a/337871
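If I remember right, it's a global defaults toggle along these lines (I haven't re-checked the linked answer recently, so treat the exact key as an assumption and verify it there; you need to log out and back in afterwards):

# Commonly cited toggle to re-enable system-wide font smoothing on
# Mojave and later. Whether this brings back true subpixel AA or just
# grayscale smoothing is debated; log out/in for it to take effect.
defaults write -g CGFontRenderingFontSmoothingDisabled -bool NO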
In general with a HiDPI screen I don't find any need for it. But on a low-res display like the typical 24" 1080P models it certainly helps.
Completely agree! Went from a mediocre 2x1440p to a high quality 2x4K, and then back to a pair of equal-quality 1440p displays.
I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations must all match up for everything to work at a proper 60 Hz, and it gets worse if you have two external displays.
As for my home setup, I also stayed at 25" 1440p. A nice balance for work, hobby and occasional gaming without breaking the bank for a top-tier GPU.
>I would also add, when it comes to 4K and, for example, MacBooks, things fall apart quickly in my opinion. Cables, adapters/dongles/docking stations must all match up for everything to work at a proper 60 Hz, and it gets worse if you have two external displays.
I agree it's a bit of a mess, but USB-C monitors solve all those issues. I just plug my MacBook in with USB-C, and instantly my 4K (60 Hz) display is connected, along with external sound and any USB peripherals. No fussing with a million different cables and adapters. It's the docking workstation setup I've dreamed of for a decade.
It doesn’t solve all of the issues. The USB-C port can support DisplayPort 1.2 or 1.4 bandwidths and you have to make sure it matches up for some high-resolution monitors to work.
Why not both? If I'm on linux, with no interest in changing and perfectly happy with my display, and 4k doesn't work easily on my system, why would I be interested in a 4k screen?
Strange. I'm not seeing any issues with Linux and 4k. I'm running plain Debian 10 with Openbox on 4x 4k monitors (3x 28" in a row and one 11" 4k under the right-most). Granted, I normally only have one web browser that follows me around across workspaces, pinned to the right monitor, a mostly maximized Sublime on the middle monitor, and a pile of alacritty/xterm windows on the left-most monitor. The small monitor, whose content also follows me around, contains clipboard, clocks, Slack and monitoring.
What is the software that people are using that creates problems?
So far, I've never had an issue with KDE Plasma and 4K@60Hz on linux, once I realized that you can't just use any old HDMI cable: you need DisplayPort or HDMI 2.0.
FWIW, switching between resolutions in my favorite desktop environment, Xfce, is two steps:
# This affects every GTK app.
xfconf-query -c xsettings -p /Xft/DPI -s 144
The second step is going to about:config in Firefox, and setting layout.css.devPixelsPerPx to a higher value than 1.0. I really need to write an extension to do that in one click.
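In the meantime, dropping the pref into user.js in the profile directory at least makes it stick across restarts (the profile directory name below is a placeholder):

# Persist the Firefox pref across restarts; replace PROFILE.default with
# your actual profile directory under ~/.mozilla/firefox/ (placeholder).
echo 'user_pref("layout.css.devPixelsPerPx", "1.5");' >> ~/.mozilla/firefox/PROFILE.default/user.js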
What is really tricky, though, is having two monitors with different DPI. Win 10 does an acceptable job with it; no Linux tools I'm aware of can handle it reasonably well. Some xrandr incantations can offer partial solutions.
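For example, one partial trick is to run the HiDPI panel at its native DPI and upscale the low-DPI output so both end up at a similar effective size (a sketch; the output names are just examples, check `xrandr` for yours):

# Upscale the external 1080p monitor 2x so UI elements come out at
# roughly the same physical size on both screens.
# eDP-1 / HDMI-1 are example output names.
xrandr --output eDP-1 --auto \
       --output HDMI-1 --auto --scale 2x2 --right-of eDP-1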
Even Win10 struggles when you move windows between different DPI domains. Apps will appear HUGE or tiny as you slide them across, until you get past the midway point. And when the system goes to sleep everything can go to hell: you can come back to small message windows blown up to huge sizes or windows crushed down to a tiny square. You can forget about laying out your icons perfectly on your desktop too; they'll get rearranged all the time. It's even more fun when you remote into a high DPI machine from a low DPI display. It actually works pretty well, but stuff will get shrunk or blown up randomly when you go back to the high DPI display.
Logically having your OS maintain a consistent UI size makes sense until you try it without.
I'm running a couple medium high density monitors alongside one of the highest density ones available. I don't scale the HIDPI monitor at all, which means when I drag windows to it they are tiny. Instead it works in two ways, as a status screen for activity monitors/etc and as a text/document editing screen. AKA putting adobe acrobat, firefox or sublime/emacs/etc on the high DPI screen and then zooming in gives all the font smoothing/etc advantages of high DPI without needing OS support.
So the TLDR is, turn off dpi scaling, and leave the hidpi screen as a dedicated text editor/etc with the font sizes bumped to a comfortable size. Bonus here is that the additional effort of clicking the menu/etc will encourage learning keyboard shortcuts.
I don't think the reasons you illustrated support that conclusion. You don't actively dislike the extra pixel density of a 4K display. You seem to only dislike the compatibility issues relevant to your use case.
>No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
FWIW, I can't recall the last time I had a problem with apps not rendering correctly in hidpi mode on MacOS. Unless you've got a very specific legacy app that you rely on for regular use, it's a non-issue.
>Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming
Ah, I think I found the real issue ;-) If your linux desktop rendered 4K beautifully, seamlessly, and without any scaling issues right out of the box, I could all but guarantee that your opinion would be different.
You know, I was in complete agreement with the article and I was considering a 1440p monitor or something until I saw your comment and reflected on it. The most productive periods/jobs I can remember were on i3wm with a Goodwill 900p/19" monitor and a 20" iMac 10 years old at the time. But it's because I had access to good tools then like Neovim/Atom respectively. My work now requires an RDP Notepad.exe session so there's no monitor that will help me there. I guess software tools are way more important.
I have a 49" curved monitor. It is effectively two 27" 1440p monitors stapled together (5120x1440). It is the best monitor I have ever had. 1440p has a very decent [higher than typical] pixel density but is not "retina". Fonts look pretty smooth, but you can still see pixels if you try really hard. Overall, I do think high density screens look amazing, but the software has not quite caught up to them. The benefits are on the softer side, and if I could just have magical mega-high-DPI displays with no side effects, sure why not? As it stands, 49" curved monitor is pretty fine. It fits four windows side by side at reasonable resolutions.
Primary apps go in the middle, such as code editor, etc.. Tertiary windows, such as documentation go on the outer edges. Still quite usable, but a little out of the way for extended reading.
Hey, do you mind sharing more info on how to get the monitor? I'm looking to invest in a curved one since it's an experience I've never had. And are there retina models out there, or is it not worth it, in your view?
I couldn’t agree more with this. A 49” 5120x1440 curved monitor is brilliant for productivity. It’s better than two or three separate monitors. I do miss high DPI but I wouldn’t trade this type of monitor for the current batch of smaller high DPI ones.
There’s only two or three things that would make this better. A high DPI variant, more vertical space and a greater refresh rate. Given those two things, I think that’s the endgame for monitors (in a productivity context).
(I think that’s 8x the bandwidth so it’s a while away!)
I have a 4k 27" monitor and I have had to run it in 1440p recently because its all my dell xps can manage and its usable but a noticeable downgrade. My 24" 1440p monitor at work looks perfectly fine though.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
What issues, actually? Xmonad on Arch user here, and I find the sweet spot for me is 27-32" 4k, 1440p on the laptop (I guess 4k would be nice there too, but not sure it justifies the increased power draw). After getting used to the increased real estate, I do feel limited on my older 1080p laptop screen; fonts smaller than 8pt (which is still all good) noticeably impact the readability to the point I can feel my eyes strain faster. It did take a bit of playing around with DPI settings to get it right, though; out of the box it's not great. The Arch wiki has some great material.
The only frustration I do have (which IS super-frustrating, specifically for web browsing) is with multi-monitor setups with different pixel density - your point 2, I guess. Even plugging anything larger than 19" with 1080p into my 1080p Thinkpad is annoying.
I think it should be possible to configure it correctly but I just gave up and end up zooming in/out whenever I do this and send windows between screens. Haven't looked at it, but maybe a mature DE like KDE or GNOME (which, if you don't know, you can still use with i3) should be able to take care of this.
Also, this is all on X11, have no idea if and how wayland differs.
My day job involves untangling SQL that was written under fire. I consume more spaghetti than a pre-covid Olive Garden. Every vertical pixel is precious for grokking what some sub query is doing in the context of the full statement.
I used to when I ran dual 24" monitors! We had really sweet old IBM monitor stands that could tilt, raise and rotate. You had to be quick to grab them from the copy room before the electronics pickup. I swiped one from the still warm desk of a colleague on the way to their farewell happy hour.
So I had a '14 13"(? Might have been 15" but probably not) RMBP on the left with email and chat stuff, 24" main monitor in landscape and then another 24" monitor in portrait on the right. I think at some point I put a newer panel on the IBM mount. It was sweet.
Back then we still had desktop PCs at our actual desks so there was a KVM on the main monitor! What a time to be alive!
These days I do prefer a single monitor workflow if possible. It's just cleaner and more convenient.
Being used to MacBooks with retina screens, 4K on a 30" is perfect to me as "retina". Anything larger needs to be 5k or 6k. 1440p is passable on <24".
27" x 1440p has been my go-to for a while now. Works well without scaling between win/mac/linux, does not dominate the desk completely, high quality monitors are readily available in this resolution etc etc.
I think my dream setup would be a 27" 1440p (what I currently have at home), with a pair of smaller (19" maybe?) 1080p screens on either side set up in portrait. Basically a similar screen area to 2x27", but without a bezel right in the center of my field of view, and the 1080x1920 screens would be a good size for displaying a full page (e.g. a PDF) at more or less full screen.
I actually like having multiple screens. I know I'm weird in this but I actually like running certain apps maximized.. but I wouldn't want it maximized across a whole ultrawide.
Plus, if I take a working vacation somewhere it's a lot more practical to schlep around one 24" or 27" than an ultrawide.
Also, just as an ergonomic thing, I could angle in the two outer screens a bit while not having one of those icky curved screens.
>My setup is 2x24" 1920x1200 monitors - so I get slightly more vertical pixels than true 1080p, but in the form of screen real estate rather than improved density.
I'm working on an old 24" 16/10 display (the venerable ProLite B2403WS) and an OK 32" 4K display with a VA panel. Both are properly calibrated.
There is no amount of tinkering that can make fonts on the 24" look good. It looks like dog shit in comparison to the 4K screen. It might not be obvious when all you've got in front of your eyes is the 24" display, but it's blatant side by side.
On top of that, the real-life vertical real estate of the 4K display is also quite a bit larger.
I've never been a big 16/9 fan, but frankly at the sizes monitors come in today and at current market prices, I don't see a reason not to pick a few of these for developing.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I haven't had this experience (MacOS, 4K monitor for 2.5 years)
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
shrug - 4K and 1080p seem to work together just fine for me. I've currently got a 27" 4K monitor and a 24" 1080p monitor both running off my 2015 13" MacBook Pro; the 4K is on DisplayPort (60Hz @ 3840x2160) and the 1080p is on HDMI (and it happens to be in portrait mode). I use all three screens (including the laptop's), and while the 1080p is noticeably crappier than the other two, it's still usable, and the combination of all three together works well for me. A couple of extra tools (e.g. BetterTouchTool) really help with throwing things between monitors, resizing them to take up some particular chunk of the screen, etc. - my setup's quite keyboard-heavy with emphasis on making full use of the space inspired by years of running i3 (and before that xmonad, ratpoison and others) on linux and freebsd.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
> > 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming.
> That's a statement about linux and i3, not monitors. (And again, I like i3, but stating this limitation as if it's a problem with monitors not i3 seems... odd.)
It is also wrong. I am a long time i3 user. Never had a problem with it, never done anything special. Most of the time I'm running Debian stable, so I even use software versions that most people consider 'old'.
> 1. No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode. Depending on the OS that can mean it renders tiny, or that the whole thing is super ugly and pixelated (WAY worse than on a native 1080p display)
Never happened to me in 4 years, see below. That said, I barely use any graphical programs besides kitty, firefox, thunderbird and spotify.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptibly sharper" ever could
This is just not true. I have used the same 32" 4k monitor for 4 years running NixOS with bspwm (a tiling window manager, which does even less than i3) on 3 different laptops - thinkpad x230 (at 30 Hz), x260 and x395 and it all worked completely fine.
It depends on a very simple tool I wrote, because I was sick of `xrandr`: https://github.com/rvolosatovs/gorandr , but `xrandr` could easily be used as an alternative.
Recently I switched to Sway on Wayland and it could not be smoother - everything just works with no scripting, including hot-plug.
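(And if you do end up mixing DPIs under Sway, per-output scaling is a one-liner; DP-1 below is just an example name, `swaymsg -t get_outputs` lists yours:)

# Scale one output 2x at runtime; the same line (minus swaymsg) can go
# in ~/.config/sway/config to make it permanent. DP-1 is an example.
swaymsg output DP-1 scale 2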
> I genuinely think 4k provides no real benefit to me as a developer unless the screen is 27" or higher, because increased pixel density just isn't required. If more pixels meant slightly higher density but also came with more usable screen real estate, that'd be what made the difference for me.
Indeed, screen size is way more important than resolution. In fact, even 4k at 27" seemed too small for me when I had to use that in the office - I would either have to deal with super small font sizes and strain my eyes, or sacrifice screen space by zooming in.
> 2. If the 4k screen is on your laptop, good luck ever having a decent experience plugging in a 1080p monitor. Also good luck having anyone's random spare monitor be 4k.
I have been running two 1440p displays on a 4K retina MBP and the experience has been impressively seamless. Both with Catalina (the latest) and High Sierra
The biggest problem is with Apple port splitters; they are crap and sometimes monitors wake from sleep with a garbled picture
So I'm a little nuts in that I run 2 x 27" 4K monitors side by side with no scaling. 27" is about the smallest I can tolerate 1:1 pixel sizes.
Since aging has forced me into wearing reading glasses, I wear single vision computer glasses that are optimized for the distance range of my monitors' closest and furthest points.
Because I don't have scaling enabled, I don't get any of the HiDPI issues that I've gotten on my laptops with Windows.
I have found that I am still wanting for even more screen real estate, and for a time I had a pair of ultrawide 23" monitors underneath my main monitors, but it created more problems than it solved and I recently went back to only two monitors.
That's an interesting idea. I should look into that when I eventually upgrade. A stubborn part of me left the monitors in landscape because I occasionally play games, but I end up never doing that on my desktop.
I prefer to use the same model of monitor when doing a grid, so I don't think I want to add to my existing setup; my monitors are discontinued, and they're DisplayPort-only for 4K60.
I think they have more than a few years of life left in them, but I'll definitely look into a configuration like yours at upgrade time.
> Good luck ever having a decent experience plugging in a 1080p monitor.
A 4k monitor is now $300 (new). Used are even cheaper.
> Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow
I use Gnome 3.36 and many HiDPI issues I was having before are now gone, without any extra configuration.
> My argument in favor of 1080p is that I find text to just be... completely readable.
It is readable but fonts are pixelated, unlike 4k.
My only problem is that macOS has some artificial limitations when it comes to using non-Apple monitors. Like a lower refresh rate. My solution? Use Linux.
$300 is a lot of money where I'm from. And they aren't available at that price here, anyway.
What do you mean, fonts are pixelated at 1080p? Whether you can see the pixels probably depends on pixel size. I certainly can't see them on my 23" LG monitor unless I try really hard.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
I am using Emacs and tmux under Linux, in the Gnome desktop. Gnome has HiDPI scaling. For me, that works fine with a 4K 43 inch display. The thing you need to watch out for is to get a graphics card with proper open source (FOSS) driver support. Some cards are crap and don't come with FOSS drivers. You can get them to run, but it is a PITA on every kernel update. Don't do that to yourself; get a decent card.
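For reference, Gnome's scaling knobs are also exposed as gsettings keys, so this can be scripted (a sketch; the values here are just examples):

# Whole-desktop integer scaling (2x; 0 means pick automatically)
gsettings set org.gnome.desktop.interface scaling-factor 2
# Or scale only the fonts and leave widget geometry alone
gsettings set org.gnome.desktop.interface text-scaling-factor 1.5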
I miss i3 so much. But I've succumbed to laziness and have been using my various macbooks. Agree that it's a huge productivity gain, moreso than any font improvements.
Another linux+i3 user here, I've not tried 4k yet but you confirmed my suspicions.
I did a lot of research before buying an xps-13 and went with the 1080p version due to basically all the reasons you just stated + poor battery life and video performance.
I have hope for the future though... what would really make transitioning easier is a way to automatically upscale incompatible programs, even if it means nearest neighbor scaling at least it will make them usable on super hi-dpi monitors.
I have 4K monitor on laptop and also as external monitor but I have no problem with linux (using debian testing with gnome 3). I can easily combine it with 1080p monitors. Everything works out of the box.
I still switched to 1440p, as 4K is just better-looking 1080p. You cannot fit more information on the screen with scaling, and without scaling everything is too small. I work as a backend developer, so space is more important for me than visual quality.
> 3. Configuring my preferred linux environment to work with 4k is either impossible or just super time consuming. I use i3 and it adds way more productivity to my workflow than "My fonts are almost imperceptively sharper" ever could
Happy i3 arch linux 4k monitor user here for over 2 years. I only set an appropriate Xft.dpi for my monitor size/resolution in ~/.Xresources once and that was it.
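For anyone who wants to replicate that, it really is just a couple of lines; a minimal sketch, where the DPI value is an assumption (use roughly 96 times your desired scale factor, e.g. 144 for 1.5x or 192 for 2x):

    ! ~/.Xresources
    ! 192 here assumes a 2x scale; adjust for your panel
    Xft.dpi: 192

Reload it with "xrdb -merge ~/.Xresources" (or restart X) and newly started Xft-aware apps pick it up.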
I'm not going to argue your preferences, but then why don't you get one 50 inch 4k display? That's about four of your current displays at similar density on a single cable. And probably at a similar price point, too.
Or, if you are using decent graphics hardware, you could even get two of them and have four times more display space than you have now.
> No matter what operating system you're on, you'll eventually run into an application that doesn't render in high dpi mode.
It might depend on the program, too. Some might only work in pixels. Fortunately, it is usually not a problem if you are trying to run a program designed for Gameboy; the emulator should be able to scale it automatically, subject to the user setting. I don't know if any X server has a setting to magnify mouse cursor shapes, but it seems like it should be possible to implement in the X server. Also, it seems like SDL 1.x has no environment variable to magnify the image. My own program Free Hero Mesh (which I have not worked on in a while, because I am working on other stuff) allows icons to be of any size up to 255x255 pixels (the puzzle set can contain icons of any square size up to 255x255, and may have multiple sizes; it will try to find the best size based on the user setting, using integer scaling to grow them if necessary), but currently is limited to a single built-in 8x8 font for text. If someone ports it to a library that does allow zooming, then that might help, too. However, it is not really designed for high DPI displays, and it might not be changed unless someone with a high DPI display wants to use it and modifies the program to support a user option for scaling text too (and possibly also scaling icons to a bigger size than 255x255) (then I might merge their changes, possibly).
Still, I don't need 4K. The size I have is fine, but unfortunately too many things use big text; some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
> some web pages zoom text by viewport size, which I hate, and a bigger monitor would then just make it worse.
Not that it excuses bad UX, but you might consider keeping your browser window at something below full width. I find this more comfortable anyway.
Total aside: I've noticed Windows and Linux users tend to keep their windows fully maximized, whereas Mac users don't. Doesn't apply to everyone of course, but enough to be noticed. This was true even before Apple changed the behavior of the green Zoom button, and I've always wondered why.
I find Spaces/Expose/Mission Control (or whatever they call it these days) way more comfortable than dealing with Windows. I especially like that if I hide a window, it doesn't pop up when using Mission Control. Opening the Windows equivalent shows me every window, even stuff I minimized/hid. It feels cluttered.
I don't see how alt-tabbing through maximized windows on macOS is different from Windows and Linux like the OP is suggesting. Though I do keep my browser at half-width on my ultrawide monitor because it's somewhat of an exotic/untested aspect ratio for websites.
Also any power user that cares will use a tool like Divvy on macOS for arranging windows with hotkeys.
> I've noticed Windows and Linux users tend to keep their windows fully maximized
Interesting, I've noticed the exact opposite. Mac devs, especially younger ones, tend to have full-screen IDEs and browsers and constantly flick back and forth between apps. My theory was always that Windows and Linux users had gotten comfortable with the desktop metaphor while a large percentage of newer Mac users grew up using iPads which were all full-screen, all the time.
Quick note, and something I perhaps should have clarified: I wasn't thinking about the Mac's "full screen mode". This was something I noticed about other students in my high school a decade ago (why it's coming to mind now, I have no idea), before full screen mode existed on the Mac.
It used to be that if you clicked the green button on Mac, most apps (not all apps, for weird aqua-UI reasons, but certainly web browsers) would grow to fill the screen without outright hiding the menu bar and dock, just like the maximize button on Windows.
My experience pre-full-screen on Macs was that the green button would do just about any random thing except make the window fill the screen. It would certainly change the window's size: usually it would fill vertically (but not always), and almost never horizontally.
To this day I still rarely press that button because of years of it doing nothing but unpredictable nonsense.
Technically, that's true, but "Wayland is inherently better at security/DPI scaling/other" is one of those cultural myths that eventually come true because of the people who believe in it. It would be possible to add these improvements to the X server, but no one wants to maintain or improve the X server anymore. All the developer effort is behind Wayland. So to get those benefits, you have to use Wayland.
I'm on Gnome and use fractional scaling. 2x and everything got too big, but 1.6 looks OK. It's actually not at the app layer; it's the screen that is scaled up. Although some low-level programs can have issues with mouse pointer position if they don't take the scaling into account.
i3 user on a 4K screen here; it has worked fine for me since 2014 (with the exception of the random super-old Tcl/Tk app): https://i.imgur.com/b8jVooO.png
Since nobody else has mentioned it, if you like i3 you should give Sway a test drive. Wayland still has some rough edges (screen sharing, for example) but it supports high DPI monitors with fractional scaling almost out-of-the-box.
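To give a concrete idea, per-output scaling in Sway is a single config line; a sketch, where the output name DP-1 and the 1.5 factor are placeholders ("swaymsg -t get_outputs" lists your real output names):

    # ~/.config/sway/config
    # DP-1 and 1.5 are placeholder values for this sketch
    output DP-1 scale 1.5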
> I genuinely think 4k provides no real benefit to me
Could you clarify when in your life you've had a 4k screen with good OS support, so that we know what experience you are speaking from when you say that?
One can definitely still see the pixels on a 4K 24'' monitor. That is not the point.
But I do agree with points 1 and 2 (they tend to work better on Windows, though).
On the other hand, what about 3? I would find it ridiculous if it took you more than 5 seconds to enlarge the DPI (with no multi-monitor) even on the weirdest of X11 WMs. X11 is designed for this...
I agree with all your points; however, I've found the Mac is extremely variable-DPI friendly. I think games with custom UIs (Europa Universalis IV comes to mind) are the only things that haven't adapted, and it's hardly a problem if you set the scaling to "large text" or whatever, just a little pixelated like you would see on a 1080p screen.
I feel like HN as a whole wants blogs to come back, and part of that is well-written exposition (the technical details) followed by near-outlandish opinions that tend to generate discussion. It's more fun, it leads to more conversation (here we are!), and it clarifies our own values.
I think three monitors is distracting, but that whole community discussion gave me the opportunity to compare how I use a multi-monitor setup compared to others.
I generally dislike how fonts are rendered at fractional scaling or lower resolutions, but I too just learned to live with it over years of ignoring it. Unfortunately, now I can't unsee it!
Could you elaborate on this? Specifically, do you feel that having three/multiple monitors is necessarily distracting (no matter what you do with them) or just encourages distraction?
My experience is the latter, and that if I am disciplined and only put windows from one task at a time on all three, then I have no more temptation to be distracted than I usually have at my internet-enabled glowbox, but maybe I'm an outlier.
For me, I can't use three monitors. It just takes too many brain cycles to process and remember where each window is.
I can process two fine: one is "the workspace", the other is "the reference/product/outcome/tests".
I tried really hard to teach myself structure, but with three monitors I find myself always losing windows, losing the cursor, etc. Hacks like "find my cursor" (not default on Linux) help, but I'd rather have a clean mental map of where my cursor, focus and windows are at all times.
What stuck best was having the third monitor hold the API/documentation/reference, but still, the mental effort to keep all my whereabouts mapped in my head was just too much.
Also note that I went from one to three monitors, so this is not "just too used to two monitors to ever change" it was the other way around.
Ubuntu has nice tiling features without requiring tiling for everything ([meta]+[→] and [meta]+[←]), which let me carry my now-ingrained two-monitor habits over to the laptop screen when I'm somewhere without a second monitor. Another thing I disliked about my three-monitor setup: it does not map in any way to a single monitor. Using virtual desktops worked, somewhat, but it was still too different.
My experience is that my attention span sucks enough that there's not a functional difference between encouraging distraction and being necessarily distracting.
I do basically all my work on my laptop without an external display. I use two displays for some specific tasks where I need to look at two things at once.
As for me, three monitors is just too much information. There's no thing I do that requires me to see so many things at once. I do make extensive use of virtual desktops instead.
I like one large 4k monitor right in front of me, and the window manager is set to allow splitting windows in thirds. Laptop off to the side with music, terminal, and other auxiliary stuff.
>I feel like HN as a whole want blogs to come back
Blogs never left— they may have withered to near-nothingness after the same exact people who claim to want them abandoned them, but they’re still here.
I may be wrong but I believe the people who claim to want blogs abandoned them for sites like Hacker News.
HN is a link aggregator that gives a lot of blogs more exposure than they would otherwise get. I would say something like Twitter (micro-blogging and social networking) has contributed to decline of blogs.
If you are going to blame anyone, it's easiest to blame Google's failed EEE attempt on the blogging world, with the sunset of Reader and the attempt to push everyone to Google+. There were so many blogs and meta-blogs that I saw lost in that fumbled transition. In trying to build its own Facebook, Google did irreparable harm to the blogging world in the process.
"Relatively unchanged" is such an interesting POV. It's been in such a stasis that I pretty much assume it is as dead as LiveJournal. Maybe not in literally the same way that LiveJournal got shuffled around in a shell game to some strange Russian owners, but in a very metaphorical sense.
At one point Blogger in the early oughts was ahead of the pack in leading mainstream acceptance and usage of blogs. At one point my feed list was over half Blogger sites, but today I can think of only a few Blogger-hosted blogs left at all in my feed list, none of them have been updated recently, and those that have updated recently were claimed by spammers and (sadly) dropped from my list.
I can't imagine there's much more than a skeleton crew at Blogger keeping the lights on, and I would be unsurprised, if in pushing the metaphor to the LiveJournal thing, to learn that they were being kept in the Bay Area equivalent of Siberia by some half-mad state actors that need Blogger's zombie to keep its current undead behavior in some strange psy ops nightmare.
I love my 3 monitor setup. In 20 years of coding, I finally reached peak productivity in my workspace. No switching between windows, ease of throwing a window aside for monitoring, laying out information and tools side to side. It’s incredible once you readjust your habits and learn to use that new space. I compare it to having a large, wide desk when you’re working on something. Who wouldn’t want that?
I was one of the people who worked on a 15" retina MBP everywhere, even in the office where I had a 30" on my desk, so as not to have to readjust and to keep optimal habits for the screen size. Now I simply refuse to work on a laptop at all; it feels like being trapped in a tiny box and I get literally claustrophobic :)
> I compare it to having a large, wide desk when you’re working on something. Who wouldn’t want that?
I have used between 1-3 monitors over the last decade, and there sure are advantages to having 3 for certain tasks. However, I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen). This decreased my productivity.
Perhaps not everyone has this weak spot, but for me using multiple monitors has a downside from an attention/focus perspective.
Coronavirus has robbed me of one of my favorite productivity hacks, which is coding on old Thinkpad with 4:3 ratio display, at a coffeeshop or library with Internet turned off. No distractions, no multitasking, just pure focus on a problem.
My friend, when I moved into my new apartment I went without WiFi for 2 years and did nothing but code on my T43. In that time I managed to rewrite (a variant of) Age of Empires 2.
IMHO, two monitors is an amazing upgrade. One screen for code, another for reference material or for the app being debugged. Better than one huge screen in many cases, as it’s two 16:10 spaces that you can maximize things to.
But with a 3rd monitor, you’re well into diminishing returns, it may even end up being a distraction if it becomes a Slack/Mail/web screen.
> I noticed that having multiple monitors resulted in me having a dedicated screen for email (usually the smallest, my laptop screen)
I'm currently using two external monitors, with my laptop docked and closed. I find the "dedicated screen for <distraction>" was a problem for me when I had my laptop screen open, because it's a different size/resolution/position than my actual monitors. On the other hand, I never have that problem with my dedicated monitors - in my mind they're a part of the same "workspace" because they're the same size, resolution, and positioned together - so I could see myself going to 3 desktop monitors one day.
I have two monitors right now and wish I had a third. One for code, one for documentation, and one for running whatever I'm working on (website, android emulator, etc). Currently I have the code monitor vertical and swap between workspaces on the horizontal monitor for the running thing and documentation.
I've solved this problem by having my email client (actually it's Slack in my case, but the same principle) and terminal share a screen. This works pretty well because I rarely want to use my terminal and chat at the same time.
I understand this and I don't want to argue against anyone's preferences. You know what's best for you.
I'm just pushing back against these out of touch fads. Most developers worldwide don't have a three-monitor setup. There is no proven correlation between quality software and 4K displays or mechanical keyboards (to name other fads). More importantly, the best devs I've known -- people I admire -- used tiny laptops with tiny displays, and shrugged when offered even a single external monitor; it just wasn't a big deal for them.
Mechanical keyboards don't make you a better coder.
But the only thing that will is writing lots of code -- over years and decades. And about 12 years ago, I started running into this anti-feature of human physiology known as "aging". And whereas in my think-I'm-so-l33t 20s I could bang out code on crappy desktop and laptop keyboards, by my 30s they were turning my hands into gnarled claws.
The remedy for this, for me, was a keyboard with Cherry MX switches. The crisp feedback let me know when a stroke was registered, so I unconsciously pressed each key less hard and was able to type faster with less pain.
Yeah, I wouldn't say having a mechanical keyboard makes your code any higher quality - that'd be pretty silly.
I think in general the thought is, if you care enough about your craft that you seek out refined tools, that care will be reflected in higher-quality development. Whether that's true or not, I don't know, but I'm inclined to believe there's a correlation.
I mean, it would be weird to visit a professional carpenter's house and see Harbor Freight tools, right?
Thanks for the reply. I think there is little to no correlation, but like the opposite opinion, I've no proof other than the anecdotal: the best hackers I've known didn't care about these things.
Other bizarre opinions I've read from Atwood and his followers: that you should be an excellent typist (this is also related to owning a mechanical keyboard). No. Just no. Typing speed is not the bottleneck when writing software. The bottleneck is my brain. I've never seen a project fail because people typed too slowly.
I do think there's a "CrossFit" mentality among the typer-coders who swear by mechanicals and end up with wrist braces - a kind of "more is more" approach that drives them to write lots of code, put in lots of hours, memorize innumerable details, and min-max their output in Taylorist fashion. It's optimizing for reps, versus mobility, stability, flexibility.
I have let my WPM drop a fair bit over time. I'm still relatively young yet, but I see no reason to go fast when I realize that most of the typing amounts to disposable bullshit. It's better to spend time thinking and developing thought patterns, and then just type a little bit to jog your mind and clarify. I allow myself to write some cheap code, but the point of that is to sketch, and the sketch should be light and quick, for the same reason that artists will say to favor long, confident strokes instead of chicken-scratch markings.
My main gripe is that as time has gone on and I've racked up the RSIs, the brain-to-text latency has gone up notably.
This scares the shit out of me. I'm not in the older subset of programmers (<30 atm), and it has gotten to the point where the latency actually affects my workflow.
> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.
I agree except with a caveat: the mechanical action of typing, formatting, refactoring, fixing typos and missing semicolons, and moving code around actually distracts the brain from the higher level task at hand. And when the brain is already the bottleneck, I don't want to make it worse by wasting brain cycles on mechanical drudgery.
As one might expect, I feel far more productive when I'm using languages and tools that require me to type less and refactor & re-edit code less. I think the language would matter less if I could just wish code onto the screen. Until then, learning to touch type (with as few errors as possible! not necessarily as fast as possible) and use editor features to make it more effortless is the next best thing.
It’s the opposite for me, having fewer monitors makes it harder to find a window. I have to go through alt-tabbing slowly to get to the one I’m after.
With three monitors, I know exactly where my windows are. If I have more than three windows that’s annoying, but I keep the extra ones on the middle monitor to simplify things.
I think the Python ethos applies directly to typing speed: "code is more often read than written".
I agree, if speed of your typing is your bottleneck in getting code written, perhaps you should be coding smarter not harder.
I think there is some wisdom that you should try to be a "good" typist, in that better typing skills reduce the risk of injury (RSI), but that's self-care/ergonomics, and while still very important, there are plenty of good software developers that hunt-and-pecked their way to an early retirement (and/or multiple carpal tunnel surgeries).
I've had a phase of getting mechanical keyboards, but I always found myself typing slower on them. The added travel time, even on the "low profile" mech keyboards was making me type slower. I am back to scissor switch and I couldn't be happier. Although I prefer the low profile keyboards in general. One of my favourite keyboards is the butterfly Macbook keyboard, but I know it has mixed opinions.
> Typing speed is not the bottleneck when writing software. The bottleneck is my brain.
I see typing like the ability to do mental arithmetic: being able to do it well isn't the thing that's directly valuable, but it removes distraction from the activity that is valuable, and that ends up making it valuable as well.
Another way to look at it: the faster you think, the faster you need to type in order for it not to become the bottleneck (during the times where you're actually implementing an algorithm as opposed to designing it). Of course, that's not just a function of raw typing skill, but also of the tools you use and the verbosity of your programming language. (An amusing corollary of this is that for society at large, it's useful to have great hackers who are bad typists: they will be tempted to develop better tools from which the good typists can also benefit!)
I've never known a great developer who did hunt-and-peck typing though. I do know great developers who have their own typing system. They simply never bothered to learn the "proper" way to do ten finger typing, and that's fine (unless those typing systems are worse for RSI, which was the case for me personally).
I understand what you're saying, but that's simply not my experience (either with myself or observing others).
Note Atwood claims you must be an excellent typist, training yourself to become one. I find this fetishization of a mechanical skill bizarre. I'm not advocating clumsily struggling with the keyboard like an elderly person, but past the point of "I'm a decent typist", I find that's enough.
I find there's no correlation between the problem-solving ability needed to be a great software developer and being a fast typist of the sort Atwood et al advocate.
I file this under "weird things some programmers believe without evidence" ;)
Yeah, I think we agree on the excellent typist point. It needs to be fast enough, but I suspect what happens is that pretty much everybody using a computer sufficiently to become a great developer reaches "fast enough" naturally through implicit practice.
I agree, but would say that in real life most of the people I know who like mechanical keyboards like them for hand-strain reasons. They find them more comfortable to work with. The code written is the same either way, but things like 4K monitors (eye strain) and mechanical keyboards (hand strain) are better for the long-term health of the programmer. I've not gotten on the 4K train, but I do like having my keyboard for that reason.
Multiple monitors though is purely personal preference I think. While having the documentation on another screen is something I personally find useful, if anything it probably makes me lazier about trying to remember things.
I think it's valuable to be able to type effortlessly without having to think about it too hard. Typing is a distraction that takes brain power away from the important things.
Typing speed is probably not the bottleneck but I found that since I started touch typing I get a lot less headaches because I don't have to look up and down all the time.
Unlike carpentry, the quality of our tools (keyboard and monitor, specifically) doesn't affect the quality of our output.
I think in many cases people hide their lack of competence behind high-cost professional tools, because laymen use them as a proxy for talent that they cannot evaluate.
I think that's also why many exceptional programmers just use a 5-year-old laptop: they don't need to compensate.
A day-trader having 12 monitors mounted on the wall doesn't make him profitable.
I still use my 11-year-old ThinkPad together with my newer ThinkPad. I use both laptops alternately. My productivity does not increase when I use the newer one.
That said, my extra 20-inch monitor does help me visualize.
I've seen plenty of professionals using Harbor Freight tools, usually not carpenters but the tile saws and wrenches seem popular for professional use.
The correlation seems more likely to me that if you can afford the fancy tools then you've already had some level of success. Though there are those new mechanics who bury themselves in a mountain of debt buying a whole chest full of Snap-On stuff...
A professional will eventually wear even the best tools. And since they use those tools every day, they can keep an eye on wear. So it's not that crazy for them to use relatively cheap stuff.
Plus, a tradesman will also sometimes lose tools, drop them in places they can't recover them from and so on.
On the other hand, the day I want to fix some issue at home, the last thing I want is the tool I use perhaps once a year to be an additional source of issue, because it involves a round trip to the store.
In the last few years they've addressed that. Their Chicago Electric tools should generally be avoided but the Vulcan line of welders and Bauer/Hercules hand tools are all perfectly serviceable for light/occasional use.
The issue with heavy use is not that they don't work but that they're heavier, less ergonomic, and less robust/repairable than the name brands; if you can afford the name brands and will be using the tool until it breaks, fixing it, and then using it more then you'll want to go with the name brands.
20 years ago, when the GeForce 2 MX appeared, I switched to 2 monitors; in the next 6 months the software department (that was the name) switched to 2 monitors by contagion. I was in the infrastructure department; they just saw the benefits. Since then, I have never worked with fewer than 2 monitors. I can productively use 3 if I have them; otherwise (and most of the time) I use 2.
I am not a good developer, it is not my job, but I started coding on a Commodore 64 in text mode, then I did Cobol and FoxPro for DOS on an 80x25 screen with no problem. But when larger monitors appeared, I used them; when the possibility to use more than one monitor appeared, I used it. It is a case of technology helping you (not making you better, but helping): I am more productive using 2 monitors than limiting myself to just one. Because of this, I use the laptop (1366x768 screen) only as a portable email tool; everything else is on a pair of 24" monitors, in the office or at home. Sometimes I pull a monitor from another desk (in the office) or another computer (at home) when I do specific work that benefits from 3 monitors, but it is not a matter of preference, just specific use cases where 3 is better than 2.
My favourite dev environment was a 7" Android 4 device running Debian. I got plenty done with an external keyboard.
I bang away at my 13" 2013 MBA these days, and the only real gripe I have is the lack of a delete & backspace key combo: I've never gotten comfortable without it.
That said, the only reason I could possibly use more screen real estate is web debugging. But to me that's more of an indictment of the environment I'm "coding" in.
The only time I ever needed two monitors was back when I was writing 3D games on a 3dfx (before Nvidia headhunted their engineers) and needed to debug something while running full screen.
While I understand this argumentation, to me monitors, their size and their number have always been pretty much... meh. Instead, what matters is the quality of the monitor itself (refresh rate, contrast, brightness).
> ... the only real gripe i have is the lack of delete & backspace keys combo
I’m not sure I follow. Are you complaining about the lack of a dedicated Delete key on Macs, having to use only one key for both Backspace (Delete key) and Delete (Fn + Delete keys)?
That personally bothers me a lot, as does the lack of home and end keys. Yes I know there are key chords to accomplish the same function, but having a dedicated key as part of your workflow makes a big difference. Maybe if I worked on my Mac 100% of the time I wouldn’t mind, but I only use a Mac about 20% of the time and it is incredibly infuriating.
Agree with other commenter who said you know what's best for you. Good job on iterating toward an optimal setup!
But I will tell you why multi-monitor setups aren't the best for me. It doesn't feel like having a nice big desk to work at; rather, it's like having a separate desk for each monitor, and I have to move from one to the other to use it. With more than one monitor, I have to move my head, or my entire body, to look between them, so that I can face straight towards whichever monitor I'm currently looking at. I've tried going back to multi-monitor setups many times, and every time I get tired of it faster due to straining my neck, eyes, elbows and shoulders with all that turning.
For me, it's one very nice monitor, with my laptop plugged in in clamshell mode (although now I leave my rMBP cracked so I can use TouchID).
I've also been using a window manager (Moom) with hotkeys to be able to set up three vertical windows on my screen. That seems to be the sweet spot for me: I can have multiple different code editors, or editor+terminal+web, or throw in email/slack/whatever into the mix. (I can also split a vertical column to two windows to achieve a 2 row x 3 column layout, and lots of other layouts, 1x1 vert/hor, 2x2, centered small/large...) I feel like I've arrived where you're at, my perfect setup!
I also still enjoy the 13" rMBP screen, although I can't get to 3 columns, and lately the keyboard hurts my wrists after extended usage. I use a Kinesis Freestyle 2 with the monitor+rMBP which has been absolutely fantastic for typing ergonomics.
I don't want that. There was a time when I used multiple monitors, but I've found that just working on a laptop works better for me. It's less distracting, and I find switching between windows to be both faster and less disorienting than turning my head to look at another monitor.
I can definitely understand other people preferring multiple monitors, but not everyone has the same preferences.
I appreciate your fervour. I am probably an exception, but I like to code on just a single 15-inch MacBook. I switch screens by pressing key combinations. I believe it's faster, and also more convenient, than moving your head around constantly.
For me, everything must be accessible by various key combos. Once I have that working, I hardly need to use the trackpad or mouse anymore.
The truth is that most people on my team work with dual or triple screens.
I have always used a single monitor and I am productive like crazy. I mainly need a code editor and terminal. Sometimes switch to a browser and back. And that's enough. More monitors doesn't automatically imply more productivity IMO. Maybe in some specific cases. You can't focus on all monitors simultaneously anyway.
Have you used a 4K display, for at least a few days? If you have, I still disagree with you, but if not, I’m going to completely ignore your opinion, because I find the difference in how pleasant it is to use a good screen just so vast. Sure, you can do things on a lousy monitor, but it’s terrible and you’ll hate it. :)
(My first laptop was a second-hand HP 6710b at 1680×1050 for 15″, and that set me to never accepting 1366×768. So my next laptop was 15″ 1920×1080, and now I use a 13″ (though I’d rather have had 15″) 3000×2000 Surface Book, and it’s great. Not everyone will be able to justify the expense of 4K or similar (though I do think that anyone that’s getting their living by it should very strongly consider it worthwhile), but I honestly believe that it would be better if laptop makers all agreed to manufacture no more 15″ laptops with 1366×768 displays, and take 1920×1080 as a minimum acceptable quality. As it is, some people understandably want a cheap laptop and although the 1920×1080 panel is not much dearer than the 1366×768 panel, you commonly just can’t buy properly cheap laptops with 1920×1080 panels.)
I'll go one step further: I used the LG 27" 5K Display for two whole years before returning to a 34" Ultrawide with a more typical DPI.
Obviously I preferred the pixel density and image quality of the high-DPI screen, but I find myself more productive on the 34" Ultrawide with a regular DPI. (FWIW, LG now has a newer pseudo-5K Ultrawide that strikes a balance between the two).
I look forward to the day that monitors of all sizes are available with high DPI, but I don't consider it a must-have upgrade just yet.
Also note that Apple made font rendering worse starting in OS X 10.14 by disabling subpixel AA. Using a regular-DPI monitor on Windows or Linux is a better experience than using the same monitor on OS X right now. If you're only comparing monitors on OS X, you're not getting the full story.
I also have 3440x1440 on my Dell monitor at home, and I love it.
My work monitor is a really nice 27" 4k LG monitor, which a coworker picked out. He's a real monitor specs nerd and made a lot of assertions like the OP. The scaling issues are endless and really bother me, and I don't notice the higher PPI at all. I much prefer the ultrawide Dell - it gives me a feeling that I don't even need to maximize my windows and I can still have lots of space.
It takes up zero desk space (one monitor arm that lets me adjust it anytime I want), and I don't need to rotate it. I've never found that a useful thing to do.
On the other hand, when you're done working it's amazing for flight simulators or other games that support the aspect ratio properly.
Not parent poster but, as somebody who works on macOS, my problem is that the switch between the high-density laptop screen and a “pixelated” external one is extremely jarring. At one point I had an additional, average Dell monitor and every time I moved my eyes I cringed. After a couple of days I just removed it - better to have fewer screens than forcing my eyes to readjust every few minutes.
So yeah, if your primary device is a modern Mac, you really want a high-res, high-density screen.
Yes, Apple never completely embraced the somewhat hacky techniques of font rendering (hinting and subpixel AA), thus their fonts always appeared less crisp.
Not really.
There is an end point.
Also, resolution is not free; it's a tradeoff. When you push more pixels onto the screen it means more work for the GPU (or the CPU in some cases), plus more loading time and space for all those high-DPI resources...
At this point I clearly prefer lower latency and higher framerate over more pixels.
Sure, to a point. But the step-up in question here (1080p to 4K, at sizes like 15–27″) has clearly visible benefits.
And sure, resolution isn’t free, but at these levels that was an argument for the hardware of eight years ago, not the hardware of today. All modern hardware can cope with at least one 4K display with perfect equanimity. Excluding games (which you can continue to run at the lower resolutions if necessary), almost no software will be measurably affected in latency or frame rate by being bumped from 1080p to 4K. Graphics memory requirements will be increased to as much as 4×, but that’s typically not a problem.
That's true, undeniably. But I think the better tradeoff still is to go up in size: IMO the optimal monitor size is 38". Big enough, but not too much head turning. Would I get a sharper 38" if possible? Sure. But I wouldn't compromise on size to gain higher DPI.
I have a laptop with a 13" 3200x1800 monitor; not quite 4k, but higher resolution than my eyes' ability to discern fine detail. My other laptop is 1366x768. The other laptop is a lot better for a wide variety of reasons (and also an order of magnitude more expensive) but the display resolution is genuinely something I don't give a crap about. It isn't terrible and I don't hate it. There's plenty of stuff I don't like about the display; narrow viewing angle, poor contrast, prone to glare- but the resolution is fine.
Having used CRT monitors, 1920x1080 displays, 4K displays and 5K displays, as well as various Retina Macbooks over many years, mostly for coding, here's my opinion:
The only good solution today is the iMac 5K. Yes, 5K makes all the difference — it lets me comfortably fit three columns of code instead of two in my full-screen Emacs, and that's a huge improvement.
4K monitors are usable, but annoying, the scaling is just never right and fonts are blurry.
Built-in retina screens on macbooks are great, but they are small. And also, only two columns of code, not three.
One thing I noticed is that as I a) become older, b) work on progressively more complex software, I do need to hold more information on my screen(s). Those three columns of code? I often wish for four: ClojureScript code on the frontend, API event processing, domain code, database code. Being older does matter, too, because short-term memory becomes worse and it's better to have things on screen at the same time rather than switch contexts. I'm having hopes for 6K and 8K monitors, once they cost less than an arm and a leg.
So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
> So no, I don't think you can develop using "tiny laptops with poor 1366x768 displays". At least not all kinds of software, and not everyone can.
This opinion seems bizarre to me. You start by offering personal (and valid) anecdote, then end up saying "I don't think you can develop [...]". But this flies in the face of evidence. Most people by far do not use your preferred monitor setup (iMac 5K) and in my country a vast number of developers use 1366x768 to develop all sorts of high quality software.
It's one thing to say "as I grow older, I find I prefer $SETUP". No-one can argue with that, it's your opinion (and it might very well become mine as I grow... um, older than I already am!). It's an entirely different thing to claim, as you do here and I think TFA does in similar terms, "you cannot prefer lower tech setups", "you cannot develop software this way", "it's very difficult to develop software without $SETUP". The latter is demonstrably false! I've seen it done, again and again, by people who were masters at their craft.
I don't think they were doubting that somebody does develop in those random setups, they were disagreeing with the people that say it doesn't matter and you can code anywhere. In your quote, a royal you.
But they are not random setups. They are extremely common setups in my part of the world. People -- who are pretty good at what they do -- can and do develop using these tiny screens. In this regard, "it doesn't matter". Or taking less literally, they wouldn't complain if they got a better monitor, but it's not the primary concern for them. So taking a cue from TFA's title: "no, it's not time to upgrade your monitor".
Following your logic, I can't claim anything, because there is always someone, somewhere, who will come up with a contrary opinion.
I do respect your opinion, but I still hold to mine. I also think this discussion won't lead anywhere, because we are glossing over the terms. Not every "developer" is the same, not every "software" is of the same complexity. I can fix CSS in a single 80x25 terminal, I can't do that when thinking about changes in an ERP system.
Note I do not dispute your opinion. You're entitled to it and you know what works for you.
Regrettably, following (my) logic does mean you cannot say "you [the generic you] cannot develop like this", because this is indeed easily disproven. People can and do (and sometimes even prefer to). That's the problem with generalizing from your personal opinion ("I don't like this") to the general ("people cannot like this").
Ya, the iMac 27" 5K is ideal. I got an LG 28" 4K just to interface with an MBP... it was a painful compromise, but standalone 5K monitors are just too expensive right now.
When we are talking pixel count, we have to talk about the size of the display also. A 28" 4K is acceptable; a 40"+ 4K is best used as a TV.
The best display I use right now is a 11" iPad Pro with a refresh rate at 120 Hz. You really can feel it, especially for stylus work.
Atwood is a bit of a prima donna, and part of being a blogger is to make big statements.
One of my roles for a long time was speccing/procuring computer equipment (later overseeing the same) for a huge diverse organization. I took feedback, complaints from users and vendors, etc. People are passionate about this stuff... I was physically threatened once, and had people bake cookies and cakes multiple times as a thank you for different things.
The only monitor complaints I recall getting in quantity were: brightness, I need 2, and I need a tilt/swivel mount. Never heard about resolution, etc. Print and graphic artists would ask for color-calibrated displays. Nobody ever asked for 3, and when we started phasing in 1920x1080 displays, we literally had zero upgrade requests from the older panels.
You don't know you need a good display until you have used one. It's like glasses: you think you have perfect vision, and then you get glasses and it's night and day in comparison. You don't know a good display until you have seen one. The thing with high-res monitors is that you should scale up, or everything will look tiny.
IMO, there are two key criteria for monitors: real estate and pixel density. In some cases, you can't get both affordably.
I have had 15" laptops with 4K displays for some time now. I love the pixel density. But I can't do certain types of tasks on them because I end up getting real estate anxiety. I feel so constricted on a laptop screen, even when I add a second monitor.
My desktop has 2 x 27" 4K monitors running without scaling. So I have plenty of real estate, but the text could look nicer. Having said that, I don't miss sharp text in the same way that I miss real estate when using my laptops, at least from a productivity perspective.
I don't think a 5K screen is an answer for me, because my first urge would be to try to use it without scaling for more real estate.
On the other hand, a pair of 27" 8K screens (does such a beast even exist, and is it affordable?) would be ideal, because 1:1 scaling on such a beast is impossible at that monitor size, but 200% scaling would basically give me the same workspace real estate that I have now but with super sharp text.
A long time ago, when resolutions got better and we got more pixels, we made the characters smaller. Now we don't make them smaller, we just make them more crisp. What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?
> What changed? Why is the transition from 110 PPI to 220 PPI being handled differently than the one from 50 PPI to 110 PPI?
Well that is a user setting, and I tend to think the first retina iPhone is what started the trend.
Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.
My 27" 4K screen is pretty much at my personal limit in terms of what I can handle at 1:1 scaling for my normal viewing distance. Had I gotten a 24" 4K screen, I'd have to set the text scaling to ~125% which would also translate into some lost screen real estate.
> Also, at certain screen sizes, 1:1 scaling is just impractical due to limitations of the human eye.
Scaling isn't even relevant here. It makes sense to talk about scaling pixel artifacts, but not vector ones. The font is 8pt or 20pt; you can configure that on your own. 1:1 scaling is just some arbitrary, artificial fixed pixel/letter size.
A 24" 4K screen (basically a 24" iMac) had text the same size as a pre-retina 24" iMac; they just used more pixels to render that text.
Windows 8+ does; Windows 10 has no problem with resolution independence. Some legacy apps that run on Windows break, but nothing modern like Visual Studio, Chrome, etc.
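For what it's worth, modern desktop apps opt into this explicitly; roughly, the per-monitor DPI awareness declaration lives in the application manifest. A sketch (apps without it get bitmap-stretched by the OS, which is the blurriness people complain about):

    <!-- app.manifest (sketch): opt a desktop app into per-monitor DPI awareness -->
    <assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
      <application xmlns="urn:schemas-microsoft-com:asm.v3">
        <windowsSettings>
          <dpiAwareness xmlns="http://schemas.microsoft.com/SMI/2016/WindowsSettings">PerMonitorV2</dpiAwareness>
        </windowsSettings>
      </application>
    </assembly>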
Windows has issues with HiDPI once you have multiple screens with different densities, and it's not just with legacy apps.
In any case, Windows is just using a scaling feature. But when you scale, you are losing workspace in exchange for sharpness.
When 4 physical pixels translates to 1 logical pixel, you've actually lost 3 pixels of potential real estate on a screen size where your eyes can actually discern those pixels.
On a 15" 4K monitor, losing that real estate is not a big deal, because >90% of people can't easily read or see anything scaled at 1:1 on a screen that size. When you scale that screen at 200%, you're basically using a 1080p screen with sharper text. You don't gain any real estate at all from the higher resolution.
On a 40" 4K screen, it's a whole other story. The text may not be sharp, but you can have way more windows open on that screen, which makes it easier to multitask. It's like having a grid of 2x2 20" 1080p screens.
Addendum: my visual limit is 1080p at 1:1 on a ~14" screen, which is why I am fine with no scaling on my 4K 27" screens.
If you use an OS that handles high DPI very well, and where 98% of all apps handle it just as well (including resolution scaling), it is an absolute joy to my eyes to be able to use 4K at the same effective resolution (so basically 1080p, but double the resolution, double the sharpness).
Every time I use a 'non-retina' type of display (like an old laptop I use for testing or my 19" 1080p monitor), it feels like I'm looking through some dirty glass because of the blotchy effect.
I tried 4K on one linux environment (and documented the experience[1]), and according to numerous responses, my situation was not unique: if you try 4K on any Linux environment, and don't enjoy everything being tiny, then it's not a fun time trying to get everything to behave like you can with Apple's built in resolution scaling options.
I certainly agree that some like them, while others don't.
And some grow to like them - I used to scoff... before I tried it for myself.
I think as well, it depends on your eyesight, and also your workflows and apps. I use Rider (and previously Visual Studio), and these have a lot of panes/windows - it's really nice to be able to have the code front and centre, with debug output, logs and a command prompt in another screen, for example. Another example would be to have zoom/webex in one screen, while I'm taking markdown notes in another.
A good while back, I moved to a dual monitor setup at home (1x 3k in landscape, 1x 1200p in portrait), and a triple monitor setup at the office (1x 1080p in landscape, 2x 1200p in portrait). I also use a Dell screen manager, so I can easily setup regions that windows can snap to - for example, the portrait displays are usually split in 2 horizontally.
The triple monitor setup is admittedly gratuitous, but I'm never going back from a dual setup - it's just so convenient to have everything I need always there, instead of constantly flipping back and forth through the many windows I invariably have open. It feels like I'm context switching much less.
No problem. Bear in mind I'm from Latin America, so we're usually several years behind the US (i.e. older tech, sold at excessive prices).
My work laptop is a Dell Latitude 7480. Its display resolution is 1366x768. I normally use it with an external monitor, because the screen is not only small but also low quality. This... fine... piece of hardware was bought by my current employer, I think 2 years ago.
In all workplaces I've seen, either you get a MacBook (with its high-quality display) or an entry-level laptop from Dell/HP/Lenovo (or some no-name brand) with a low-quality 1366x768 display, which is what passes for entry level where I live. In many companies, MacBooks are usually given to people who can explain why they need them, or to reward good performance. However, newer startups seem to default to MacBooks.
Put yourself in the mindset of a typography geek. Then, by default, you will care about almost all of these things. I'm not saying you should care about all of these things all or most of the time, but that's the correct mindset to put you in sync with the author's conclusions (mostly -- I don't really think 120Hz is that big a deal).
120 Hz is a huge deal. It makes every mouse move better, and that alone is worth it. It reduces the pain of using poorly designed software that adds extra frames of latency to everything. It makes it easier to track and read moving/scrolling content, and it reduces judder when playing low frame rate content (e.g. 24 Hz Hollywood movies).
I'm surprised the author doesn't mention backlight strobing, which is another huge upgrade. Kind of hard to describe to people who have never seen it though, as is the case with most of these improvements.
The second and third monitors recommended in the article have 4ms of latency, so there's essentially no reason to buy them.
There are good 4k 120hz monitors with strobing for way less than the $2000 mentioned. E.g. the zisworks kit for Samsung U28H750 (first available already in 2017): http://www.zisworks.com/shop.html
IMO 120 Hz monitors are overrated for pretty much everything except for FPS games (and similar fast-paced interactive titles).
I have a 144 Hz display and I would not want to go back to a 60 Hz display for gaming. At some point, my monitor changed to a 60 Hz refresh rate without telling me after some driver update. The first time I played Overwatch after that happened I could immediately tell that something was wrong, since the game just didn't feel as responsive and smooth as it usually does.
Another OW player in HN? :)
I feel the same about the 144hz monitor I have, only really worth it for playing OW, the rest of the time is nice but I feel like I'm not getting enough out of it.
Maybe next monitor I get will be 4k, but not sure for now.
After starting with 144 Hz monitor the difference to 60 Hz already with desktop mouse movements is so clear that it's immediately obvious if the refresh rate changed.
I get where he's coming from, but then again, that's not me. The cheapest monitor he recommends costs as much as my entire setup would cost in its current, used, state. I'm not gonna buy a single monitor for that price - and if i would, i wouldn't downscale it to the resolution [edit: real estate] i had before!
I've worked on a range of setups from dual screens, 120hz to little CRTs and laptop displays, and even a bit of graphing calculator and smartphone coding. My conclusion is that the difference depends mostly on the workflow, and what the workflow changes is mostly how much information you're scanning.
If you're scanning tons of text, all the time, across multiple windows, you need a lot of real estate, and in times of yore you would turn to printouts and spread them over your desk. A lot of modern dev goes in this direction because you're constantly looking up docs and the syntax is shaped towards "vertical with occasional wide lines and nesting" - an inefficient use of the screen area. Or you have multiple terminals running and want to see a lot of output from each. Or you have a mix of a graphical app, text views and docs. A second monitor absolutely does boost productivity for all these scenarios since it gives you more spatial spread and reduces access to a small headturn or a glance.
If you're writing dense code, with few external dependencies, you can have a single screen up and see pretty much everything you need to see. Embedded dev, short scripts and algorithms are more along these lines. Coding with only TTS(e.g. total blindness) reduces the amount you can scan to the rate of speech, and I believe that consideration should be favored in the interest of an inclusive coding environment. But I'm digressing a bit.
For a more objective ranking of concerns: pixel density ranks lower than refresh rates, brightness and color reproduction in my book. If the screen is lowish res with a decent pixel font that presents cleanly at a scale comfortable to the eye, and there's no "jank" in the interaction, it's lower stress overall than having smooth characters rendered in a choppy way with TN panel inverted colors, CRT flicker, or inappropriate scaling.
Book quality typography is mostly interesting for generating a complete aesthetic, while when doing computing tasks you are mostly concerned about the symbolic clarity, which for English text is reasonably achieved at the 9x14 monospace of CP437, and comfortably so if you double that. There's a reason why developers have voted en-masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.
And yet for some reason we have ended up with tools that blithely ignore the use-case and apply a generic text rendering method that supports all kinds of styling, which of course makes it slow.
> There's a reason why developers have voted en-masse to avoid proportional fonts, and it's not because we are nostalgic for typewriters.
This is sad in itself. Proportional fonts are vastly superior to typewriter fonts for presenting code. The kerning makes for better readability, even if it comes at the expense of the ability to do ASCII art.
I use a 55" curved 4k TV as my monitor. It's not pixel density that's important for reading text, but contrast. The best thing for coding is a high contrast bitmapped font with no smoothing.
My monitor is still available at Walmart for $500. It's like an actual desktop when you can spread out all your stuff.
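If you happen to be on Linux with fontconfig (an assumption; other platforms have their own switches), forcing that no-smoothing look per font is only a few lines. A sketch, dropped into ~/.config/fontconfig/fonts.conf, with Terminus standing in for whatever bitmap font you actually use:

    <?xml version="1.0"?>
    <!DOCTYPE fontconfig SYSTEM "fonts.dtd">
    <fontconfig>
      <!-- turn off antialiasing for one family; "Terminus" is just an example -->
      <match target="font">
        <test name="family"><string>Terminus</string></test>
        <edit name="antialias" mode="assign"><bool>false</bool></edit>
      </match>
    </fontconfig>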
> Suboptimal...No. Just no. The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.
I've realized that a stable environment does wonders for productivity. What you got used to (stable) is path-dependent and personality-specific. But it doesn't matter. What matters is to train yourself to sit down and work. If you're opening that laptop lid and thinking about visiting Hacker News first, shut it down, get up, walk away from the desk and then retry. At this point, I feel like doing a "productivity, productivity" chant / rant a la Steve Ballmer's "developers, developers" chant:
In addition to what thomaslord said [ https://news.ycombinator.com/item?id=23554382 ], 4K monitors tend to be widescreen. 1280x1024 user here; you'll take my aspect ratio when you pry it from my cold, dead hands.
I also found that 16:10 is a better ratio than 16:9 when working with code.
I wish there would be more 1920x1200 or 2560x1600 displays...
Lacking those options at reasonable prices leaves me with the option of using rotated 2K monitors, though 27" ones are uncomfortably tall and 24-25" ones are a little too high in pixel density.
Huawei's Matebook's 3K (3000x2000) display with its 3:2 (1.5) ratio at almost 14" size is an amazing sweet spot though.
I wish they would have it in 28" size for desktop use :)
In the end, I've been working on 27" 2K and 5K iMacs for about a decade now and I see it as a good compromise, despite the aspect ratio. But I quite like my 30" HP ZR30w with its 2560x1600 resolution; unfortunately its backlight is getting old and uneven :(
Speaking of 3:2 ratio, we have to mention the Microsoft Surface line of tablets, laptops and convertibles.
Definitely overpriced for a developer machine, but there is also the Surface Studio 2, with its 4500 x 3000, 28 inch, 192 DPI display, 192 being exactly 2x the "standard" 96 DPI.
Ah, and one solution to the font sharpness issue is to use bitmap fonts, indeed.
I haven't found good ones, though, that are available at sizes of 16px or above, which are necessary for 2K or 4K displays.
Also, they are typically quite wide and don't have narrow options.
I have a 1920x1200 at home, 16:10 instead of the usual 16:9. I feel the same way, I won't be upgrading that monitor until it dies. That little bit of extra vertical space is so welcome.
Nope, bad aspect ratio. Let me know if you find a 1920x1536, though. (1600x1280 would also be nice, and a 2560x2048 would be a 4K monitor that's actually useful, although thomaslord's objections still apply.)
Edit: actually, wouldn't 1440x1280 (aspect ratio 1.125, which is even less than my usual 1.25) be a better use of such an LCD panel, rather than an obviously-useless 1280x1440 (with an aspect ratio less than one)?
Yeah, pretty much. Also, standard video aspect ratios like 1280x720 fit the full width of the monitor with space above and below for toolbars and video player UI, which is... basically impossible to convey how frustrating it is to watch video on a widescreen monitor when you've tried a proper one for comparison. Any kind of fullscreen (ie maximized window) games or multimedia is similar.
I understand the bit about a video players, but you're not going to want to play a game in anything except exclusive fullscreen due to the massive input latency incurred when using windowed or borderless windowed.
Depends on the games you're playing, honestly. That "massive" latency is about 30ms from what I can find, which you're not really going to notice when playing a game like civilization or hearthstone.
We could still be programming at 640x480 like we did back in the early 90s. Thankfully, technology marched on and we went to 1024x768 and then even better resolutions. One would think that technology would march forward until we had ink on paper-resolution displays, because why not? But of course, we could still make do with 640x480 like we did way back when.
The bigger issue is that, like we've been doing for the last 40 years, a cutting edge 2X display should eventually become the new 1X display, while the former 1X's become obsolete. This makes building software a bit easier, even with resolution independence, it is difficult supporting more than 2 generations of display technologies (e.g. right now we have 100 PPI, 150 PPI, 220 PPI, even 300+ PPI at the bleeding edge).
I get what you're saying. In general, though, I disagree with the notion that newer tech makes old tech automatically -- always -- not good enough. It's the mentality that makes a lot of people rush to buy the latest Kindle (or mobile phone, or whatever) when the one they own does everything they need.
More importantly, it's one thing to say "I prefer this new tech/resolution/gadget" and another to claim "how can you work like this?" (Where "this" can be "without a mac", "without a mechanical keyboard", "without a 4K monitor", "without three 4K monitors", etc).
Software is developed successfully without any of those. It's not only not an impediment, it's not like a sort of martyrdom either. So no, it's not time to upgrade that monitor.
Someone else in this thread commented that for many devs, their gadgets are a proxy for actual skill. It's easier to show you have a 5K monitor "like all hackers should" than to actually be a good developer.
I just like 200+ PPI displays (24" 4K is the same pixel density as 27" 5K) because I have to spend a lot of time reading text and like not seeing pixels and/or anti-aliasing artifacts. Once you get used to it, going back is like going back from an iPhone 4 to some Windows Mobile 6.1 phone.
It should also be cheap enough now, and at some point fabbing a 200+ PPI display will cost the same as an older one, it won't make much sense to keep making the lower res displays anymore (like it doesn't make sense to fab 32MB DIMMs anymore).
Don't get me wrong, I agree when the tech gets better, with none or few of the downsides others mentioned, going forward is the only direction. I won't actively shop for an older tech once the current one gives up the ghost.
I just disagree that it's "time to upgrade your monitor", or that the current tech is bad or makes it hard to be productive.
After trying virtually every display setup imaginable over the years, I've settled on a single, large, high-resolution display as the optimal setup. While I have written a lot of code on small, low-resolution displays, there are definitely disadvantages to that.
I don't understand the value of multiple displays for code work. Just about everything that I could do with multiple displays can be more ergonomically achieved with multiple logical desktops. I can swap a desktop with a keystroke rather than pivoting, and that workflow translates well from laptops to large workstations.
I use an over-the-top amount of monitors to... monitor. The purpose is to be able to look at dynamically updating information without touching my mouse or keyboard. I have found myself making an omelette while looking at a large amount of displays in the distance, ready to panic-leap into action.
When I code I notice a few things about the multimonitor setup. My cursor can travel 10 physical feet of screen space and it's literally inefficient to point. The ergonomics are terrible. The entire setup basically feeds inattention and an ever-sprawling amount of crap being opened. More than that I think having multiple large rectangles shooting light into your eyeballs kind of screws with your circadian rhythms.
When I code I use a laptop and enjoy how briefly my hands leave the keyboard before returning, because what little pointing I'm doing can be done extremely quickly because of the small size of my workflow. "Bigger is better" is oversold - a workspace should match the work - too small is just more common than too big.
It's not meant as an argument, but as an illustration: people who I respect more than the author of TFA, and whose skill and accomplishments I've witnessed first hand, weren't eager to upgrade from tiny screens. It's not that they would've refused a better screen if given one, it's just that they didn't ask for it and it wasn't a pressing issue. Indeed, for them it was not "time to upgrade the monitor".
It taught me the valuable lesson that creating software, to them, was mostly about thinking, not about the gadgets you use to write it down.
Like another commenter said, it seems gadgets work as proxies for talent for some developers -- "you should use a mechanical keyboard and a 5K three-monitor setup, like real hackers do" -- because actual talent is harder to gain and demonstrate. A sort of cargo culting by accumulating tech gadgets.
Why are you talking about screen size when the article is talking about pixel density? You can have a lot of pixels but low pixel density because your screen is huge, while my first 200+ PPI display was on an iPhone 4, my watch has a 200+ PPI display as well. I don't think many would accept phones or watches with lower pixel densities these days, but somehow the display we use for work is different?
You are right that there is a cargo culting going on, but it is going on both sides. There are a lot of strawmen out there as well.
You are still conflating display size with pixel density, so I’m not sure what you are arguing against. You talked about in other post about how you had to scale your displays because your OS didn’t support resolution independence, is that what you mean by tiny?
I'm really baffled by your comment. I never said that about scaling or my OS not supporting resolution independence. I think you're confusing me with someone else, which might explain the disconnect between your posts and mine. (It's understandable because my initial post seems to have gathered a ton of responses, so it's easy to get confused about who said what. Probably my most successful (?) comment in all my years here.)
I have a hard time using three monitors effectively, so that in the end they are distracting for me. Probably you need some particular personality tics to make proper use of them.
1. Main screen. For the code editor, intense web browsing, etc.
2. Secondary screen. For debugging visuals (since I work on web stuff, it usually hosts a Chromium window; for a mobile dev, I imagine it would be an emulator/simulator), documentation referencing (with the code editor open on the main screen), etc.
3. Third screen. For all comms-related things: MS Teams/Outlook/Discord/etc.
I didn't mention terminal, because I prefer a quake-style sliding terminal. For a lot of devs, I imagine that having a terminal on their secondary screen permanently would work great as well.
P.S. Not that long ago, I realized that the physical positioning of monitors matters a lot (to me, at least) as well. I used to have 2 of them in landscape orientation side-by-side and one in portrait orientation to the side. It was fine, but didn't feel cohesive, and I definitely felt some unease. Finally got a tall mounting pole, and now I have the landscape-oriented monitors one on top of the other instead of side-by-side (with the rest of the setup being the same). That was a noticeable improvement to me, as it felt like things finally clicked perfectly in my head.
To clarify, my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup, otherwise you're doing something wrong. Of course I understand for many devs this setup works, in which case more power to them!
I just dislike being told unless I follow these fads I'm a subpar developer. I don't own a mechanical keyboard or a three monitor setup. I don't own a 4K monitor (I suppose I eventually will, when they become the norm). When Apple came up with retina displays, I didn't feel I had magically become unable to write code because my display at the time was 1440x900.
> my beef is with Atwood's opinion (and similar opinions) that you must use a three monitor setup
It's weird to me to specify the number of monitors given how they come in a vast range of shapes and sizes.
For example, my dream setup used to be a single 55" curved 8K monitor. That's the rough equivalent of a 2x2 grid of 27" 4K monitors (I currently have two 27" 4K side by side in landscape @ 1:1 scaling).
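For anyone who wants to check that equivalence, the arithmetic is simple; here's a minimal Python sketch using only the sizes mentioned above (8K being exactly a 2x2 grid of 4K in pixel terms):

    from math import hypot

    # Pixel density = diagonal pixel count / diagonal size in inches.
    def ppi(width_px, height_px, diagonal_inches):
        return hypot(width_px, height_px) / diagonal_inches

    print('55" 8K:', round(ppi(7680, 4320, 55)), "PPI")  # ~160 PPI
    print('27" 4K:', round(ppi(3840, 2160, 27)), "PPI")  # ~163 PPI

The densities land within a couple of PPI of each other, which is why the "2x2 grid" comparison works.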
The only problem with my so-called dream set up though is I don't think my computer glasses, which are sharp only for surfaces between a range of 21" to 27" would allow me to see everything sharp from corner to corner on that monitor, which sucks.
As a general rule I like to have a computer that is sort of the average crap a person who doesn't care that much about computers might have around, so that if the stuff I make works on that, I know it's going to work on the upscale stuff as well.
Also then when one of my disastrous kids destroys it I don't feel bad.
I used to use this exact setup, but specifically eliminated monitor #3 as I felt it was counterproductive to have an "always on" comms monitor. These days my main monitor has one workspace, while my secondary has the normal secondary stuff in one workspace, and comms in another.
I found it to be less distracting, and the two screens are more physically manageable and easier to replicate if I change settings (cheaper too!). The only thing I will change is whether I'm in landscape/landscape or landscape/portrait. I can never make up my mind about what I prefer.
This is my exact layout too! Though the screen with the code editor is ultrawide, so with window tiling I have the editor and the terminal side by side.
X230 at 1366x768 checking in, can confirm, no problem at all.
I'm primarily a C programmer. 1366x768 limits my functions to about 35 lines at 100 columns wide. My font is Terminus, sized at 14.
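Rough arithmetic behind that figure, as a sketch; the cell size, line spacing, and window-chrome numbers below are assumptions for illustration, not exact settings:

    # Back-of-the-envelope for "~35 lines at 100 columns" on a 1366x768 panel.
    width, height = 1366, 768
    cell_w, cell_h = 8, 14   # assumed Terminus-14 glyph cell, in pixels
    line_spacing = 4         # assumed extra per-line padding in the editor
    chrome = 100             # assumed pixels lost to menu/status/tab bars

    print(width // cell_w, "columns")                              # ~170 columns
    print((height - chrome) // (cell_h + line_spacing), "lines")   # ~37 lines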
I thank my x230 for enforcing me into a paradigm where functions are short and to the point. I watch my co-workers flip their monitors vertically and write code that goes on for days. While it's not my style, I'm thankful I've been forced to make do of limited space, and it has trained me well.
My laptop has a 4K screen, but I switch to an external 27" QHD (2560x1440) display when I'm at my desk and boy ... I used to like that external monitor. Now all I can see, all I notice, is how much worse and more pixelated text looks on it than when I'm reading on the laptop's screen. It practically looks blurry!
So yeah, nobody needs a 4K display. But I would think most people don't know what they're missing. (:
I think lots of fonts actually look worse on hidpi screens, especially on Windows, probably because these fonts were specifically designed for low resolution screens. And most applications use hard-coded fonts that you can't change. Text is maybe a little easier to read on hidpi screens, but it is less aesthetically pleasing.
Linux can be made to look good in that respect, but the situation for scaling legacy applications is pretty bad.
Yes, this. Even if you have only one hidpi screen, Windows still looks bad, because the fonts look terrible when rendered on a hidpi screen. This is one thing keeping me on "lodpi" screens for now.
Acclimation is a huge factor: if you're used to writing on a small laptop, that will seem normal, but few people won't see a benefit when moving to a larger display, even if they didn't expect one. That's one of the few durable research findings over the decades.
Multiple monitors are slightly different: the physical gap means it’s not a seamless switch and not every task benefits.
I agree, and not. I like having many windows side-by-side (code, docs, thing I'm developing), but I also like having more vertical space. I keep one of my monitors vertical for coding, although that's a little too tall so there's some lost space at the top and bottom (9:12 would be better for me).
Really I want a single plus-shaped monitor that can act in several display modes:
- mimic 3 monitors. Left and right are 4:3 (ideal for single application), center is 9:12 and larger. Shape: -|-
- Single vertical monitor. Same as above but with left and right "virtual monitors" turned off for reduced eye strain. Shape: |
- Single ultrawide horizontal monitor (connect the left and right parts with the strip in the middle, turn off unused pixels at the top and bottom). Shape: ---
Yup, this kind of stuff is very much based on preference. I think some people have tighter comfort deadbands and others are tolerant of anything. These can be the same person, on different days / differing amounts of sleep / different tasks / different lunches / different music.
E.g. I have a three-monitor setup: two 1080p screens and a central 1440p screen. Mechanical Cherry Blue keyboard and a nice gaming mouse. Big desk. One of those anti-RSI boob mousemats. Nice headphones. Beefy computer. Ergonomic office chair. Everything at the right height. Quiet room. Next to an open window, but which is in shade.
Some days I will happily chug away writing reams of code.
Other days I will get nothing done and every minor annoyance will bring me out of focus; background sounds, glitches in my desktop environment, almost inconceivable hitches in computer performance, my chair feels wrong, the contrast of all text is too high, it's too dark in here, it's too bright, this is all bullshit, I'm so done with computers...
And on other days I'll get even more done, hunched like a Goblin in my living room armchair with my legs tucked beneath me, working through to the early morning, on my tiny 13 inch dell laptop, vaguely aware that I have searing pain* through one wrist, having not eaten for 12 hours, but completely and utterly content and in flow writing code or a paper.
Humans are crazy fickle.
There's no Platonic Ideals when it comes to Crushing It(tm) as a software developer.
I think learning the art of mindfulness and meditation does more to help you focus than expensive equipment fads and micro-optimizing your environment.
* Before anyone mentions it, yeah, I'm aware RSI is Not Good. I've got a bunch of various remedies for it. Ironically I've found that bouldering helps with it the most. Just not able to do that during lockdown :(
By 'like them' are you talking about the full set of tradeoffs, or the raw visuals? Because I'd be very surprised if someone actually disliked how an all-else-equal 4K screen looks, compared to 1080.
Yes, the full set of tradeoffs. My comment is unclear. If someone gave me a 4K monitor for free I'd use it, but not having one is not an impediment for me (or most devs I know).
At least for me - I had a cheap 4k monitor. Managing the size of fonts/windows on it vs the connected laptop was a pita. Not to mention the display/graphics card overloading and glitching once in a while trying to power everything.
So I am still stuck with my regular HD monitor + laptop monitor - which is pretty good.
I agree with you, 1080p on a 23-24" screen is great for programming. I have an Acer Predator X34P at home (34" ultrawide 1440p @ 120hz), and for games it is AMAZING. But for programming? I prefer 2-3x 1080p at around 23-24".
In the last 3 months I went from 1 screen to 2 to 3 as I progressed through particular stages of development. When we 'go live' I'll easily be using all 3, but I don't _really_ need them, it's a nice to have.
Yes, you're completely right. I remember when the Atwood article came out, and I was convinced I needed multiple monitors to be productive (I was quite junior) - it even caused me stress when I worked at places that didn't have multiple monitors. These days I'm pretty happy with a single screen - including feeling very productive on just a laptop screen.
It turns out the whole idea of needing tons of desktop space was vastly exaggerated. You can't focus on 3 screens at once, it is impossible. Even on a large screen you can only look at one place at a time. Programmers are focused on one thing for a long period of time anyways - your code editor. Maybe a terminal as well that you need to swap to occasionally. Having multiple monitors has limited benefits.
Where multiple monitors is helpful is if you're an operator - you are watching many dashboards at once to observe if things break or change. Traders for example will have workstations with 9 screens at once. But this makes sense - they are not focusing on one thing at a time, they need to quickly look at multiple things for anything unusual.
I've personally found that using a tiling WM gives me vastly more benefits than multiple monitors ever would. Although I was also pretty happy using Gnome's super-tab/super functionality during my non-tiling days.
4K is sort of a nice to have, but hardly a requirement. I'm as happy on a normal DPI screen as I am on a retina screen. Our brains adapt very quickly and unless the text is overly blurry (because of poor hinting implementations), a regular DPI screen is perfectly fine for coding on. My own personal preference is high refresh rates over anything else - I can't stand latency.
We used to write great code on the original Mac's 512 by 342 screen; that doesn't mean we were very productive at it.
A 5K monitor allows you to view an enormous set of workspaces and code, sharply. It will cost you less than $300 a year, which for any decent developer is less than 1/2 of 1% of your annual revenue, i.e. anything above roughly $60,000 a year (and it's tax-deductible for contractors!).
There is no way the increase on productivity doesn’t far outweigh that cost.
What do you mean "we weren't very productive"? People were plenty productive with the original Mac.
I was plenty productive on PC XT clone with a tiny monochrome CRT monitor and an Hercules video card. (I wanted to say I was plenty productive with my C64, but that'd be stretching it :P )
What I want to fight is this notion that once tech N+1 arrives, then tech N must have been unproductive. That's demonstrably false, a form of magical thinking, and not a healthy mindset in my opinion.
But it's true. Developing modern code on an Apple 2 is unproductive. It's suboptimal, and worse. Maybe you don't want to use the word "unproductive," and that's fine. But larger screens, faster computers, and more expressive programming languages all contribute to increased productivity over the older alternatives.
> Developing modern code on an Apple 2 is unproductive
Are we talking about developing modern code on an Apple 2 now? If so I misunderstood the parent post, because they were using the past tense.
In any case, consider this: how do you feel about your current suboptimal, unproductive hardware & software of choice right now? Your current programming language and monitor resolution cannot be used to code. It's simply not possible to be productive with them.
Think I'm exaggerating? Ok. Fast forward a few years and let's talk about this again. Maybe you'll understand my frustration...
No, I mean, it can. It's just going to (probably) be less productive than what exists in the future. All I'm saying is that the productive/unproductive dichotomy isn't useful and we should instead talk about productivity as a continuum.
> All I'm saying is that the productive/unproductive dichotomy isn't useful
Then we agree! I was pushing back against the notion that one cannot be productive unless using a { 5K monitor; three-monitor setup; mechanical keyboard; $YOUR_FAVORITE_TECH_GADGET HERE }. In other words: it's not "time to upgrade your monitor" ;)
I agree better tools have the potential to make us more productive. I also think some nerd/hacker types tend to overrate the importance of their tools and gadgets, because this stuff serves as a proxy for actual talent and it's easier to show off and compare (These are not my words, someone else said it elsewhere in this thread, but once I read it, it really clicked with me).
It's one thing to say "I'm more productive with $GADGET". It's another, different thing to say "we weren't productive before $GADGET". I assure you some of the old hackers, back in the days when they coded on a PDP-11 (or an Apple II, or whatever), were more productive than me and -- without knowing you -- I'm willing to bet they could be more productive than you. In my opinion the mindset that old tech wasn't "good enough", which leads people to claim "we weren't productive before" in absolute terms, is really pernicious and unhealthy.
Mind you, I'm not saying all gadgets are equal. Faster CPUs to me are obviously more useful than a larger screen. Both are nice, but one is more important than the other.
However, the most important peripheral, the one that makes the most impact on productivity: my brain. And I can't show it off as easily as a brand new macbook or whatever.
The cost of buying a new high end laptop is trivial in relation to your work value.
I used to run a dev group of 40 people. Every developer got fastest possible pc with big screen monitor back when both were expensive, and their own office with a door. Added about 1% to the cost of each developer, but the benefit to productivity was at least 10-20 times that.
And that's not even counting the benefits of reduced turnover.
An Intel iGPU can run dual 4K displays just fine, if it has dual channel RAM.
I'm still upset at Lenovo thinking 16 GB should come in one RAM stick.
If you have a Thunderbolt USB-C port then you have a DisplayPort. Some monitors accept it directly and will act as a charger and docking station. Others need a fairly inexpensive USB-C to DP adapter.
I actually like monospace pixel fonts. I doubt it would make a difference. I have a (cheap) 4k display at home. Some things might look better, but I would recommend using one with a high refresh rate and good contrast.
Since I started using ebook readers, reading on a monitor always feels like a compromise in comparison.
Wouldn't want to go lower than 1080 though. It can work of course, but it just doesn't have to be that way.
For me it all comes down to display size and density, and therefore out of all the monitors I've tried, I'd go 27" at either 5k or 2.5k resolution.
For me 27" is the perfect display size:
- Any bigger and I find I have to move my head too much to be comfortable in the long term, and I get a stiff neck.
- Any smaller and I can't fit enough on the screen at once and my productivity decreases.
2.5k (2560x1440) on a 27" monitor has pixels at a good size that most apps scale by default beautifully on.
5k is exactly double the density of the already perfect 2.5k, so (at least on macOS) everything displays exactly as it does with 2.5k, except really crisp!
Therefore, I now run the 27" LG 5k as my primary screen, and my old 27" 2.5k vertically on the side for all those status-like things that are good to see but don't require my focus. Also, since the size and "looks like" resolution are the same, I can drag windows between them without any changes in size/scale. It just works!
I don’t mean to pick on you personally, but what a stupid argument that is.
How is more pixels and more legible text not strictly better than lower resolution monitors?
It's like preferring 10 DPI mice to the modern 1000+ DPI ones, saying that PCIe x4 was better than x8, or actively disliking a modern DSL connection compared to 56k dialup.
Yours is just an idiotic stance to feel superior to everybody else.
I built the majority of the software for my first startup on a tiny under-powered netbook, since I couldn't afford a more expensive laptop. Today I have a very wide monitor for work. Agree that the display size doesn't seem to matter that much, but you do have to learn how to take advantage of features like multiple desktops, etc.
I used to have #1 with two large-ish monitors and I hated it. Two monitors are effectively a single 2xW monitor and that means moving your head from side to side constantly. Having them X inches above your keyboard means moving your head up and down (I touch type but need to look at the keyboard occasionally).
A laptop means a hi-res screen and a keyboard in close proximity. Instead of moving your head around you can use Spaces.
The second laptop can act as a second screen if you need it — as well as a coffee shop runner and a backup if your main machine is down or getting serviced.
It’s the same cost or less than a powerful laptop and two external monitors, assuming you are buying hi-res.
Another conclusion I disagree with is the author's insistence on only using integer scaling multipliers. The article talks about subpixel rendering and then mostly ignores it for the rest of the article, instead assuming pixels are all small black squares (which they aren't - "pixel density" isn't a particularly meaningful term when you're dealing with clarity on modern screens). IIRC, macOS renders text on a higher-resolution internal framebuffer and scales down anyway; in practice I don't notice any difference in clarity on my MacBook screen whether running at an integer or non-integer scale of the native resolution.
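For what it's worth, my mental model of those scaled modes, as a small sketch (the panel size and "looks like" resolution below are illustrative assumptions, not a statement about any particular machine):

    # Non-integer "scaled" mode: render into a 2x backing store for the chosen
    # "looks like" size, then downsample that framebuffer to the physical panel.
    panel = (2880, 1800)        # assumed MacBook panel
    looks_like = (1680, 1050)   # assumed scaled mode

    backing = (looks_like[0] * 2, looks_like[1] * 2)   # 3360x2100 internal framebuffer
    downscale = backing[0] / panel[0]                  # ~1.17x downsample to the panel

    print(f"render at {backing[0]}x{backing[1]}, "
          f"downsample by {downscale:.2f}x to {panel[0]}x{panel[1]}")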
Had this debate when we bought monitors for the firm. Some in management said high DPI, good color, etc. are not worth the money (which you clarify is what you really mean). But after splurging, the effect on productivity was measurable. If you pay someone to stare at a box for 8h, a nicer box pays for itself pretty quickly.
Personally I don't mind a small UI, so I currently just run macOS at 3840 × 2160 on two 24" LGs. With size 11 Fira this gives me 8 80-char columns.
> The best developers I've known wrote code with tiny laptops with poor 1366x768 displays. They didn't think it was an impediment.
Every time I read something like this I cringe. Ergonomics is terrible with laptops. There are no OSHA-approved laptops.
Those developers should sit in a good chair, with a properly positioned keyboard, monitor and body, and head off the problems their body will encounter (or has encountered but ignored).
I just refuse to be told that I must use a { three-monitor setup, mechanical keyboard, 4K monitor, macbook, $LATEST_FAD } or otherwise I'm using a subpar development environment.
I still remember when Full HD monitors were all the rage. Nobody considered them cheap tools back then, and somehow code got written and developers were happy. There will come a time when someone on the internet will tell you that a 4K monitor is a cheap tool and that nobody can properly code using one.
I have a 2560x1440 monitor at home and 2x 4K monitors at work. I like my home setup better because the tools I spend most time with (vscode and Xcode) lock all their panels to one window, making the real estate more important than the pixel density. I still have not found a way to get a 4K display not to display in "retina" 1080p without it getting too small, blurry or laggy.
Agreed. I was supplied two 4Ks for my last job and just ended up using them so zoomed in that it wasn't even like I was using 4k monitors anymore.
My ideal setup is two 1440p monitors, but even then I tend to zoom in quite a bit.
I also had a coworker who was fine with their laptop screen. People who are finicky with these kinds of things seem to have their focus elsewhere rather than on the actual work getting done.
> The technical details are all right (or seem right to me, anyway)
No, not universally. If I follow the advice and disable font smoothing, it looks worse than any font display I've ever used, including 8 bit home computers. https://i.imgur.com/Y0xkhTb.png
Aside from the added screen real estate, I find my external 1080p LG monitor simply looks better than my Dell Latitude's. The laptop display just looks... cheaper. It's hard to explain, it just looks worse.
The whole "HD ready" thing was an utterly sad part of the story of cinema/television that, let's hope, will never repeat.
As for 1366x768, for some reason for many, many years the vast majority of monitors stopped there, with "Full HD" coming to the masses (but not all) only a few years ago.
"1366x768" meant "half-baked HD" from the day it was introduced, so it's understandable that few quality monitors (and especially laptop screens) were made with that resolution.
The author is biased because of ligatures. These special glyphs are a little more dense than normal text, e.g. the triple-line equals, and they need a higher-res display to look good. I personally dislike ligatures; I have no problem chunking two characters together in my head.
> e.g. the triple-line equals, and they need a higher-res display to look good.
Do you mean '≡' (U+2261 identical to)? Because no, it very much doesn't; in fact I'm not sure I've ever seen a font where it looked bad short of vertical misalignment with surrounding text.
Okay, tracked down this[0], which is probably what you were talking about(?), and what the actual fuck is wrong with that rendering engine? You render an A px horizontal line, B px of space, a second A px line, B px of space, and the third A px line, where A=B=1. Maybe you use 2px of space. Maybe, if you're rendering obnoxiously huge, you use a 2px line and however many pixels of space. You don't fucking use real-valued vector coordinates and then blindly snap them to integers, or smear bits of gray anti-aliasing around the screen like some kind of monkey with a hand full of its own feces.
If OSX is actually doing stupid shit like that, it's no wonder sane-resolution monitors look like crap. Stupid-absurd-res monitors will look just as bad, because the problem is the operating system driving them, not the monitor.
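To make the point concrete, here's a toy sketch (not how any real engine is structured, just the coverage arithmetic): snap a 1px stroke to a pixel row and you get a solid line; place the same stroke a fraction of a pixel off and it smears into two rows of gray.

    # Per-row ink coverage (0..1) of a horizontal bar 1px thick centred at y_center.
    def bar_coverage(y_center, rows=12):
        top, bottom = y_center - 0.5, y_center + 0.5
        return [max(0.0, min(r + 1, bottom) - max(r, top)) for r in range(rows)]

    def render(centers, rows=12):
        img = [0.0] * rows
        for c in centers:
            for r, cov in enumerate(bar_coverage(c, rows)):
                img[r] = min(1.0, img[r] + cov)
        return img

    def show(img):
        return "".join("#" if v >= 1 else "." if v <= 0 else "+" for v in img)

    print("snapped  ", show(render([2.5, 5.5, 8.5])))   # each bar fills exactly one row
    print("unsnapped", show(render([2.8, 5.8, 8.8])))   # each bar bleeds into two gray rows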
I'd take a 2560x1440 monitor (or 2 or 3!) over 1080p displays that the author seems to prefer, even if those 1080p display are actually 4k pixel-wise...
Being able to see more things >>>>>>> sharper text, once you're past 100dpi or so at least.
Are you talking about physical screen size, DPI or what? I've no idea what display a MacBook Air 11" has.
Did your eye doctor tell you your myopia developed because of staring at your laptop's screen? (Mine developed when I was a teenager and hadn't yet spent a lifetime of staring at CRT monitors, and my eye doctor told me it was likely unrelated to computer screens. But maybe medicine has changed its opinion, this was decades ago).
You as a coder also don't normally need a 120 Hz display like the OP, nor to be limited to the three (!) models currently available on the market. It doesn't make sense. But hey, economics, whatever...
I love the old CRT text mode. I don't need fine anti-aliased font for my text, I find those fine fonts are too much for me. What's wrong with some non-perfect edges?
Oh, no argument from me. I said I knew plenty of people I admire, and who are way better than me professionally speaking, who can code with 1366x768 displays and dismiss the urgency of getting a better screen. In this sense I meant that it's not time to upgrade your monitor: it's simply not that urgent to own the highest-DPI screen you can buy.
Me? I can certainly code with a poor quality 1366x768 display (doing it right now with this Dell Inspiron!) but I hate it. I'm not a 4K snob, but I do like better screens!
I started coding in Windows Visual Studio again recently after decades of absence, and the enormous length of their variables, function names, and function parameter lists has me at >120 columns again. All my non-Windows code is 80 columns (or 100 in very rare cases).
So now I need a wider monitor. I was on an NEC 6FG until 2011 and then went to a small Dell flat panel at 1280x1024. I suppose it is time to join the 2020's.
I have multiple VNC and RemoteDesktops up at any moment, so maybe I'll really splurge and go for one of those fancy curved ones.
This article is written in a style I can really appreciate. That's a lot of research.
A 1440p, 75 Hz display is perfectly acceptable. If you game, the quality difference between 1440p and 4k is pretty much indistinguishable, but 1440p is much less performance-intensive to render.
I upgraded from a 32" 1080p display to a 32QK500. A significant improvement in visual quality for a reasonable price (~USD $270).
I've found my journey through tooling evolution to be fascinating (to me at least).
When I was new and coding was slow, my worries were about so many other things (like how the language even worked) than even what text editor I used, let alone the monitor.
As I became experienced, a lot of newbie problems went away and the next topmost problems emerged, leading to be obsessing over text editor, shortcuts, and even a multi-monitor setup with a vertical monitor.
Then went through a minimalism phase where I was annoyed by how much time I was spending maintaining my tools rather than using them, so gone went almost all of that. Just one giant monitor and VSCode for better or worse (mainly because it does all six languages I use well enough).
I'm now at a phase of thinking, "who are all these people who do so much coding in a day that these things matter?" because reading and writing code is maybe.... 35% of my job now? What I need to optimize for is reading/writing human text. And that's where I currently am: figuring out how to optimize writing documentation/design/architecture docs, given how awful making and maintaining a sequence or block diagram is currently.
My conclusion so far is that I expect my needs to continue to mutate. I do not believe they are "converging" and do not believe there is any sort of golden setup I will one day discover.
And there is another phase - healthy: flicker-free monitor, ergonomic keyboard, stand-up desk and so on. And when are those 27" e-ink monitors coming?
And yet another phase - no-screen. Everyday I look forward to avoid screen-time as much as possible - more face to face time, more water-cooler chats, figuring out the solution with pen and paper or coding in head during outdoor walk.
I loved it too on a mac. When I switched companies and went to an Ubuntu-based laptop, trying to keep the whole monitor scenario working consistently was frustrating, so I dropped it.
To echo the sibling comment: `arandr` makes it easy to save monitor configurations (as a script). Then you can run that script on startup. E.g. for Xorg I put it in my ~/.xinitrc.
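For reference, the script arandr saves is essentially a single xrandr invocation; here's the equivalent call from Python as a sketch (arandr itself writes a tiny shell script, and the output names and modes below are assumptions -- run `xrandr` to see yours):

    import subprocess

    # One laptop panel at native 1080p, one external 4K placed to its right.
    subprocess.run([
        "xrandr",
        "--output", "eDP-1", "--mode", "1920x1080", "--pos", "0x0",
        "--output", "DP-1",  "--mode", "3840x2160", "--pos", "1920x0",
    ], check=True)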
Concerning the smoothing, I presume this is macOS’s glyph dilation. pcwalton reverse engineered it in order to replicate it in Pathfinder (so that it can render text exactly like Core Text does), concluding that the glyph dilation is “min(vec2(0.3px), vec2(0.015125, 0.0121) * S) where S is the font size in px” (https://twitter.com/pcwalton/status/918991457532354560). Fun times.
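Plugging a few common sizes into that formula (a direct transcription of the tweet, nothing more) shows the dilation ramping up with size and capping at 0.3px:

    # Dilation per axis: min(0.3px, k * S), with k = (0.015125, 0.0121) and S in px.
    def glyph_dilation(size_px):
        return (min(0.3, 0.015125 * size_px),
                min(0.3, 0.0121 * size_px))

    for s in (12, 14, 18, 24, 36):
        dx, dy = glyph_dilation(s)
        print(f"{s:>2}px -> dilate by ({dx:.3f}px, {dy:.3f}px)")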
Trouble is, people then get used to macOS's font rendering, so that (a) they don't want to turn it off, because it's different from what they're used to, and (b) they start designing for it. I'm convinced this is a large part of the reason why people deploy so many websites with body text at weight 300 rather than 400: macOS makes that tolerable because it renders it bolder. Meanwhile people using other operating systems that obey what the font says are left with unpleasantly thin text.
I've blocked font-weight: 300 in my browser because of this permanent macOS bug. It's funny because people now abuse a thoughtfully-designed CSS property to disable this dilation on webpages, to the point that that CSS property is used exclusively to account for macOS's incorrect font rendering and is inert on other platforms.
How do you block font-weight: 300? On Windows I uninstall thin fonts, but on Linux, Firefox doesn't respect my fontconfig settings to avoid using thin fonts.
I tried turning off smoothing and frankly the text was uncomfortably thin to read. I guess maybe their font should just be bolder by default so you don't need the glyph dilation to compensate.
I didn’t actually know you could turn it off before today! I’ve never had a Mac, I’m used to FreeType and Windows font rendering.
I have a vague feeling that Macs also have somewhat different gamma treatment from other platforms, which would be likely to contribute to thickness perception.
In Leopard that font smoothing preference wasn't a checkbox, but a drop-down menu specifying how much font smoothing. If you select "Strong" you get a result that looks even bolder than normal. Even when a website is using an extraordinarily thin font, this setting makes it readable.
Apple replaced the drop down menu with a checkbox in Snow Leopard, but the underlying functionality wasn't removed until Catalina. Naturally I was very disappointed after my upgrade to Catalina because all text looked too skinny to me.
this is interesting, thank you! And font-weight: 300 has been an old personal nemesis. That explained at least part of it! Do you know if font dilation is on by default?
Maybe this was done to emulate the ink bleed that happens in the printing process. It's a common problem that LaTeX PDFs look too light, because the built-in fonts were copied from mechanical type, which assumed the ink would bleed on the page.
Eh, I think everyone serious about typography agrees that Computer Modern is just unreasonably thin. This is easily enough for me to consider it a lousy font that should literally never be used on screen, and I'm dubious about usage on paper. Unfortunately Stockholm syndrome applies.
Do all that, of course. But when you’re writing applications for other people, go buy a shitty middle of the pack laptop, resolution 1280x768, with the windows start bar and menu bars eating like 150pixels of that, and try to use your software there.
While we're on it, make sure this is a low end laptop in terms of components too. Make sure your application is usable. No one needs a chat client that eats several gigs of RAM and a significant portion of your CPU.
> A good monitor is essential for that. Not nice to have. A MUST. And in “good” I mean, as good as you can get.
The whole discussion starts from a what I would say is an incorrect assertion, and then goes on to describe lots of ways that letters can be made better. If, in fact, you don't need better, then a lot of the points go away.
1. There is a point of diminishing returns, past which you are spending a lot more money for very little benefit.
2. There exists a point beyond which "better letters" are unlikely to contribute much to daily work.
Both of those points are, to some extent, the same point. But either way, the idea that you MUST get "as good of a monitor as you can" is, in my opinion, untrue and not worth basing an entire document on.
I'd rather see a discussion of which features of a monitor contribute the most (per $) to how well they function for daily work. For gaming, I want high refresh rate and high contrast. For TV, the contrast (real black) goes up in importance. For daily work, neither of those is a huge contributor to how well I can work.
While technically the article is correct, it feels very narrow-sighted to me by placing resolution/refresh rate above everything else. Some more things to consider:
- Price. If you are not a programmer in a first-world country working for a magic-money startup/big 4, then you won't spend $1000+ on a single monitor. Mine was a better 27" 1920 one for $300 and I'm happy with it. I'm sure there are better ones, but I can't allow myself to spend much more on one.
- No PWM. This is crucial. PWMing monitors kill your eyes in hours, and then you won't be able to work for the rest of the day. Some people I know take the eye-protection game so far that they buy e-ink screens for development (sadly you can't buy large ones).
- Adjustable, optionally rotatable by 90 degrees. If you've got 2 monitors, one rotated 90 degrees is a blessing for reading code.
- Small bezels. Quite important if you've got 2 or more; large bezels make my setup clumsy.
- Bright. It's quite annoying if I can't read something because the sun shines into my room. Especially important for laptops.
None of these are mentioned, just that small letters should look nice and scrolling should be smooth.
PWM and flicker need to die. I can easily tell when a monitor is flickering, and after using one for multiple years without knowing why my eyes started hurting so often, I'm convinced they're bad for long-term eye health. Refresh rate above 60fps is a nice-to-have in my book. Size and resolution are much more important. Having a 32" over a 24" changed my life.
Affordable, large E-ink screens would be revolutionary for many occupations and I hope there's more investment in the space.
Macbook Pros use PWM, which is why I had to return the latest 16 inch model. I was really bummed; but I need to protect my eyes. It was causing severe eye strain.
After using a rMBP for 6 years, I realized that using a lower-resolution, lower-quality display makes absolutely no sense at all if both the graphical power and the budget are available (the first 13" rMBP had some serious issues driving the display). Better quality image is better quality image. I think Apple's biggest selling point over any vendor right now, despite numerous issues with its hardware and software in the recent past, is absolutely top-class input and output. It is such a simple concept. A great keyboard (seems to be fixed now) and an absolutely incredible trackpad experience, along with a display that is a huge step up from your past experience, mean that most users will prefer that setup even if they just use it for basic coding or web browsing. After looking at the first retina displays I realized that Apple didn't just change the displays; it changed how fonts behaved completely, because crisp and clear legibility was key to attracting customers early on. I'd say even in 2020 most computers are struggling with good displays, which can completely ruin the experience for someone using the product even if every other aspect of it was great.
I've been focused on Apple display products for the past few months as I'm looking to make an upgrade from the Dell P2715Q 4k 27" I use primarily for development.
There is a three page thread on using the Apple 32" XDR for software dev on MacRumors. [1]
I believe there is a major product gap at Apple right now in the mid-market display. Specifically a replacement for the LED 27" Cinema Display which was announced 10 years ago next month. [2]
I am speculating that Apple could announce a new 27" 5k in part because of the rumored announcement of a new Mac Pro but also because the build quality of the LG 5k Ultrafine is just not great and there are obvious synergies with XDR production and Mac Pro.
I think this should be announced at WWDC because developers specifically are being left out of good display products, and Apple should be looking out for what they stare at all day.
While there are no supply chain rumors of such a display, I wargamed what this product might be and its pricing anyway.[3]
In short, I speculate Apple will release a 27 inch IPS display, 5120 x 2880 @ 60Hz with standard glass at $1999, Nano-texture Glass at $2799.
I had not paid a lot of attention to the refresh rate, but it does seem like kind of a miss that the XDR does not offer this.
I have a Dell P2715Q and I'm happy with it. I haven't tried anything with a higher resolution or DPI, but I can't see any individual pixels on the Dell, so for the moment, it's good enough for me.
I can't go back to sub-4k though. Looking at a 24" 1920x1080 monitor tweaks my brain. Pixels ahoy. It's jarring. I'm not the kind of person who cares about superficial things or style or brand at all, but I just can't get comfortable with sub-4k anymore.
Be very wary of Apple monitors. If you can, try one out in the environment you intend to use it in, before you commit. Apple displays are highly reflective. The glare is obscene. The display quality is great, but I can't deal with the eye strain. It's like there's a mirror glaze on top. They used to offer a matte option, but I don't believe they do anymore. It's painful.
This is an important point. I have a pal with an aging 5K iMac that he wishes he could use only as a monitor with a new Mac mini.
It seems the display tech in the iMac is severely discounted in order to sell the whole thing. And it does seem to break the pricing if you don't see them as different products offering value for different configurations.
Bringing back target mode, or allowing the iMac to act as an external monitor would greatly increase the value of that product.
I can't explain how this pricing makes sense exactly, except that I can only assume Apple will want or need to price this stuff high to help differentiate the build quality in comparison to the collaboration on LG's ultrafine.
I bet somebody could make a business refurbishing those old 5K iMacs into usable monitors. Either yank the panel into a new case w/ a custom board to drive the display, or hack an input into the existing system (maybe with a minimal booting OS).
Either way, that'd be a cool product to see and seems like a decent side hustle to get some $. :)
I recently bought a 2015 27” 5K (5120x2880) iMac core i5 for cheap, put in an SSD, and upped the RAM to 24 GB. It handles everything I can throw at it as a developer (albeit not a game developer, just “full stack” web/java/node/k8s) and the screen is just incredible.
The SSD upgrade is not for the faint hearted, however. I used the iFixit kit with the provided “tools”, and lifting off 27” of extremely expensive and heavy glass that sounded like it might crack at any moment was not exactly fun. Having said that - I would do it again in a heartbeat to get another computer/display like this for the price I paid.
With regards to scaling: I have had zero problems with either my rMBP (since 2013) or this iMac, when in macOS running at high resolution. Zero. As soon as I boot Windows or Linux, however, it’s basically unusable unless I turn the resolution down.
I've got a Planar IX2790 and it's great. Article is spot-on re. scaling -- it's much nicer to look at all day (native 2x scaling) vs. 4k at 1.5x scaling.
Thanks for this idea. The design looks surprisingly like the old cinema display. Actually, apparently they use the same glass from the cinema display but no camera. [1]
It looks like the main concerns on this are around stuck pixels. Have you gone through calibration / QA on yours? [2]
Otherwise, seems like a compelling alternative to the Ultrafine.
I believe the theory that these are 5K iMac panels that Apple rejected. I have a few stuck pixels but I never notice them because they're so small. I have a blue stuck-on pixel in the lower-right quadrant and I can't even find it now.
I kind of think such a display's design could overlap with the new iMac. One reason Apple used to have a "chin" on the iMac was to distinguish it as a computer and not a monitor / Cinema Display. Judging from the leaks, the new iMac won't have a chin at all, and since there is no similarly sized Cinema Display in the lineup, this doesn't really matter.
I just wish they bring back Target Mode, or something similar to iPad's SideCar.
- Proved and debugged on the XDR release at the pro price point.
- Kept working on how it can be cheaper and fit into plans for whatever the ARM-based machine's initial graphics capability will be.
- Designed it to use a similar manufacturing line to the iMac then offer it at a lower price point to support the current mac pro, mac mini and a possible dev kit for the arm mac.
Or I suppose just keep making everyone buy the LG 5k ultrafine that is four years old. :P
I have just got a U2720Q, and it worked at 60Hz right away. But I am connecting over USB-C, with the provided cable.
What I did find off-putting is that Dell say they don't test with Macs, and thus can't support them, but they just assume that Apple follows the same specification they do. A bit weak, but the monitor does work fine.
Thanks for this idea. The product page says it is 218 PPI, but I don't see how that is possible given it is 34" wide. Pixensity says the actual PPI is 163.44. Can you confirm the PPI on this monitor?
It's 163. I think you got the 218 from a Cmd-F that picked up the SEO blurb for the 5k UltraFine in the footer.
I cannot tell the difference between the LG and my MacBook's retina screen at my normal viewing range, but this may just be my eyes.
I also use a non integer scaling factor on both screens as I find it has the right combo of resolution and real estate for me, and I don't notice artefacts.
I honestly find MBP trackpads to be too big, which is admittedly a preference thing, but their keyboards are absolutely horrible, and I have difficulty understanding how anyone could think they were "great". There's not enough distinguishing keys from each other, so I can't ground myself to the home row. I can't think of many keyboards I've used throughout my life that I enjoy less than the MBP. And that's not even mentioning the quality issues (duplicated or broken keys), or the lack of Fn keys.
For the trackpad I agree, but definitely not for the keyboard. And Apple was late(!) to hi-res screens and to hidpi, and now has lower resolution and density than the competition (e.g. the 16:10 better-than-4K panel on the Dell XPS, or the 3:2 screens from MS and others).
Also, Apple had TN screens forever when the similarly priced competition had higher-res IPS screens.
Isn't 4k on a laptop a significant power drain? And isn't the point of "stopping" at Retina resolution that the human eye can't tell the difference between Retina and higher resolutions like 4K at typical laptop screen size and viewing distance?
>"Of course, the real highlight is that new Retina Display. Its resolution is 2,880x1,800 pixels, providing a level of detail never seen on a laptop before. The highest standard Windows laptop screen resolution is 1,920x1,080 pixels, the same as an HDTV."
>The signature feature of the new MacBook Pro is the new 15.4-inch (39.11 cm diagonal) Retina display. [...] So far, no notebook screen has topped resolutions of 1900 x 1200 pixels (WUXGA) or 2048x1536 (QXGA in old Thinkpad models).
If you manage to dig up some obscure laptop that had a higher resolution at the time I wouldn't be completely surprised. However, to suggest that Apple was "late(!)" with high DPI screens is provably false and frankly, ridiculous.
I had a 1080p Alienware laptop in 2004. Then for some reason, laptops all went even lower resolution for a long time, and I couldn't find a good one until Apple came out with their Retina displays. Not sure what happened. Manufacturers just became cheap?
I've found the keyboard on my Surface Book to be much better than the keyboard on any of the MBPs I've used and owned.
The trackpad was just as good, too. But of course, the OS was worse, mostly in terms of performance. Windows 10 just always feels slow for some reason.
I still use a CRT, because features I want are still ludicrously expensive in newer tech (although they are getting cheaper over time).
1. Arbitrary resolutions, great to run old software and games, even better to run new games at lower resolution to increase performance.
2. Arbitrary refresh rates.
3. Zero (literally) response time.
4. Awesome color range (many modern screens still are 12bit, meanwhile silicon graphics had a CRT screen in 1995 that was 48bit)
5. No need to fiddle with contrast and brightness all the time when I switch between mostly-light and mostly-dark content. For example, I gave up on my last attempt to use a flat panel because I couldn't play Witcher 3 and SuperHot one after the other: whenever I adjusted settings to make one game playable, the other became just splotches (the optimal settings for Witcher 3 turned SuperHot into an almost flat white screen, completely impossible to play).
6. For me, reading raster fonts on CRT gives much less eyestrain and is more beautiful than many fonts that need subpixel antialias on flat panels.
7. Those things are crazy resilient, I still have some working screens from 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my new flatpanels failed within 2 years with no repair possible.
Unless you're using a really cheap screen, most LCDs are 8 bits per color. The cheap ones use 6 bits + 2 bits FRC. 12 bits (presumably 4 bits per color) seems insanely low because it would mean only 16 possible shades per primary color.
>meanwhile silicon graphics had a CRT screen in 1995 that was 48bit
Source for this? 10 bit color support in graphics cards only became available around 2015. I find it hard to believe that there was 48 bits (presumably 16 bit per color) back in 1995, where 256 color monitors were common. Moreover, is there even a point of using 16 bits per color? We've only switched to 10 bit because of HDR.
>7. Those things are crazy resilient, I still have some working screens from 80286 era (granted, the colors are getting weird now with aging phosphors), while some of my new flatpanels failed within 2 years with no repair possible.
This sounds like standard bathtub curve/survivor bias to me.
I am talking about SGI workstations, indeed the 1995 ones didn't support (without modification) 48bit, instead it was "only" 12bit per channel, 3 channels, thus 36bit.
> 10 bit color support in graphics cards only became available around 2015.
That's off by a decade.
> 256 color monitors
Is that a thing that exists?
> We've only switched to 10 bit because of HDR.
You can get clear banding on an 8 bit output, and 10 bit displays are used at the high end. 10-bit HDR isn't immune to banding, since most of the increased coding space goes into expanding the range. There's a good reason for 12 bit HDR to exist.
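A quick way to see why: on a full-screen horizontal gradient, each code value becomes a visible band whose width is just the screen width divided by the number of levels (3840px is an assumed width for illustration):

    width_px = 3840
    for bits in (6, 8, 10, 12):
        levels = 2 ** bits
        print(f"{bits:>2}-bit: {levels:>4} levels -> bands ~{width_px / levels:.1f} px wide")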
So it looks like it supported 10 bit frame buffers, but not necessarily 10 bit output. A quick search suggests it only supported DVI, which didn't support 10 bit output. In the context of talking about monitors, this essentially means that 10 bit wasn't supported at the time. Otherwise you could claim you have 192 bit color by software rendering at 64 bits per color, but outputting at 8 bits.
Consult the dot pitch of your monitor for the actual resolution. Everything is “scaled” to that resolution. Of course, the algorithm is physics, which is much better than cubic interpolation, so it does look slightly better, but a modern hidpi lcd will provide a higher dot pitch and thus a sharper, more accurate picture.
Color CRTs do have something akin to a native resolution too, defined by the phosphor arrangement, so they do "scale" things. It just happens that the scaling is naturally blurry and artifacts aren't noticeable.
You raise some very interesting points. I've appreciated the physical lightness and ease of positioning of LCDs, plus the absence of flyback whine. And they can go to higher resolutions than the scan circuitry in a CRT can physically manage.
But all those other things, yes the color resolution, the smoother fonts, the response time. I might have to swing by the recycler and pick up a nice "new" Viewsonic. :)
3. 60hz CRT refresh rate is 16.67 milliseconds of delay. Interestingly, I connected a VT220 to my 56" 4k Samsung TV via a BNC-to-RCA cable, and by comparing the cursor blinking on both screens there's a very noticeable and visible delay, like a 1/4 second.
4. CRTs are analog. It's as many bits as your output hardware can make up.
5. CRT is still supreme for contrast (at least over LCD) despite all the tricks and such for LCDs.
7. That VT220 I mentioned above is from 1986. It's monochrome, but works great.
> CRTs are analog. It's as many bits as your output hardware can make up
Actually the horizontal resolution of a CRT is limited by: the dot pitch of the colour phosphors, the bandwidth of the amplifiers, and the electron beam spread.
The vertical resolution is limited by a combination of: electron beam scan rate, delay for horizontal flyback/retrace, delay for vertical flyback, desired refresh rate, and electron beam spread.
There are more details for the resolution limitations, but I think I covered the main ones.
I kept a CRT for quite a while but when I switched, I realized I didn't miss it.
1- True, if there is a thing I'd miss, that's it. At low resolutions, CRTs sometimes get really nice refresh rates too (I've seen up to 180 Hz).
2- Modern gaming monitors have freesync/g-sync that not only give you arbitrary refresh rates, but they are also adaptive.
3- Also true, but not as significant as one might think. The monitor itself is zero latency, but what's behind it isn't. We are not "racing the beam" like in an Atari 2600 anymore, the image is not displayed before the rendering of a full frame is complete. The fastest monitors are at around 144Hz, that's 7ms. So for a 1ms gaming monitor to a 0ms CRT, you actually go down from 8ms to 7ms, to which you need to add the whole pipeline to get "from motion to photon". In VR, where latency is critical, the total is about 20ms today. More typical PC games are at about 50ms.
4- CRTs are usually analog. They don't use "bits"; that's all your video card's job. Also, 48 bits is per pixel and 12 bits is per channel, so that's an apples-to-oranges comparison. CRTs do have a nice contrast ratio though, good for color range. Something worth noting is that cheap LCDs are usually just 6-bit with dithering. True 8-bit is actually good, and I'm not sure you can actually tell a difference past 12 bits per channel.
5- Never noticed that, maybe some driver problem. An interesting thing is that CRTs have a natural gamma curve that matches the sRGB standard (because sRGB was designed for CRTs). LCDs work differently and correction is required to match this standard, and if done wrong, you may have that kind of problem.
6- I hate text on CRTs. And unless you have an excellent monitor (and cable!), you have tradeoffs to make between sharpness, resolution, and refresh rate. And refresh rate is not just for games: below 60 Hz you get very annoying flickering. I wouldn't go below 75 Hz. And at max resolution, it can start getting blurry: the electron beam is moving very fast and the analog circuitry may have trouble with sharp transitions, resulting in "ringing" artifacts and overall blurriness. On old games, that blurriness becomes a feature though, giving you some sort of free antialiasing.
7- Some CRTs are crazy resilient, others not so much. Same thing for LCDs. And as you said, phosphors wear out; that's actually the reason why I let go of my last CRT (after about 10 years). My current LCD is 8 years old and still working great, in fact better than my CRT at the same age (because it doesn't have worn phosphors).
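To make the frame-time arithmetic in point 3 concrete, here is a minimal sketch. The pipeline figures (1 ms panel response, ~20 ms for VR, ~50 ms for typical games) are just the rough numbers quoted above, not measurements:

    # Rough frame-time arithmetic for point 3 above (illustrative numbers only).
    def frame_time_ms(refresh_hz: float) -> float:
        """Time to deliver one full frame at a given refresh rate."""
        return 1000.0 / refresh_hz

    frame = frame_time_ms(144)   # ~6.9 ms per frame at 144 Hz
    lcd_response = 1.0           # a "1 ms" gaming LCD panel
    crt_response = 0.0           # CRT phosphor lights up essentially instantly

    print(f"144 Hz LCD: ~{frame + lcd_response:.1f} ms display-side delay")
    print(f"144 Hz CRT: ~{frame + crt_response:.1f} ms display-side delay")

    # The rest of the motion-to-photon pipeline (input, game logic, rendering,
    # compositing) dominates either way: roughly 20 ms total in VR and 50 ms
    # in typical PC games, so saving ~1 ms at the panel barely moves the needle.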
Sadly, the CRT manufacturers decided to kill the technology early, back when it was still obviously superior to LCD and plasma; Sony was still selling CRTs faster than flat panels when it shut down its factories.
Many people today think that no one makes CRTs because no one buys them, but it's the opposite: you couldn't buy CRTs anymore because the manufacturers intentionally killed them.
There was even research ongoing at the time to make a slim flat panel CRT, but they cancelled that too.
CRTs, being analog, don't support DRM, which contributed a lot to their rapid death (among other reasons).
People still use CRTs in some arcade machines, in the medical industry (for example, diagnosing certain visual disorders, and some plasticity research I know of requires zero update lag, which only a CRT can provide), and in industrial applications where flat panels are too fragile, but all of these are basically buying up existing CRTs, driving up the price.
There were some people trying to restart production, but... the companies that hold the patents refuse to sell them, the companies that know how to make them also refuse to sell the machinery, and the few independent attempts often failed due to regulations.
Also, in the USA someone invented a machine to recycle CRTs and ended up being shut down due to regulations too, so because of regulation, instead of safely melting down the glass and lead, the law in the USA basically says to dump it all in landfills.
My personal workstation has a Samsung SyncMaster whose model number I don't have handy right now. The maximum resolution is around what people call 2K, but only at 60 Hz, which I don't like. My current resolution is 1600x1200 at 75 Hz.
Reminds me of a Dell CRT I used to use - 1600x1200@80Hz, and 21" IIRC. Every monitor I've had until fairly recently has felt like a compromise in some way compared to it, but taking a CRT to university was a non-starter.
HD-DVDs playing on my Xbox 360 looked glorious on it (the Xbox 360 supported 1600x1200 output, as did some games IIRC, and any widescreen content would just get letterboxed).
Regarding the "120Hz dance", which sure is ridiculous, the author could probably give the nice little tool "SwitchResX" a try. I adore that piece of software because it allowed me to force my MacBook to properly supply my Dell U2711, which is a 2560x1440 display, with an actual 2560x1440x60Hz signal over HDMI (which was important to me because I needed the DP port for something else).
That older monitor has some kind of firmware bug or maybe it's a wrong profile in MacOS or whatever, which makes it advertise a maximum of 2560x1440x30Hz or 1920x1080x60Hz to the Mac when connected via HDMI (DP works fine out-of-the-box), effectively preventing me from choosing native resolution at maximum refresh rate. I haven't been able to make MacOS override this limitation in any way using the common OS-native tricks, but SwitchResX can somehow force any custom resolution and refresh rate to be used, and the monitor is apparently able to deal with it just fine, so I've been running this setup for years now with no complaints whatsoever.
Also no manual work was ever needed after display disconnect/reconnect or MacOS reboot. I had problems once after a MacOS update, which required a SwitchResX update for it to be working again, but other than that I'm in love with this nifty low-level tool.
I had a similar problem with an older MacBook that would only (officially) power my 4k display at 30Hz. SwitchResX was the magic solution to bring it up to 60Hz.
I found this article utterly baffling. The author clearly knows their stuff, having created Fira Code, but my experiences couldn't be more different.
I spend most of my days in a text browser, text
editor and text terminal, looking at barely moving
letters.
So I optimize my setup to showing really, really
good letters.
I certainly appreciate how nice text looks on a high DPI display.
But for coding purposes!? I don't find high DPI text more legible... unless we're talking really tiny font sizes, smaller than just about anybody would ever use for coding.
And there's a big "screen real estate" penalty with high DPI displays and 2X scaling. As the author notes, this leaves you with something like 1440×900 or 1920x1080 of logical real estate. Neither of which is remotely suitable for development work IMO.
But at least you can enjoy that gorgeous screen and
pixel-crisp fonts. Otherwise, why would you buy a retina
screen at all?
It's not like you really have the option on Macs and many other higher-end laptops these days. And I am buying a computer to do some work, not admire the beautiful curves of a nice crisp font.
So anyway, for me, I chose the Dell 3818. 38", 3840 x 1600 of glorious, low-resolution text. A coder's paradise.
For purposes of software development, I won't really be interested until 8K displays become affordable and feasible. As the author notes, they integer scale perfectly to 2560×1440. Now that would rock.
Legibility is a funny thing. When you're paying attention, your standards for legibility will be very low. I was totally happy to read text on a 1024x768 17" MAG monitor for years. When you're paying attention to something else, like designing the code you are about to type, I think crispness and clarity of text absolutely matters. Microsoft Research did a lot of work on this when they released ClearType. They seemed to believe strongly that clearer text measurably improved speed to recognize characters.
I'm in my 40s and my eyes aren't great. But I have zero issues with legibility on the non-high-DPI Dell 38".
Depending on font choice (typically I use Input Mono Compressed) I can fit six or seven 80x100 text windows side-by-side on the Dell with excellent legibility.
That is up to 800 total lines of code and/or terminal output on my screen at once.
That really is my idea of coder's paradise. You, of course, are entitled to your own idea. No two coders like the same thing. Ever.
Well, it's really about personal preference, and while I know a couple of colleagues with your preference, I feel it's the minority.
I'd take two 16:9 screens over one 3840x1600 screen anytime. No need to set up some additional inner-monitor split-window management. Split-monitor management and workspaces work very well, and I can even turn the 2nd monitor off when I don't need it and want full focus mode (i.e. reading). Also, I prefer my main monitor exactly in the middle in front of me and an actual 2nd monitor to the right. If I have the luxury of a 3rd monitor, it's to the left (usually the laptop that powers the 2 big screens). Setting one half of an ultrawide in the middle just feels wrong. And splitting in 3 is too small for me, and again there's the inner-monitor window management issue.
While I also strongly believed my old 1920x1080 24" (~92 ppi) screens were good, I had the opportunity to use QHD 27" (~110 ppi) screens for 3 months abroad, and I was baffled when I came home by how incredibly bad text looked on my old 24" monitors.
The benefit of a 3840x1600 screen over two monitors with lower resolution is that 3840/2 = 1920 and 3840/3 = 1280. Those are horizontal resolutions for 1080p and 720p respectively. The fact that these are standard resolutions means basically every app works as designed regardless of whether you have 2 or 3 windows side by side. This isn't true for ultrawides with resolutions like 3440x1440 that don't divide cleanly into other standard resolutions.
The default software that comes with the Dell mentioned above handles everything. If I want to simulate two 1080p monitors, I just do the standard Windows drag to the side of the screen. If I want to simulate three 720p monitors, I can press shift while dragging and that tells the Dell software to take over. It is more versatile than having individual monitors.
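For what it's worth, here's a tiny sketch of that division argument, using the widths mentioned above:

    # How evenly a screen's width splits into 2 or 3 side-by-side windows.
    for screen_width in (3840, 3440):
        for columns in (2, 3):
            width = screen_width / columns
            if width == 1920:
                note = "standard 1080p width"
            elif width == 1280:
                note = "standard 720p width"
            else:
                note = "non-standard width"
            print(f"{screen_width} px / {columns} windows = {width:g} px ({note})")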
> The default software that comes with the Dell mentioned above handles everything.
Does it really handle everything? Does it handle all (or even just a single one?) of the use cases I mentioned in my comment and that I care about? Can I turn off part of the screen, if I actually only need smaller space? Can I position a 1920 default width space right in the middle in front of me without it looking weird (i.e. symmetry of screen hardware borders)? Are workspaces working correctly (in unix, windows, macos, all of them?) Just splitting monitors isn't everything; I heavily use workspaces to switch between contexts.
If all of this works (except the obvious middle problem that's physically impossible to solve) I might actually consider it.
It does for my use case, but obviously your mileage may vary.
>Can I turn off part of the screen, if I actually only need smaller space?
Sort of. The monitor can split into dual source mode that has two 1920 sources side by side. You could potentially turn that on and set one side to an empty source. You can also always use a black desktop if you need the rest of the monitor to be dark to allow you to focus on whatever window you have open.
>Can I position a 1920 default width space right in the middle in front of me without it looking weird (i.e. symmetry of screen hardware borders)?
How do you accomplish this with two monitors currently? You would have to choose between symmetry or having one monitor front and center. The Dell software allows you to customize the drag-and-drop regions. I use three columns that are the full height of the screen. You could set it up to have a 1920 section in the middle with two 960 columns on each side. You could also set up your physical workspace so one side of the monitor is centered in your vision instead of the center of the monitor. Also, I have mine mounted on a monitor arm that allows me to reposition it as needed.
> Are workspaces working correctly (in unix, windows, macos, all of them?)
It works in Windows and that is the only native GUI I use. Everything obviously works fine when in the terminal and I rarely increase the resolution of a VM past 1920. I would frankly be shocked if there wasn't similar software available for Linux and OSX that allowed you to setup customized zones like Dell's software if you need to run one of those natively.
That Dell tool sounds a lot like Microsoft's PowerToys FancyZones. Have you tried FancyZones? It can optionally take over the Win+Left/Right shortcuts from Aero Snap (the drag-to-side/quadrants tool built into Windows).
I've been drooling over the Samsung curved ultrawides since about January as a possibility for my gaming desktop. In March one of my gaming desktop's monitors blew, so I've been down to just one monitor and started using FancyZones; regular usage of it has got me much more convinced I'd be happy with the ultrawide. Now I'm just hesitant for purely financial reasons.
I haven't used FancyZones. I will check that out, thanks for the tip.
The financial aspect of this is definitely the toughest part to justify. A single monitor with this resolution is always going to be more expensive than multiple smaller monitors. The cost was justified in my experience, but that will vary depending on your use cases and budget.
Yes, the financial aspect isn't easy to get past right now, and the curved ones especially carry a premium just because the option is still so new. But the curved ones do roughly what I tried to do by manually angling my previous dual-monitor setup, with added advantages in gaming (because I could actually use the center point and periphery in game, rather than the center being bezels and in the way if I tried to span games across both monitors). Plus, all the usual financial concerns from the current state of the world and everything that has been going on.
This is, of course, an issue of personal preference and I do not work for a monitor manufacturer so I'm not trying to talk anybody into buying anything hahaha.
Also I prefer my main monitor exactly in the middle
in front of me and an actual 2nd monitor
I don't understand how dual 16:9 screens help with this! But I agree with you that I hate having a "split" in the middle. I need my main monitor centered.
My monitor arrangement is:
- Dell 38" ultrawide (centered in front of me). Work happens here, obviously. MacOS virtual desktops, while not the most feature-heavy, cover my needs well enough here. Of course I respect that some people lean into virtual desktops/workspaces harder and need more.
- Compact ("portable") 15" 1080p monitor, centered in front of me under the ultrawide. This is essentially dedicated to Slack and my notetaking app. This leaves the Dell at a decent ergonomic height at my eye level.
- Laptop off to the side, for nonessential stuff - typically music app, sometimes Twitter or news feeds
I have a Dell 38" 3840 x 1600 Ultrawide and it is bliss. I don't think of it as two screens, I think of it as three. I can comfortably have three applications displayed side by side with no seam down the middle. For me, it isn't about the crispest font I can get. It is good resolution and tons of real estate.
Do you use it with Windows or MacOS? If it's Mac, do you experience any compatibility issues? I was researching quite a bit into this Dell 38" in the last few days and discovered that it does not have official Mac support, and it seems to have some issues with the USB-C connection.
That Dell would be the screen I would buy right now for coding. I am currently using a 5k iMac at home and a 30" Dell (2560x1600) at work. I really loved the resolution of the iMac, but for my work, I need screen real estate. The font rendering on the 30" at 1x scaling is good enough, and having a lot of screen real estate is essential for me. Having 50% more horizontal space would of course be great :).
The new 32" Apple display would be great of course, but the price is just off. For coding, I don't care how much reference quality the color setup is. My only hope is, that Dell soon offers an affordable display based on that panel.
I definitely have zero regrets about the Dell. Some of the best money I've ever spent. Depending on font choice (typically I use Input Mono Compressed) I can fit six or seven 80x100 text windows side-by-side on the Dell with excellent legibility.
I can't imagine anybody justifying the cost of that 32" Apple display for coding either. I can't even imagine many high-end creative professionals justifying that.
I mean, on one hand... if a person figures they'll get 5+ years out of that Apple monitor and they work 5 days per week... that's less than $4/workday for a $5,000 monitor. From that perspective it's somewhat reasonable. But most people would probably get more benefit from picking a $1000 monitor and spending that $4000 elsewhere.
The funny thing is, I would even think about the 32" Apple display if Apple made a computer to go with it. But the new Mac Pro's entry price is 2x that of its predecessor, and it's really not a great computer at the entry-level specs. It is amazing fully loaded, but I would rather get a Tesla :p. Not sure how well the 32" Apple is supported by Linux :).
What's with the downvotes? HN is turning into Slashdot or Reddit, and that's not a good thing.
I clearly stated my point. Maybe you disagree. However, the downvote button is not a "disagree" button. It's for posts that actively detract from the discussion.
We need some kind of meta-moderation to ensure frivolous downvoters lose their downvote privileges.
I personally agree that the downvote button should be reserved for low-quality comments rather than disagreement, and that the diverse, high-quality comments are what make this site and community awesome. Unfortunately, neither the FAQ nor the Guidelines say anything about how to use the vote buttons. So how should HN users know when to use them? Is there even consensus anymore (or has there ever been?) that downvotes should not be used for disagreement? How did I form the opinion that downvotes should be reserved for low quality rather than disagreement? Somewhere along the way of contributing enough to this site to reach enough karma to downvote.
Maybe the guidelines should add a section on how to vote. On the other hand, how can this really be enforced?
It's a sad thing. I also stopped posting opinions that might trigger disagreement from a large majority.
However, the guidelines clearly state:
> Please don't post comments saying that HN is turning into Reddit. It's a semi-noob illusion, as old as the hills. [1]
It's not authoritative, and I don't know dang's feelings on the matter, but it's probably worth noting that long ago the founder of the site clearly stated that it was acceptable to use downvoting to express disagreement:
pg: I think it's ok to use the up and down arrows to express agreement. Obviously the uparrows aren't only for applauding politeness, so it seems reasonable that the downarrows aren't only for booing rudeness.
For completeness, here's the primary moderator 'dang' very explicitly confirming that downvoting for disagreement is still allowed at least as of 2 years ago:
dang: Downvoting for disagreement has always been ok on HN.
I don't care about the imaginary internet points, but it doesn't take many downvotes to bury/kill/whatever a post. If a few early responders "downvote disagree" then nobody will ever see it.
It's almost as if they thought, "How can we best encourage groupthink?" and this was the answer.
And yet, years after Dan took over as moderator, and over a decade after Paul posted his comment, the site is still going strong, and arguably is one of the best places for discussion on the internet. There is clearly a tension between allowing people to "bury" unfavorable opinions and denying them the ability to express themselves at all, but somehow it mostly works.
One thing I'd recommend, if you haven't done so already, is to at least occasionally browse with "Showdead" turned on (accessible through your profile link in the upper right). The majority of the things that are dead deserve to be, but the others can often be rescued by vouching for them.
It also may help if you think of voting as "rearranging the page" rather than killing an opinion. The opinions are still there for those who wish to upvote them, they are just at the bottom. Like the dead comments, most of the things greyed out at the bottom deserve to be there---but the rest one can try to rescue with an upvote.
And yet, years after Dan took over as moderator,
and over a decade after Paul posted his comment,
the site is still going strong, and arguably is
one of the best places for discussion on the internet.
HN is a success and does the vast majority of things well. But it makes no sense to sweep aside criticism with "well, it's working." It succeeds because of the things it does well and in spite of the things it does suboptimally.
HN moderation generally works well in spite of the actual technical implementation; it succeeds because HN has a high-quality readership who generally follows the internet etiquette of "downvote posts that harm the discussion, not simply because you disagree."
One thing I'd recommend, if you haven't done so already,
is to at least occasionally browse with "Showdead" turned on
(accessible through your profile link in the upper right).
The majority of the things that are dead deserve to be,
but the others can often be rescued by vouching for them.
Amen! I wholeheartedly agree and I do that for this very reason.
Interesting info, but overly opinionated. I tried turning off font smoothing just now but the lack of boldness made everything appear dimmer, and I had to strain my eyes a bit when reading my code. Also I use my MacBook without a separate screen, with no scaling, so I really have to disagree with this:
> This will make everything on the screen slightly bigger, leaving you (slightly!) less screen estate. This is expected. My opinion is, a notebook is a constrained environment by definition. Extra 15% won’t magically turn it into a huge comfortable desktop
For me a notebook is not a constrained environment, and screen scaling very much makes the difference between "a huge comfortable desktop" and not.
This post is from the author of Fira Code, a font with ligatures that is incredibly pleasant to use. Highly recommend to give it a try if you haven't already!
Fira Code is a big job (with so many ligatures) and working on it obviously takes great care, but let's not pretend that the same author is responsible for the font as a whole - unless you know something about Fira Mono that I don't?
Thanks for the heads-up. Personally, I'm not a huge ligature fan in my code as my eyes tend to go directly to them instead of the code I want to read. It's great that Fira Code comes with a non-ligature version.
You're welcome! I can see how that could be a distraction for some. Personally, I enjoy the aesthetics. And as a bonus, people are always intrigued when I show them code in a font they've never seen before.
Nonetheless, in my opinion it's still an awesome monospace font even without the ligatures.
I used to long for the days of 200dpi monitors when I was younger and we lived with 72dpi or less on-screen. Now, my vision has deteriorated to the point where I honestly can't see the difference despite being a type nerd. I'm increasingly using the zoom feature in MacOS to read smaller UI elements. I suspect my days of driving my MacBook display at "more space" (2048x1280) instead of "default" (1792x1120) are numbered.
It's frustrating that consumer Hi-DPI display development has completely stalled in the last five years.
As mentioned in the article, macOS works best using true 2x integer scaling. Using 2x scaling, a 4k monitor will result in an equivalent workspace to a 1080p display—unusable for productivity and development. A superior option is the iMac/LG 5k display, which nets an equivalent workspace of 1440p.
The only options for greater-than-4k displays on the market currently are the 27" LG 5K ($1200, 2014) and the Dell 8k ($3500, 2017). I'm convinced this is due to the manufacturers focusing their attention on the high-margin gaming market which has no need for resolutions higher than 4k.
I'm holding onto my 30" 2560x1600 Apple Cinema Display until I can get something that offers me the equivalent amount of screen real estate at 2x retina scaling.
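To spell out the scaling math behind this (a quick sketch, nothing more): the logical workspace is just the physical resolution divided by the integer scale factor, which is why only 5K-and-up panels get you past a 1080p-equivalent workspace at 2x.

    # Logical workspace at integer scale factors (physical / scale).
    displays = {
        "4K (3840x2160)": (3840, 2160),
        "5K (5120x2880)": (5120, 2880),
        "8K (7680x4320)": (7680, 4320),
    }
    for name, (w, h) in displays.items():
        for scale in (2, 3):
            print(f"{name} at {scale}x -> {w // scale}x{h // scale} logical")
    # Replacing a 30" 2560x1600 display at 2x would need a 5120x3200 panel.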
> an equivalent workspace to a 1080p display—unusable for productivity and development.
I have to disagree here. I use (and have used) a 1080p display for years and don’t feel the need for anything larger. That might be an artifact of my early years on 80x25 character displays, though.
Not to take away from your point, but Doom was actually written on a NeXTstation and cross-compiled for PCs, so at a minimum they were using 1120x832 resolution (2 bit greyscale).
I use a Pro Display XDR for programming / design work, I think you forgot this one. It seems like it would fit your needs (with the obviously premium price) as it’s 32”.
It’s overkill for most, but still the best display I’ve ever used and I love it. I actually tried an Asus 4K 120hz for much less but the fan noise was a deal breaker even if the screen didn’t break on arrival (it did).
I think the Pro Display should be disqualified for the price :). As with the new Mac Pro, Apple made the dream display/machine for professional video artists with high-paying customers. In the process, they completely neglected their power-user base, which requires something different than an iMac.
I can only hope that either Apple or Dell makes a more basic version of that screen based on the same panel. And of course, on a high end screen I would expect more than one input. You might want to connect your laptop to your screen too...
Alternatively, I am hoping for an iMac with a larger screen.
I'm impatiently waiting for a slightly less overkill 6K 32" option, it's obviously the best size + resolution for Mac.
My hope is that someone will take the panel and pair it with a ~700nit backlight (vs 1k sustained 1.5k peak for XDR) and price it around $2k. Also interested to see what Apple has in store for the iMac, if the refresh comes with a new and improved display I may just go in that direction.
Color accuracy is far more important to me than 120 Hz capability. The faster displays today tend to be subpar at color accuracy, illumination uniformity, etc.
As someone who also stares at a monitor for work, have you noticed any difference in eye strain or comfort? I have two 4K monitors but am considering going to 5K+. I liken it to shoes or a mattress: if we spend all day with an object, then its quality should be a priority.
No difference to eye strain or comfort vs the LG 5k I used before, although a major improvement vs my old 1440p monitor. Width makes a good improvement in Xcode and the color is very accurate to the iPhone screen which helps. I prefer one big monitor now I have this.
I agree with your analogy, literally my livelihood feeds through this screen, and I find a good monitor lasts a long time.
Same here. I'm using my Dell 3008WFP (also 30" @ 2560x1600) until something genuinely bigger & better comes out. I bought it over a decade ago and it still works great for programming.
I still use mine at work and am extremely happy with it. The font rendering on my iMac 5k is better of course, but for work, the additional screen real estate (especially vertically) is hard to beat. I am of course contemplating the 38" Dell with 3840x1600 :).
On the other side of things, running dual 4k displays for anything remotely intensive (a browser, IDE, instant messaging, mail client, and terminal in my case) can bring some machines to a crawl or run the fans at high speeds indefinitely. I think graphics technology has to catch up to hiDPI before I'd go past 4k.
And let's not even get into the fact that 4k 60hz is a struggle and a half -- you need HDMI 2.0 or displayport to make it work at all, and many modern laptops lack either port, forcing you to use USB-C/thunderbolt with adapters that have poor spec support and even worse documentation.
Even if someone offered me dual 5k displays to replace my dual 4k ones, I would turn them down right now. Before we move on to 5k displays, we need to get 4k60 down. And while I'm on my soapbox... where the hell is 4k90 or 4k120? I only bought my 4k monitors last month, with virtually unlimited budget, but they just don't exist. I imagine they've fallen victim to the same issue: the cables to support it just barely exist, and certainly can't be doubled up for dual monitor setups without a mess of wires plugged directly into a laptop or desktop.
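A rough back-of-the-envelope on why 4k60 pushes cable limits; the link capacities below are approximate usable data rates and assume uncompressed 8-bit RGB:

    # Uncompressed video bit rate vs. approximate usable link bandwidth.
    def raw_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    modes = [("4k60", 3840, 2160, 60), ("4k120", 3840, 2160, 120)]
    links = {"HDMI 1.4 (approx)": 8.2, "HDMI 2.0 (approx)": 14.4,
             "DisplayPort 1.2 (approx)": 17.3}

    for name, w, h, hz in modes:
        need = raw_gbps(w, h, hz)  # active pixels only; blanking adds ~5-10%
        fits = [link for link, cap in links.items() if cap >= need]
        print(f"{name}: ~{need:.1f} Gbit/s raw -> fits: {fits or 'none of these'}")

Which is roughly why 4k90/4k120 needs DisplayPort 1.3+ or HDMI 2.1 class links, and those barely exist on laptops today.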
If you use a Mac, a 5K monitor with integer scaling (2x) will take fewer GPU resources to run than a 4K with fractional scaling (1.5x). This is why I refuse to switch to a 4K monitor for now. I will make the leap to a 5K monitor, but the options are very slim right now.
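As I understand macOS's HiDPI pipeline, the reason is that it renders into a 2x backing store of the logical resolution and then resamples that buffer to the panel whenever the two don't match. A quick sketch of the pixel counts involved:

    # Pixels macOS renders per frame: 2x the logical resolution in each
    # dimension, plus a resample pass when that doesn't match the panel.
    cases = [
        ("5K panel, 2x integer scaling", (2560, 1440), (5120, 2880)),
        ("4K panel, 'looks like 2560x1440' (1.5x)", (2560, 1440), (3840, 2160)),
    ]
    for name, (lw, lh), (pw, ph) in cases:
        rendered_mp = (lw * 2) * (lh * 2) / 1e6
        resample = (lw * 2, lh * 2) != (pw, ph)
        step = "then downsampled to the panel" if resample else "sent to the panel as-is"
        print(f"{name}: renders ~{rendered_mp:.1f} MP, {step}")

Both cases render the same backing store; the fractional case just adds the extra downsampling pass every frame.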
I'm also hanging on to my 30" Apple Cinema (and adapters to work with a new USB-C MBP). The LG 5K seems like a decent replacement though, I'm not totally clear why it seems so unpopular.
I've been using two LG 5k monitors for years and they are excellent. They got a bad reputation when they were released for build quality issues, but I haven't had a single issue.
Why is everyone so attached to scaling? What is this shit software that 1) everyone relies on and 2) whose UI and font size can't be adjusted like any normal software (or web page) since the 90s? Is this yet another brainfart from Apple?
Windows has great High DPI support. It's Windows applications that often haven't seen upgrades since before even 2012. Windows' commitment to backward compatibility is the struggle. Windows hasn't had the option to just change processor architectures every dozen years on a seeming whim and subsequently force all software to be rewritten or die.
Mac OS X was released in 2001 with the "Cocoa" application UI library. While "Carbon" helped older Mac OS applications run for a limited time on OS X, no applications today use any graphics stack older than Cocoa's 2001 release; dropping all support for PPC-targeted apps in macOS' switch to x86 ensured that. (Cocoa had High DPI support baked in from NeXTSTEP, even if it took ~11 years to be "needed".)
Win32 was first beta tested by developers in 1992 and has had to remain stable and backwards compatible with those first early-90s versions. There are still 32-bit apps written in 1992 that are expected to run unmodified on Windows 10 today. The last processor-architecture-related API drop that Windows has been allowed by public perception and corporate strategy was when Win16 support was dropped on 64-bit processors. (Hence why Windows on ARM has struggled and the current iteration of Windows on ARM now includes a 32-bit x86 emulator as a core feature.)
Mac apps did have to be upgraded; it's just that Apple has been much better at requiring upgrades. There's really no comparison here. There is no way you can possibly find today a version of macOS that still supports Mac Classic applications unmodified from the 90s (for instance '94's Glider Pro v1) with High DPI support, yet Windows absolutely must run applications from Windows 95. Sure, Windows sometimes still stumbles in High DPI support for pixel-perfect applications written three decades ago in the 90s, but it at least tries; macOS shrugged and gave up.
Scaling just works great as Apple implements it, especially if your screen has around 200 ppi or more. Trying to change font sizes across all applications is a Sisyphean task, and eventually you end up with one that doesn't behave well, especially if one of the applications is Linux running in a VM. I have a Dell 24" 4K; for 4K at 2x, the screen should be about 22", but with macOS I can set it to a virtual resolution of about 2300 wide, which works great.
On my 15" MB Pro, I usually use the 1680 resolution, but if needed, I can quickly change it to 1920.
Been developing software for 30+ years, with up to 2 extra monitors, and I always go back to a single display. I'll put myself up against anybody with multiple displays. The thing is, I just alt-tab (or otherwise switch) to different things; it's not like I can actually perceive more than 1 window at a time, for the most part. The head twisting to see multiple displays bugs me, as do the eye movements across a huge display. Plus there are just more opportunities for distractions with more display area.
Finally, I work most effectively when I hold what I need in my head, and find that multiple displays fights against that need.
The only real exception to that is if I'm writing front end code of some sort, having the live updates helps a ton.
Lastly, if I could have a bigger display on my laptop, I'd be up for that!
I used to feel this way. I found it depends on the work.
A lot of my old work didn't depend heavily on looking at terminal output constantly.
These days, I have to have about 3-4 things running. The extra screen just becomes a log window.
You can argue the software should be redesigned, and I do too, but the problem remains for me, and a second screen, while not solving it, alleviates it a lot.
It might have something to do with using a twm, might not.
All I know is that I don't miss the days of tab cycling and desktop flipping.