I had the same realization maybe 7 or 8 years ago, and it holds true for me. Using multiple monitors is like constantly moving your hands away from the keyboard and hunting for the mouse; the same thing happens when I look for anything across a 2- or 3-monitor setup. It just weakens your focus... If I'm on a laptop (and I usually am), I just use the laptop's screen. Getting used to a different screen size while working with my laptop setup would also slightly slow me down when I'm bound to use only the laptop.
When I started development, 15" monitors were standard and 17" were a luxury. Setting up multiple monitors was a must, or you would have to constantly switch between editor, terminal, and documentation.
If you look at the setup, that's exactly what is happening. On a single screen there is a browser with documentation (presumably), and there are four (!) IDE/editor/terminal windows.
It's not about how many monitors you have but rather what you are doing with them. You can put Netflix in one of the windows and still get distracted on a single-monitor setup, or you can have bad eyesight like me and need two 27" monitors side by side so you can magnify fonts and still have documentation, IDE, and terminal visible at the same time without strain.
Just to clarify: there are two i3 containers side by side (each using 50%). The left one’s active window is a browser, the right one’s active window is Emacs, in which I have two buffers side by side at the top, and compilation mode and magit status side by side at the bottom.
Writing this comment, though, I realize that a better alt-tab switcher would be a godsend. I keep getting confused with window order sometimes and I'm not sure why, maybe I should write a better switcher.
EDIT: Turns out it's trivial, I love Linux:
Just right-click any window's title bar, go to "Special Application Settings" -> "Arrangement and access" -> "Shortcut". You can manually type "F8" in the shortcut field and it'll work just fine.
You can also go to "System Settings" -> "Window Rules" to manage all rules you've defined.
The idea was to be able to quickly select things to be made visible on both monitors.
The two monitors are named primary and secondary.
To select something to be visible on primary you would press the combination (for example alt+f1) and the selected desktop would be placed on primary. Whatever was on primary would be pushed to secondary.
To select both primary and secondary you would first press what you want on secondary and then what you want on primary.
The idea is that people typically have what I call "scenes". A scene is a particular arrangement of windows on both monitors.
This means that you can quickly learn your "scenes" and get them to be visible very quickly, in a fraction of a second, without having to hard-code the scene in your config file.
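A minimal sketch of that selection logic, just to illustrate the idea (the show_on() call below is a placeholder, not a real window manager API):

    # Hypothetical sketch of the "push to primary, spill to secondary" idea.
    class Monitors:
        def __init__(self):
            self.primary = None
            self.secondary = None

        def select(self, workspace):
            """Put `workspace` on primary; whatever was on primary moves to secondary."""
            if workspace == self.primary:
                return  # already where we want it
            self.secondary = self.primary
            self.primary = workspace
            self.show_on("primary", self.primary)
            if self.secondary is not None:
                self.show_on("secondary", self.secondary)

        def show_on(self, monitor, workspace):
            # Placeholder: in a real setup this would be an IPC call to the WM.
            print("showing workspace %s on %s" % (workspace, monitor))

    # Pressing e.g. alt+f2 then alt+f1 leaves workspace 1 on primary, 2 on secondary.
    m = Monitors()
    m.select("2")
    m.select("1")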
Sorry if this is really well known; I'm new to MS Windows as a user.
I use dual monitors, but have two virtual desktops, giving me 4 "screens" to lay out windows in. I keep the same window in the same place, but I often have to use 3 docs at once and so would love a 3rd screen. Then I'd have reference material on 1), the doc I'm responding to on 2), the doc I'm writing in also on 2), and citations on 3).
On Mac, Quicksilver with Triggers: https://qsapp.com/
I have also configured the window to always stay on top and to resize to a small window on the bottom right, so if I press escape while the window is focused, I get picture-in-picture.
Also, mpv exits when it's done, so it's a great way of watching a video in full screen on the second monitor at the press of a button.
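For reference, a small sketch of that kind of launcher; mpv's --fs, --fs-screen and --ontop options do exist, but the screen index here and the way you'd bind it to a key are just assumptions:

    #!/usr/bin/env python3
    # Sketch: play a URL full screen on the second monitor; mpv exits when done.
    # Assumes mpv is installed; screen index 1 is only an example.
    import subprocess
    import sys

    def play_on_second_monitor(url):
        subprocess.run(
            [
                "mpv",
                "--fs",           # start in full screen
                "--fs-screen=1",  # which monitor to use for full screen
                "--ontop",        # keep the window above others when windowed
                url,
            ],
            check=True,
        )

    if __name__ == "__main__":
        play_on_second_monitor(sys.argv[1])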
It doesn't matter if I use two 27" 1440 monitors or a 32" 4K monitor.
Personally, a single 27" 1440 monitor is the bare minimum I am comfortable with in a desktop.
So yes, while in some situations (like debugging a complicated app, or for some "standby" information) multiple monitors are a must, for general use it ends up being too distracting.
Not my case. I don't leave my stuff lying around. My apps are in the menu, running ones in the dock (Mac or Linux), and when I do need a few windows open at the same time (the second one being a terminal, a mobile emulator, or whatever), I arrange them to fit the same screen. My smallest screen is 15" on my ThinkPad, 17" on the MBP, and an external Dell 23". I've had no issues so far with my workflow.
For dev work, though, I typically need to see multiple things simultaneously: input (IDE/editor), output (console/browser), and instrumentation (debugger etc.).
I'm one-monitoring these days (WFH) because my laptop can't output to more than 1 display without a docking station -- I had left mine at the office. The context switching is tiring to say the least. I'm making do by tiling my windows but with a 24" display, there's just not enough room.
I think the author's setup works because he's using a 31" monitor and a tiling window manager. Most window managers don't tile as nicely.
That said, there are tools like Terminator and tmux, which are always useful.
While you couldn't run a game with it, it worked great for work. They seem to have some embedded video card.
(Mine is no longer available, but this looks similar: https://www.newertech.com/products/vidu3dvia.php)
As you mention, I find using a laptop screen for development really painful now.
For many tasks, one screen is enough/better, but sometimes it makes things a lot easier when you can look at them without having to rearrange your windows.
 Imagine something like this: https://future99.en.alibaba.com/product/60091134705-80040621...
I have bad eyes, and use HUGE fonts and can basically only get 1 window per monitor...
so I have 3 monitors setup:
middle for editor
left for browser/docs
right for email, slack, irc
Portrait 1080x1920 22" Dell. Landscape 4K LG. Landscape 4K LG. Plus the macbook pro screen, but I hardly ever put stuff on it.
I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch. The standby light is bright white and flashes slowly, you have to switch the screen off to stop it. Which is a huge #firstworldproblem, I know, but no other monitor I've used has done this.
I wish I had gotten a higher-refresh-rate screen with less than 4k resolution. Would really like to play some games at faster than 60fps.
I find using a dark theme for everything really helps... Maybe I'm returning to my roots, but I use a lot of ambers, greens, and yellows for good contrast without blinding myself. If you're a Vi(m) user, the 'elflord' colorscheme comes pre-installed on a lot of machines (just ':colorscheme elflord' to try it). It's a decent starting point.
On the other hand, I never seem to have enough light on my workbenches. I've fallen in love with ultra-bright LED fixtures from Home Depot for work-area lighting... Damn screws get smaller every year!
(extendable magnetic 'wand' is awesome for finding them when you drop them...)
> I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch.
I run 3 x LG 27UK650_600. Most of my gaming tends to be MMORPG stuff, so I don't need anything more than 60Hz. But I am wondering how they'll handle Cyberpunk 2077.
For what it is worth, not too long ago I came across the concept of “portable monitors”. Took a risk on a 4K 15” model and it ended up being decent quality. Skinnier than a #2 pencil and perfect size when propped next to my 15” MBP, without getting in the way.
edit: Almost forgot to mention this — it is powered via USB-C cable. One end to the monitor and the other into my 15” MBP (13” MBP may not have enough “juice” to power).
Have you looked at installing the patched pulseaudio-bluetooth modules to gain LDAC support? :)
Has anybody looked into how you'd make a DIY NFC tag for this?
As far as I can tell making a tag that announces speakers is straightforward.
I'm trying to go the other direction, where the laptop/PC/tablet doesn't support NFC pairing.
Good news is that there's some more activity towards upstreaming them; at least in my experience, they Just Worked(TM) as far as the system automatically selecting LDAC or AAC as needed.
Honestly: I tried i3 and it wasn't for me. The psychology of slightly overlapping windows (i.e. "just put this over here, out of the way but still visible" as a way of making a physical reminder for myself) is just too much part of my mental model to give up.
That works for me, but I understand it’s not for everyone :)
i3 is solidly in the "pry it from cold, dead hands" category.
I have one, also 31.5" which is 3840 x 2160, without any scaling, and the text is sharp as fuck. Given my experience the only thing I'd do if I bought another monitor given what I know now is to buy one same res but larger because 3840 x 2160 is a lot of pixels in a small space. A single black pixel on a white background from 60cm away is very, very close to invisible (just tried it).
Just like switching from FHD to 4K (on a laptop). You won't be blown away immediately when you first use it, until you go back to FHD and suddenly you realize how much nicer the 4K screen was.
I run a 43" 4K display for work an I wish it was 8K, because when switching from my 15" 4K laptop to the 43" 4K desktop setup suddenly the desktop monitor doesn't look that crisp anymore.
Meanwhile, Linux has made life miserable for 4K display owners due to the lack of non-integer scaling. A 5K display with 200% scaling at 27" would help the situation, but we don't have that option.
Many times I have considered jumping onto the macOS boat, just for the high-DPI display availability and scaling support.
I'm reading this thread using the LG 5K monitor with Linux, mostly working fine.
I used one of them for a few years.
A colleague of mine uses 2 HP Z27q 5K displays and is happy with them: https://support.hp.com/us-en/document/c04591534
The bad thing about the 8K is needing a $500 video card, which also means it's not an option for laptop users (I guess you could with an eGPU).
It is doable, but it is expensive, and it may or may not work properly. One example that I remember is one where some users who attempted the above setup reported that they need to unplug & re-plug the monitor after it sleeps.
Today a <$200 GTX 1650 can output 7680x4320@120Hz, or 60Hz over a single DP link.
And I don't see why a laptop with a proper GPU and TB3 dock couldn't drive it either.
- Chrome, to which you need to pass a command line arg with your (not necessarily integer) scaling factor
- Emacs, which just reads the DPI from Xresources
- xmonad/xmobar, which barely have user interfaces but where you can specify pixel sizes for fonts if they get it wrong
- terminal emulator, which I think just looks at Xresources for DPI
Maybe other programs don’t work so well.
I had no problems at all setting 1.3 scaling in monitor settings (Fedora/Gnome 3)
Personally, though, I prefer the 5K2K 34" screens, like the LG 34WK95U or MSI Prestige PS341WU.
Newer 5K displays like the two I mentioned (including LG) do support Windows. You don't need any hacks as long as you have TB3 support.
There is also the ProLite XB2779QQS-S1 and Planar IX2790 if you want full 5K at 27".
If you sit really close to your monitor then whatever. Can't imagine it's good for posture or your eyes to be focusing so close so often.
I don't need high performance or gaming or 3D, just full resolution at 60Hz - as he is using his - but three of them ...
Again, retina refers to the minimum resolution, at a given viewing distance, where individual pixels are not recognizable.
The part about the viewing distance is essential. As I sit at my desk, my monitor is about a full arm's length away from me. When I use my phone, its distance from my eyes is about 1/3 that of the monitor (ish).
Thus, the DPI required for my phone to have its pixels indistinguishable is a lot higher than what my monitor needs to achieve the same. So in reality, DPI is a fuzzier description of experience than 'retina' is, even if it is an annoying, trademarked, marketing buzzword (which it is, no arguments there).
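To make that concrete, here is the back-of-the-envelope calculation, using the common ~1 arcminute visual acuity threshold (the distances are just rough estimates matching the above):

    import math

    def retina_ppi(viewing_distance_inches, arcminutes=1.0):
        """Minimum pixel density at which a single pixel subtends less than
        `arcminutes` of visual angle at the given viewing distance."""
        pixel_size = viewing_distance_inches * math.tan(math.radians(arcminutes / 60.0))
        return 1.0 / pixel_size

    print(round(retina_ppi(30)))  # monitor at arm's length (~30"): about 115 PPI
    print(round(retina_ppi(10)))  # phone at roughly a third of that: about 344 PPI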
And he didn't pay for slightly crisper text, he paid for higher productivity. At $100/hour that 25 to 50 cents is trivial to make up. Which is why devs should never, ever skimp on hardware.
On the other hand, it does require additional resources being wasted, both for producing the items -- monitors/GPUs indeed do have a long lifespan, which means that the previous ones would still work -- and increased energy used per hour. So saying that any $ spent on hardware can't be wasted for devs...
And whether it will actually result in increased productivity is a good question. Readability, eye strain, enjoyment all factor in. Plenty of studies that only focus on one aspect, so easy to cherry pick a conclusion...
Again, I don't believe this is more wasteful than a few spa days or a vacation, so good for the OP. My initial post was about how that works out for them (or others chiming in). I find myself not too affected by this, i.e. my last big jump was from a 21" CRT to a 24" Dell, especially when it comes to the simple shapes of monochrome fonts.
In all seriousness, I have great respect for Stapelberg and I thank him for posting this. I love seeing other dev's setups and learned a few things in his post.
Hopefully he understands that this is not a true backup... he's basically created a more complex RAID 1 running across two NASes.
For example, a flood, a house fire, theft, or a power surge will probably take out or fry both, and then everything is gone. This is a perfect example of thinking you have something backed up, but if you store it in the same place, you've just created a more complex RAID 1...
If you want a backup, you need to store a copy in a different location, preferably a different region, because if you put it at your friend's house down the road they are also likely to be hit by the same disaster (e.g. flood, tornado, hurricane, war, etc.) as you are.
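For what it's worth, even a nightly push to a machine somewhere else covers those disaster cases; a minimal sketch (the host name and paths are made up, and you'd run it from cron or a systemd timer):

    #!/usr/bin/env python3
    # Sketch: push a copy to an off-site machine over SSH using rsync.
    # The host and paths are placeholders.
    import subprocess

    SOURCE = "/srv/data/"  # trailing slash: sync the directory's contents
    DESTINATION = "backup@offsite.example.com:/backups/home-nas/"

    subprocess.run(
        [
            "rsync",
            "-a",        # archive mode: preserve permissions, times, symlinks
            "-z",        # compress in transit
            "--delete",  # mirror deletions (consider snapshots on the receiving end)
            SOURCE,
            DESTINATION,
        ],
        check=True,
    )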
A backup to one computer is better than none.
A backup to two computers is better than one.
A backup to four computers in two geographic regions is better than two local.
There's no end to how good a person's backup can be. Everyone has their limit.
The redundancy is just for home, because I don’t want to wait for replacement hardware for my setup to work.
I’m also really liking the USB-C power delivery, so it’s one cable to the work laptop, and from there, it’s got a USB hub for the keyboard/mouse.
The killer feature is the built-in KVM switch: by switching the input to HDMI, I can switch the keyboard/mouse to my mini and use it when I’m not at $day_job.
You just have to remind yourself to bring the critical eye so that you don't start wearing black turtlenecks believing it'll make you a business genius :)
I got an ErgoDox EZ last year, put some nice heavy Kailh Box switches in it... now I just gotta find time to learn it (and Dvorak/Colemak).
Currently using a Vortex Cypher with the split spacebar, which I tore down and rebuilt - new switches, added LEDs (south-facing mounts so they're useless, but I figured I'd try), and new keycaps.
I tried to love my Ergodox, but I've spent way too many years with the scooped keybed of the Kinesis.
I looked pretty closely at the UHK, but I think it would be the same issue as the Ergodox -- I'm too used to the scooped keybed of the Kinesis, which makes the reach between e.g. K and 8 short enough that I notice when it's not there on a flat keybed.
I should really try a kinesis sometime. Everyone that has one seems to love them...
The ergo will be something new for me ...
New layout and my first ortho-linear.
I only need relief from clicking if I do it a lot (and then I map it to the keyboard for my other hand for a week or so).
My progression has been the 2560x1440 Thunderbolt display to the Dell P2715Q, which has been my main monitor at work and home for the past 5 years.
I've tried to upgrade from the P2715Q three times now, and always end up going back to it because I'm dissatisfied by the quality of the other panels in comparison.
I want a larger screen that can support 4K, or even 5K, with comparable panel quality. I avoided the LG 5K display sold by Apple because I heard bad things about it, but never actually tried one. It's 27" as well, so I think it would still be too small, even if the panel quality and resolution are better.
I'm starting to think I should just get a TV and use that as my display, but I don't know which would best suit my needs for writing software. One nice thing about a TV is that it would be much easier to do returns/swaps if I need to play the panel lottery for something that will make me happy. I have never done this with a previous display, since I always bought second-hand from people who had already done that for me. The original owner of my P2715Q apparently returned theirs three times before they were satisfied.
I've toyed with the idea of getting an Apple Pro display, but I really don't want to spend more than $2000 on a monitor.
The UP3218K described in the article seems like it would be a good upgrade, but it definitely exceeds my price range. I would be open to spending that much on a monitor if I knew I wouldn't be replacing it for at least five years, but I can't know that for sure. Although funnily enough, that's what has happened with my P2715Q. It's the only piece of computing equipment I haven't replaced since I first started using it.
- Behringer UM2 USB Audio Interface
- Behringer Ultravoice XM8500 Dynamic Cardioid mic
This would give the upgradability of an XLR audio interface and a seemingly decent mic for about $125. I imagine once everything is plugged together, it similarly comes down to just plugging in a USB, and I'd be surprised if it wasn't plug-n-play on Linux. I don't know how well the audio quality would compare, though. I'd be interested to hear more about the challenges of getting a working setup with XLR audio gear that are alluded to in the post.
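If the UM2 is class-compliant USB audio (it appears to be), checking that Linux sees it as a capture device should be straightforward; a small sketch, assuming the third-party sounddevice package is installed:

    # Sketch: list audio capture devices to confirm the interface is detected.
    # Assumes `pip install sounddevice` (which needs PortAudio).
    import sounddevice as sd

    for index, device in enumerate(sd.query_devices()):
        if device["max_input_channels"] > 0:  # only devices we can record from
            print(index, device["name"], device["max_input_channels"], "inputs")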
Scrolling through the site, I see the recommendation of the Noctua NF-A12x25. Probably 3x what I paid for the fan that is being replaced, but it could be the only component in the case that stays during the next rebuild!
Just ordered a Be Quiet! 140 mm Shadow Wings 2 for the front and a 120 mm for the back.
From everything I've read, they're the quietest. And are cheaper. But might move less air, due to being limited to 900 rpm.
In general, yes, I find it worthwhile to invest money and effort into a faster experience, but I’m very sensitive to latency, so YMMV :)
I recently rebuilt my i7-3770 w/ SATA3 SSD into a Ryzen 3700X with NVMe and build times are just remarkably faster. I waited far too long.
For anyone who's been building machines in the past year or so, we've just had the biggest CPU performance gains in literally 10 years with the latest AMD CPUs. I went a long time with an E3-1230, but got really bored after 5 years and built an i7-4790K ITX machine, which wasn't as big of a change as going from 8 GB of RAM to 16 GB or to my first SSD. I got a 3900X this past year, and the substantial performance gains are not a big enough reason for me to upgrade yet again, compared to something like the new Ampere GPUs for even hobby machine learning purposes like mine. In the future, a bigger reason for me to build again would be space efficiency: a small case that could fit on top of my desk and drive 4K AAA games, encode HEVC video day and night, all with a 600W SFX PSU.
I recently built a desktop around the 3900X. I'm happy with the performance, but coming from a decade on MacBooks, I'm feeling the FOMO about my choice now :)
I like your idea with the USB hub to switch between computers — I will probably adopt that.
I do experience the stuck-key issue referenced (annoying, but easily worked-around and infrequent), going to look into swapping the PCB as linked.
I’ve had greater success with the UHK in tenting configuration and as wide as my shoulders.
If you have a keyboard programmable with QMK, you can make the upper layers work in hardware too, though (you still need to decide which OS layout you want to map it to).
Will add this detail as one thing to check out, thanks!
I'll answer it literally: seeing what equipment other engineers use can give ideas for one's own setup. stapelberg is an effective engineer, so I'm glad he shared it.