Hacker News
My 2020 desk setup (stapelberg.ch)
170 points by secure 11 months ago | 155 comments

"I found that using only one monitor allows me to focus more on what I’m doing, and I don’t miss anything about a multi-monitor setup."


I had the same realization maybe 7 or 8 years ago, and it has held true for me. With multiple monitors, it's like constantly moving your hands away from the keyboard and looking for the mouse; the same thing happens to me when looking for anything across a 2- or 3-monitor setup. It just weakens your focus... If I'm on a laptop (and I usually am), I just use the laptop's screen. Getting used to a different screen size at my desk would also slightly slow me down whenever I'm bound to use only the laptop.

Except when you have a huge monitor and lay stuff out in multiple side-by-side windows, which makes the whole argument rather moot. And when you use a laptop you are more or less forced to use a single screen, because multiple screens are inconvenient at best.

When I started development, 15" monitors were standard and 17" ones were a luxury. Setting up multiple monitors was a must, or you would have to constantly switch between editor, terminal and documentation.

If you look at the setup, that's exactly what is happening. On a single screen there is a browser with documentation (presumably), and there are four (!) IDE/editor/terminal windows.

It's not about how many monitors you have but rather what you are doing with them. You can put Netflix in one of your windows and still get distracted on a single-monitor setup, or you can have bad eyesight like me and require two 27" monitors side by side so that I can magnify fonts and still have documentation, IDE and terminal visible at once with no strain.

> If you look at the setup, that's exactly what is happening. On a single screen there is a browser with documentation (presumably), and there are four (!) IDE/editor/terminal windows.

Just to clarify: there are two i3 containers side by side (each using 50%). The left one’s active window is a browser, the right one’s active window is Emacs, in which I have two buffers side by side at the top, and compilation mode and magit status side by side at the bottom.

I have three monitors, but I only use one (a 24" one, the second is for when I want to watch some show or movie while doing stuff on the main one, and the third I basically never use). I have my apps always maximized/fullscreen in the main monitor and I alt-tab between them. I never could get into having multiple apps on-screen at the same time, since I can only focus on one.

Writing this comment, though, I realize that a better alt-tab switcher would be a godsend. I keep getting confused with window order sometimes and I'm not sure why, maybe I should write a better switcher.

If you’re on macOS https://contexts.co/

Thank you! That looks very nice, unfortunately I'm on Linux, but something like that is bound to exist.

Try i3 as a window manager. Put every app in a separate single window, and instead of alt-tab you use, e.g., alt+1 for the terminal, alt+2 for the browser, etc. You can have as many workspaces as you need and assign applications to specific workspaces. This way I always know where every app or category of app lives. And this mostly works on the default i3 config, with no learning curve.
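A minimal i3 config fragment along those lines might look like this (the window class names are examples; check the real ones on your system with `xprop`):

```
# ~/.config/i3/config (fragment)
# Mod1 is Alt: jump straight to a workspace instead of alt-tabbing
bindsym Mod1+1 workspace number 1
bindsym Mod1+2 workspace number 2

# always open these apps on "their" workspace
assign [class="URxvt"] number 1
assign [class="firefox"] number 2
```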

I started off using Alt F1-F12 (like the console), but I've found zero problems using bare F1-F12 with no modifier.

That sounds interesting, I'll try that, thanks!

Does this complement an application like Alfred?

Yes! I use both together. I love that you can configure it to not show minimized apps when switching.

I work the same way, and having a keyboard shortcut set for each one of my apps has been game-changing for me. Worth trying out to see if it fits your flow:


That's a great idea, especially with my programmable keyboard, thank you. I'll have to see if KDE supports this, but I think it's likely.

EDIT: Turns out it's trivial, I love Linux:


KDE's window manager, KWin, supports this functionality built in.

Just right-click the title bar of any window, go to "Special Application Settings" -> "Arrangement and access" -> "Shortcut". You can manually type "F8" in the shortcut field and it'll work just fine.

You can also go to "System Settings" -> "Window Rules" to manage all rules you've defined.

That settings dialog is a lifesaver, thank you. I've been using devilspie2 to do the same, but now I can do it natively.

A better alt+tab switcher would be great! Why isn't that a thing? I'm imagining something where you could use alt+f1, alt+f2, alt+f3, etc. to switch to specific windows rather than just rotating between everything.
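On X11, a crude direct-to-window switcher along these lines can be scripted around `wmctrl` (a sketch, assuming `wmctrl` is installed; the `focus` helper would be bound to keys via your WM or `xbindkeys`):

```python
import subprocess


def find_window_id(listing: str, title_substring: str):
    """Return the X window id of the first window whose title contains
    title_substring, given the output of `wmctrl -l`."""
    for line in listing.splitlines():
        parts = line.split(None, 3)  # window id, desktop, host, title
        if len(parts) == 4 and title_substring in parts[3]:
            return parts[0]
    return None


def focus(title_substring: str) -> None:
    """Raise and focus the first window matching title_substring."""
    listing = subprocess.check_output(["wmctrl", "-l"], text=True)
    win = find_window_id(listing, title_substring)
    if win:
        subprocess.run(["wmctrl", "-i", "-a", win])
```

Binding `focus("Firefox")` to alt+f1, `focus("Terminal")` to alt+f2, and so on gives exactly the "go to this window" behavior, with no rotation involved.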

On Windows, you can effectively use Win+1-9 for that by pinning the applications to the taskbar. I use that all the time.

A long time ago I made a Python script to manage my 2-monitor i3 setup.

The idea was to be able to quickly select things to be made visible on both monitors.

The two monitors are named primary and secondary.

To select something to be visible on primary you would press the combination (for example alt+f1) and the selected desktop would be placed on primary. Whatever was on primary would be pushed to secondary.

To select both primary and secondary you would first press what you want on secondary and then what you want on primary.

The idea is that people typically have what I call "scenes". A scene is a particular arrangement of windows across both monitors.

This means that you can quickly learn your "scenes" and get them to be visible very quickly, in a fraction of a second, without having to hard-code the scene in your config file.
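The push-to-secondary behavior described above can be sketched in a few lines of Python on top of `i3-msg` (the output names are illustrative; a real script would read them from `i3-msg -t get_outputs`):

```python
import subprocess


class Scenes:
    """Show a workspace on the primary output; whatever was on primary
    gets pushed to the secondary output."""

    def __init__(self, primary="DP-1", secondary="HDMI-1", run=True):
        self.outputs = {"primary": primary, "secondary": secondary}
        self.on_primary = None
        self.run = run  # set False for dry runs / testing

    def _i3(self, cmd):
        if self.run:
            subprocess.run(["i3-msg", cmd], check=True)
        return cmd

    def show_primary(self, workspace):
        if self.on_primary and self.on_primary != workspace:
            # push the current primary workspace over to the secondary monitor
            self._i3(f"workspace {self.on_primary}; "
                     f"move workspace to output {self.outputs['secondary']}")
        self.on_primary = workspace
        return self._i3(f"workspace {workspace}; "
                        f"move workspace to output {self.outputs['primary']}")
```

With each `show_primary` call bound to a key, selecting a full scene is two keypresses: first the workspace you want on secondary, then the one you want on primary.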

On Win10 you can use win+1, win+2, ... to select the apps pinned to the taskbar; it launches them if they're not running, and if they are, it switches between their active windows. That sounds like what you're after.

Sorry if this is really well known; I'm new to MS Windows as a user.

I use dual monitors, but have two virtual desktops, giving me 4 "screens" to lay out windows in. I keep the same window in the same place, but often have to use 3 docs at once and so would love a 3rd screen. Then I'd have reference material on 1), the doc I'm responding to on 2), the doc I'm writing in also on 2), and citations on 3).

On Linux, I use jumpapp: https://github.com/mkropat/jumpapp

On Mac, Quicksilver with Triggers: https://qsapp.com/

If you can live with switching between different applications rather than windows, this is the default behavior on Windows 10 and Gnome 3 using win+1, win+2, ….

Yes, exactly! Or just a saner way of sorting the tab order would be nice. I'll look and see if there's one for Linux, there must be something...

This is what you do in Linux with i3.

Same here. I full-window everything and use my second monitor basically only for YouTube/Spotify. It's really useful when I need to keep some reference up too.

Oh you're going to love this: There's an extension that launches custom programs with the current page as an argument, and I use it to launch mpv, which is then configured to open full-screen on the second screen with a high-quality stream of the YouTube video I'm on.

I have also configured the window to always stay on top and to resize to a small window on the bottom right, so if I press escape while the window is focused, I get picture-in-picture.

Also, mpv exits when it's done, so it's a great way of watching a video in full screen on the second monitor at the press of a button.
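The mpv side of that setup can be sketched like this (the flags are real mpv options, but the screen index and URL are placeholders to adapt; mpv resolves YouTube URLs itself via youtube-dl/yt-dlp):

```python
import subprocess


def mpv_command(url: str, screen: int = 1) -> list:
    """Build the mpv invocation: full-screen on the given screen, on top."""
    return [
        "mpv",
        "--fs",                   # start full-screen
        f"--fs-screen={screen}",  # on this screen (0-based index)
        "--ontop",                # keep the window above others
        url,
    ]


def watch_on_second_screen(url: str) -> None:
    """Launch mpv without blocking the caller; mpv exits when playback ends."""
    subprocess.Popen(mpv_command(url))
```

A browser extension then only needs to invoke `watch_on_second_screen` with the current page URL as its single argument.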

link to said extension? :)

I don't remember the name right now and I'm not at the computer, but anything that will launch a program will work, the extension itself does very little.

I just checked, the extension is called "Open With":


For me, it's not about having more monitors. It's simply about having enough real estate to be productive.

It doesn't matter if I use two 27" 1440p monitors or a 32" 4K monitor.

Personally, a single 27" 1440p monitor is the bare minimum I am comfortable with on a desktop.

The main issue for me is figuring out which monitor the focus/mouse is on.

So yes, while in some situations (like debugging a complicated app, or for some "standby" information) multiple monitors are a must, for general use it ends up being too distracting.

"Except when you have a huge monitor and lay stuff out in multiple side-by-side windows, which makes the whole argument rather moot."

Not my case. I don't lay my stuff around. My apps are in the menu, running ones on the dock (Mac or Linux), and when I do need a few windows open at the same time (the second one being either a terminal, a mobile emulator, or whatever) I arrange them to fit the same screen. My smallest screen is 15" on my ThinkPad, 17" on the MBP, and the external is a Dell 23". I've had no issues so far with my workflow.

True for writing or any one-mode tasks.

For dev work, though, I typically need to see multiple things simultaneously: input (IDE/editor), output (console/browser) and instrumentation (debugger etc.).

I'm one-monitoring these days (WFH) because my laptop can't output to more than one display without a docking station, and I had left mine at the office. The context switching is tiring, to say the least. I'm making do by tiling my windows, but with a 24" display there's just not enough room.

I think the author's setup works because he's using a 31" monitor and a tiling window manager. Most window managers don't tile as nicely.

That said, there are tools like Terminator and tmux, which are always useful.

I bought one of those USB-to-DVI converters for a white polycarbonate MacBook that only had one external monitor connector.

While you couldn't run a game with it, it worked great for work. They seem to have some embedded video card.

(Mine is no longer available, but this looks similar: https://www.newertech.com/products/vidu3dvia.php)

Thanks for the tip! This sounds like it might work for me.

I've come to love the ultrawide 34" format @3440x1440 for "I want things side by side". Most of these can work as multiple monitors and split the screen if you plug in two cables, but you'll obviously have zero gap between them.

Same, that's what I have been using for GIS work, media production, and some light coding. Only thing I regret is not being able to play Xbox games at 21:9 (unless you know of a way).

I don't have an Xbox, so unfortunately no. The only occasional gaming I do is on Windows, and even that's rare these days.

When I started at my current company they gave me a single 43" 4k monitor. It felt ridiculously huge at first, but now I'm not sure I could ever go back to anything smaller. At the distance I sit from it, the ~102 PPI is perfect for 100% scaling. I bought a 43" Samsung TV to use at home.

As you mention, I find using a laptop screen for development really painful now.

ooc, what is the distance you sit from it?

About 30 inches.

Actually, I like having two screens. However, I arrange them in vertical order and one is my primary screen and the other is like additional space I can use when I need it.

For many tasks, one screen is enough/better, but sometimes it makes things a lot easier when you can look at them without having to rearrange your windows.

Yeah, one big screen is great. But my kitchen table isn't big enough to fit a large screen, as I need to sit at the short end: the landlord decided it was a great idea to buy one of those weird tables with a little storage shelf thingy [1] below it, so I keep hitting it with my legs.

[1] Imagine something like this: https://future99.en.alibaba.com/product/60091134705-80040621...

I have the opposite problem...

I have bad eyes, and use HUGE fonts and can basically only fit 1 window per monitor... so I have a 3-monitor setup: middle for the editor, left for browser/docs, right for email, Slack, IRC.

Similar to you, I've had to increase text sizes gradually over the past decade. Currently I have 3 monitors, from left to right:

Portrait 1080x1920 22" Dell. Landscape 4K LG. Landscape 4K LG. Plus the macbook pro screen, but I hardly ever put stuff on it.

I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch. The standby light is bright white and flashes slowly, you have to switch the screen off to stop it. Which is a huge #firstworldproblem, I know, but no other monitor I've used has done this.

I wish I had gotten a higher-refresh-rate screen with less than 4k resolution. Would really like to play some games at faster than 60fps.

>Similar to you, I've had to increase text sizes gradually over the past decade. Currently I have 3 monitors, from left to right

I find using dark themes for everything really helps... Maybe I'm returning to my roots, but I use a lot of ambers, greens and yellows for good contrast without blinding myself. If you're a vi(m) user, I found the 'elflord' colorscheme comes pre-installed on a lot of machines (just ':colorscheme elflord' to try it). It's a decent starting point.

On the other hand, I never seem to have enough light on my workbenches. I've fallen in love with ultra-bright LED fixtures from Home Depot for work-area lighting... Damn screws get smaller every year! (An extendable magnetic 'wand' is awesome for finding them when you drop them...)

> I'm not a fan of the LG monitors I got, model "27UD68-W" in 27 inch.

I run 3 x LG 27UK650_600. Most of my gaming tends to be MMORPG stuff, so I don't need anything more than 60Hz. But I am wondering how they'll handle Cyberpunk 2077.

To be fair, i3 is so good that having multiple monitors becomes unnecessary. It's much faster to open a second or third window on the same screen than to look over at the other screen.

I share the same perspective.

For what it is worth, not too long ago I came across the concept of “portable monitors”. Took a risk on a 4K 15” model and it ended up being decent quality. Skinnier than a #2 pencil and perfect size when propped next to my 15” MBP, without getting in the way.

edit: Almost forgot to mention this — it is powered via USB-C cable. One end to the monitor and the other into my 15” MBP (13” MBP may not have enough “juice” to power).

Agreed. I’ve just dumped my second monitor as well. I’m now down to one Iiyama 22” 1080p unit. I’m considering replacing it with a larger 4K monitor but am demotivated to do this because I don’t want to futz with it all now.

Ditto with workspaces. A great idea, in theory, but all windows end up in one place eventually.

The WH-1000XM3 are just about the only commonly recommended Bluetooth headphones that still make switching devices this much of a pain. Most will either automatically switch between a few active devices or let you switch by just initiating the connection from the target device.

Have you looked at installing the patched pulseaudio-bluetooth modules to gain LDAC support? :)

I use NFC pairing when it's available. Unfortunately most audio sources don't support it.

Has anybody looked into how you'd make a DIY NFC tag for this?

I don't know the specifics here but you can buy blank NFC tags and I used NFC Tools PRO for writing tags with my android phone, https://play.google.com/store/apps/details?id=com.wakdev.nfc... .


As far as I can tell making a tag that announces speakers is straightforward.

I'm trying to go the other direction, where the laptop/PC/tablet doesn't support NFC pairing.

Have not looked at any fancy pulseaudio setups. I don’t want to maintain them on my machines over time, as I use these headphones with 4 different devices :)

Fair enough! For other people coming across the thread, the repo for the modules is here (sorry for not posting this in the original post): https://github.com/EHfive/pulseaudio-modules-bt

Good news is that there's some more activity towards upstreaming them; at least in my experience, they Just Worked(TM) as far as the system automatically selecting LDAC or AAC as needed.

"Probably another macbook setup" "Oh, nope, lots of terminals." "Hey, is that i3?" Then I realized who the author was.

Honestly: I tried i3 and it wasn't for me. The psychology of slightly overlapping windows (i.e. "just put this over here, out of the way but still visible" as a way of making a physical reminder for myself) is just too much part of my mental model to give up.

Have you tried stacking mode? The title bar will still be visible, while the window is out of the way. See it in action if you zoom into my monitor on https://michael.stapelberg.ch/Bilder/2020-05-22-desk-setup.j...

That works for me, but I understand it’s not for everyone :)

Thanks for creating i3, have been using it for years -- stacked tabs are indeed great, as is basically everything else in this wondrous WM.

i3 is solidly in the "pry it from cold, dead hands" category.

Also want to give you a big thank you for creating i3. After using it for a long time I can’t imagine going back to a setup without it. I almost always need 2 windows side by side and it makes that workflow so effortless.

Just another big thanks for i3, it has significantly changed my workflow for the better!

His monitor resolution is unnecessary AFAICS.

I have one, also 31.5", which is 3840 x 2160 without any scaling, and the text is sharp as fuck. Given my experience, the only thing I'd change if I bought another monitor is to get the same resolution but larger, because 3840 x 2160 is a lot of pixels in a small space. A single black pixel on a white background from 60cm away is very, very close to invisible (just tried it).

I guess it's one of those things that you need to use for a while to 'get it'.

Just like switching from FHD to 4K (on a laptop). You won't be blown away immediately when you first use it, until you go back to FHD and suddenly you realize how much nicer the 4K screen was.

I run a 43" 4K display for work and I wish it were 8K, because when switching from my 15" 4K laptop to the 43" 4K desktop setup, suddenly the desktop monitor doesn't look that crisp anymore.

An 8K monitor sounded ridiculous until I realized that's basically Retina at 32-34".

It is so sad that for those of us not in the Apple ecosystem, it is impossible to find a desktop HiDPI display. We have been stuck at 4K for the past 5 years. This Dell is the only "option", but I don't know how many of us can afford spending $5k on a monitor with a lifetime of ~3 years that has not shown great reliability.

Meanwhile, Linux has made the lives of 4k display owners unlivable due to the lack of non-integer scaling. A 5k display with 200% scaling at 27" would help the situation, but we don't have that option.

Many times I have considered jumping on to the MacOS boat, just for the high dpi display availability & scaling support advantage.

> A 5k display with 200% scaling at 27" would help the situation, but we don't have that option.

I'm reading this thread using the LG 5K monitor with Linux, mostly working fine.

I have two 5K 27" displays with 200% scaling on Linux: a perfect HiDPI setup. Look up the Planar IX2790.

The Dell P2415Q would be my runner-up recommendation: https://www.dell.com/en-us/work/shop/dell-24-ultra-hd-4k-mon...

I used one of them for a few years.

A colleague of mine uses 2 HP Z27q 5K displays and is happy with them: https://support.hp.com/us-en/document/c04591534

HP and Dell made 5k monitors back in 2014, but they were both discontinued pretty fast.

Is there no way to use an LG 5K?

The bad thing about the 8K is needing a $500 video card, which also means it's not an option for laptop users (I guess you could with an eGPU).

There are some very hacky ways that involve special add-on cards for the motherboard, special motherboard model requirements, gpu connector requirements and even special software to adjust the brightness.

It is doable, but it is expensive, and it may or may not work properly. One example that I remember is one where some users who attempted the above setup reported that they need to unplug & re-plug the monitor after it sleeps.

The blog author is using a GTX 1060, that's a $400 video card that came out nearly 4 years ago.

Today a <$200 GTX 1650 can output 7680x4320@120Hz, or 60Hz over a single DP link.

And I don't see why a laptop with a proper GPU and TB3 dock couldn't drive it either.

He states he uses a 2070 now.

I'm fine using a HiDPI monitor with Linux, but this is probably because I only really use about four programs:

- Chrome which you need to pass a command line arg with your (not necessarily integer) scaling factor

- emacs which just reads the dpi from xresources

- xmonad/xmobar which barely have user interfaces but where you can specify pixel sizes for fonts if they get it wrong.

- terminal emulator which I think just looks at xresources for dpi.

Maybe other programs don’t work so well.

> Meanwhile, Linux has made the lives of 4k display owners unlivable due to the lack of non-integer scaling.

I had no problems at all setting 1.3 scaling in monitor settings (Fedora/Gnome 3)

This is just wrong. PCs work fine with 5K 27" displays like the LG Ultrafine 27MD5KA.

Personally, I prefer the 5K2K 34" screens though like LG 34WK95U or MSI Prestige PS341WU.

LG does not support non-Apple software/hardware for that monitor. You can hack your PC with TB cards and special motherboards that support those cards to get a signal, sometimes. That is far from "works fine".

Probably just because it's older.

Newer 5K displays like the two I mentioned (including LG) do support Windows. You don't need any hacks as long as you have TB3 support.

There is also the ProLite XB2779QQS-S1 and Planar IX2790 if you want full 5K at 27".

Yes, exactly! You get a lot of retina screen real estate that way :)

It's still ridiculous. The only people I see using these monitors to their full ability are sitting less than a foot away from the monitor. Even then, kinda overkill.

If you sit really close to your monitor then whatever. Can't imagine it's good for posture or your eyes to be focusing so close so often.

I'm curious, as a user of a triple-monitor desktop setup: is there a single video card that can drive 3 of these Dell 8K monitors he uses?

I don't need high performance or gaming or 3D, just full resolution at 60hz - as he is using his - but three of them ...

May have to wait for the next generation nvidia cards that may come with DP 2.1(?). That'll potentially let you drive 4x8k screens with just 4 cables.
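Some back-of-the-envelope numbers on why that is (the link payload rates below are rounded from the DisplayPort specs):

```python
def raw_gbit_per_s(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Uncompressed video payload in Gbit/s, ignoring blanking overhead."""
    return width * height * hz * bits_per_pixel / 1e9


eight_k_60 = raw_gbit_per_s(7680, 4320, 60)  # ~47.8 Gbit/s at 8 bits per channel

# DP 1.4 (HBR3) carries roughly 25.9 Gbit/s of payload, so 8K60 needs
# DSC compression or two cables per screen.
# DP 2.x (UHBR20) carries roughly 77.4 Gbit/s, so 8K60 fits on a single
# cable, even at 10 bits per channel (~59.7 Gbit/s).
```

Hence "4 screens with just 4 cables": one DP 2.x link per 8K panel, instead of the dual-cable arrangement current 8K monitors require.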

...as if "retina", a marketing term, meant anything in particular other than "higher-than-usual density".

It does, actually. It's supposed to describe a display where the individual pixels aren't discernible at normal viewing distance.

I prefer numbers and measurement units to trademarked, fuzzy descriptions of experiences. Since we're talking about display density, DPI is the term I want to see used.

That's all well and good, you're entitled to your preferences. But it doesn't change the fact that it's not a 'fuzzy' description of experience. DPI alone isn't enough information to describe what it is meant to define.

Again, retina refers to the minimum resolution, at a given viewing distance, where individual pixels are not recognizable.

The part about the viewing distance is essential. As I sit at my desk, my monitor is about a full arm's length away from me. When I use my phone, its distance from my eyes is about 1/3 that of the monitor (ish).

Thus, the DPI required for my phone to have its pixels indistinguishable is a lot higher than what my monitor needs to achieve the same. So in reality, DPI is a fuzzier description of experience than 'retina' is, even if it is an annoying, trademarked, marketing buzzword (which it is, no arguments there).
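That definition can be made concrete with the common one-arcminute visual acuity rule of thumb (the threshold itself is an approximation, and eyes vary):

```python
import math


def retina_ppi(viewing_distance_inches: float) -> float:
    """PPI above which adjacent pixels subtend less than one arcminute,
    i.e. become indistinguishable for typical visual acuity."""
    one_arcmin = math.radians(1 / 60)
    pixel_pitch = viewing_distance_inches * math.tan(one_arcmin)
    return 1 / pixel_pitch


# phone held at ~12 inches: ~286 PPI needed
# desktop monitor at ~30 inches: ~115 PPI needed
```

Which lines up with the thread: a 43" 4K panel (~102 PPI) viewed from about 30 inches sits right around that threshold, while the same resolution on a phone would fall far short.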

Is he me? Same CPU, CPU fan, SSD, monitor, RAM, case, WM, linux distro.. Except I never made the switch to vim from emacs.

In all seriousness, I have great respect for Stapelberg and I thank him for posting this. I love seeing other dev's setups and learned a few things in his post.

That's a lot of money between screen and GPU just to get crisper fonts (as the rest of the GUI is scaled 3x and thus you don't get that much more screen real estate). Is this really worth it for you? I regularly switch between a 5K iMac and a regular 1920x1200 display, and sure, I can see pixels, but I don't find myself caring that much.

Considering how many hours a day he likely uses his machine, it may work out to something like the difference between a few cents and 10 cents an hour.

That's only going to be true if the $4000 extra they spent is used for 40,000+ hours. It's likely closer to a dollar an hour more for slightly crisper text (assuming they use it full-time for 2 years).

The monitor/GPU have a lot more than 2 year lifespan. Depreciation is likely less than 50% in 2 years.

And he didn't pay for slightly crisper text, he paid for higher productivity. At $100/hour that 25 to 50 cents is trivial to make up. Which is why devs should never, ever skimp on hardware.
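For concreteness, the arithmetic both sides are gesturing at (the $4000 delta and the usage figures are the thread's assumptions, not actual prices):

```python
extra_cost = 4000                 # extra spent on monitor + GPU, per the thread

two_years = 2 * 250 * 8           # 2 years of 8-hour workdays = 4,000 hours
long_haul = 40_000                # the parent comment's 40,000-hour figure

cost_per_hour_short = extra_cost / two_years   # $1.00 per hour
cost_per_hour_long = extra_cost / long_haul    # $0.10 per hour
```

So the disagreement is really about the hardware's useful lifetime: over 2 years it's about a dollar an hour, over a decade closer to a dime.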

Okay, first of all let me say that I don't want to poo-poo anyone's choice of environment. If it "sparks joy" for hours every day, it's probably a good personal investment.

On the other hand, it does require additional resources being wasted, both for producing the items -- monitors/GPUs indeed do have a long lifespan, which means that the previous ones would still work -- and increased energy used per hour. So saying that any $ spent on hardware can't be wasted for devs...

And whether it will actually result in increased productivity is a good question. Readability, eye strain, enjoyment all factor in. Plenty of studies that only focus on one aspect, so easy to cherry pick a conclusion...

Again, I don't believe this is more wasteful than a few spa days or a vacation, so good for the OP. My initial post was about how that works out for them (or others chiming in). I find myself not too affected by this, i.e. my last big jump was from a 21" CRT to a 24" Dell, especially when it comes to the simple shapes of monochrome fonts.

Yes, I find myself caring a lot, hence the expensive gear :)

"For redundancy, I am backing up my computers to 2 separate network storage devices."

Hopefully he understands that this is not a true backup... he's basically created a more complex RAID 1 running across two NAS devices.

For example, a flood, a house fire, theft, or a power surge will probably take out both, and then everything is gone. This is a perfect example of thinking you have something backed up when, by storing it all in the same place, you've just created a more complex RAID 1...

If you want a backup, you need to store a copy in a different location, preferably a different region, because if you put it at your friend's house down the road, they are also likely to be hit by the same disaster (i.e. flood, tornado, hurricane, war, etc.) as you are.

TFA explains why he didn't use RAID1: "I put in one hard disk per device for maximum redundancy: any hardware component can fail and I can just use the other device.". If the motherboard dies on one device, he still has a working device.

A backup to one computer is better than none.

A backup to two computers is better than one.

A backup to four computers in two geographic regions is better than two local.

There's no end to how good a person's backup can be. Everyone has their limit.

Yes, I understand. I have an off-site backup for that reason, in a different part of the city: https://michael.stapelberg.ch/posts/2018-01-13-offsite-backu...

The redundancy is just for home, because I don’t want to wait for replacement hardware for my setup to work.

That's definitely an improvement and a cheap solution. I always have to remind people about offsite backups... I had to learn the hard way, and I've now migrated to Backblaze for one of my backup copies of my cold data.

I’m a couple of weeks into using a single giant (32”) monitor at macOS’s 3k resolution, and I’m finding it works pretty well with a 3x2 set of windows for code, terminal and docs. The center third is usually a single emacs window with two frames, web browser on the left, and inspect window or terminals on the right.

I'm also really liking the USB-C power delivery, so it's one cable to the work laptop, and from there, it's got a USB hub for the keyboard/mouse.

The killer feature is the built-in KVM switch: by switching the input to HDMI, I can switch the keyboard/mouse to my mini and use it when I'm not at $day_job.

What monitor do you have?

Dell U3219q.

Love these threads; it is always nice to see the setups of productive people.

Absolutely! Every now and then you pick up some methods or tools you hadn't thought of, that make yourself better.

You just have to remind yourself to bring the critical eye so that you don't start wearing black turtlenecks believing it'll make you a business genius :)

Why doesn't Kinesis produce a higher-quality build variant of its keyboard? One that doesn't feel like flimsy/cheap plastic. It seems they are charging a premium just for the layout and PCB.

I don't think the Advantage feels flimsy. Certainly I wouldn't want to be hit around the head with one (but maybe this is a bad metric for "flimsy"). I think the premium is largely for the difficulty of putting all those keys in that 3D layout. The old Advantage 1 had dome switches and was priced quite similarly to the Cherry-switch Advantage 2. The Maltron, which also has a bowl-shaped layout, is also expensive. Besides, the only part of the keyboard's body you interact with is the rest for the heel of your hand, and people often put something on those anyway (e.g. gel pads, tape to reduce slipping, ...), so it seems a bad thing to optimise for.

I got one of their 'Freestyle' keyboards years ago... complete crap - I swear it's rubber dome switches :-P

I got an ErgoDox EZ last year and put some nice heavy Kailh Box switches in it... now I just gotta find time to learn it (and Dvorak/Colemak).

Currently using a Vortex Cypher with the split spacebar, which I tore down and rebuilt: new switches, added LEDs (south-facing mounts, so they're useless, but I figured I'd try), and new keycaps.

There's a HUGE difference between the Kinesis Advantage and the Freestyle line. The Advantage2 uses MX Browns.

I tried to love my Ergodox, but I've spent way too many years with the scooped keybed of the Kinesis.

btw - have you seen/tried anything like the 'tractyl' [0] or the Ultimate Hacking Keyboard [1]? I used to use a trackball full-time, and I'm wondering how well a lil 'thumball' works...


[1] https://ultimatehackingkeyboard.com/

Tractyl looks pretty neat, but I think I'd miss my thumb keys too much.

I looked pretty closely at the UHK, but I think it would be the same issue as the Ergodox: I'm too used to the scooped keybed of the Kinesis, which makes the reach between e.g. K and 8 short enough that I notice when it's not there on a flat keybed.

You could have the trackball on one thumb and the thumb cluster on the other... I have a couple of crooked fingers from breaks that didn't heal completely straight, and I think I'm getting arthritis. So I'm trying to maximize usage of my 'strong' fingers.

I should really try a kinesis sometime. Everyone that has one seems to love them...

Yeah - People really seem to like the Advantage. I just wasn't sure if I could get used to the scoop... In hindsight, I should have just tried it.

The ergo will be something new for me ... New layout and my first ortho-linear.

I got an ErgoDox a few months ago and am still getting used to it. I feel like this kind of keyboard promotes typing with your hands rested. AFAIK that can be a culprit for RSI.

What's the Raspberry Pi for? You put a Raspberry Pi in the picture and you don't tell us what it's for. That's like total cruelty. ;-)

I use that for porting https://gokrazy.org/ to the Raspberry Pi 4 (right now) and doing other Raspberry Pi development in general :)

I second the Logitech MX Ergo "thumb ball mouse". Once you've transferred your mouse motor skills to it (takes about two hours) you'll never go back to a moving mouse. Especially if you've got a sore shoulder, elbow or wrist, do yourself a favor and try one of these. (Full disclosure: I own no stock in Logitech, just a very happy customer of this product.)

Thirded :). I switched (to its predecessor) over 10 years ago and never looked back. It even works perfectly well with hectic games after a bit of practice. Additionally, I'm using keyboards without a numpad to further reduce the distance when reaching for the "mouse".

My thumb is glad to get rest from the phone I think :)

I only need relief from clicking if I do it a lot (and then I map it to the keyboard for my other hand for a week or so).

Given that this is the closest thing to a monitor discussion thread I've seen on HN, I'd like to ask if anyone has a recommendation for what I should upgrade to next.

My progression has been the 2560x1440 Thunderbolt display to the Dell P2715Q, which has been my main monitor at work and home for the past 5 years.

I've tried to upgrade from the P2715Q three times now, and always end up going back to it because I'm dissatisfied by the quality of the other panels in comparison.

I want a larger screen that can support 4K, or even 5K, with comparable panel quality. I avoided the LG 5K display sold by Apple because I heard bad things about it, but never actually tried one. It's 27" as well, so I think it would still be too small, even if the panel quality and resolution are better.

I'm starting to think I should just get a TV and use that as my display, but I don't know which would best suit my needs for writing software. One nice thing about a TV is that it would be much easier to do returns/swaps if I need to play the panel lottery for something that will make me happy. I have never done this with a previous display, since I always bought second-hand from people who had already done that for me. The original owner of my P2715Q apparently returned theirs three times before they were satisfied.

I've toyed with the idea of getting an Apple Pro display, but I really don't want to spend more than $2000 on a monitor.

The UP3218K described in the article seems like it would be a good upgrade, but it definitely exceeds my price range. I would be open to spending that much on a monitor if I knew I wouldn't be replacing it for at least five years, but I can't know that for sure. Although funnily enough, that's what has happened with my P2715Q. It's the only piece of computing equipment I haven't replaced since I first started using it.

Rtings has comprehensive reviews for TV as monitors: https://www.rtings.com/tv/reviews/best/by-usage/pc-monitor

I prefer the multi-monitor setup and use at least one in portrait/vertical mode. Fits well with my IDE and Jupyter notebooks.

Similar setup - I just throw my terminal full screen on a vertical monitor (w/ GNU Screen) and don't find it distracting in any way.

I'm also of the opinion that I only need one monitor, because I can only focus on one thing at a time. Whatever is not the work I'm focusing on goes into another desktop space. I'm also a power user of a macOS window manager called Spectacle. The tools and practices that help me focus are key.

I've always been curious about higher-end USB microphones (the Rode is ~$230) compared with a potentially more upgradable XLR setup like:

- Behringer UM2 USB Audio Interface

- Behringer Ultravoice XM8500 Dynamic Cardioid mic

This would give the upgradability of an XLR audio interface and a seemingly decent mic for about $125. I imagine once everything is plugged together, it similarly comes down to just plugging in a USB, and I'd be surprised if it wasn't plug-n-play on Linux. I don't know how well the audio quality would compare, though. I'd be interested to hear more about the challenges of getting a working setup with XLR audio gear that are alluded to in the post.

On switching peripherals between both work and home computers, I've been using for a few years now a UGREEN USB Switch. It's around £21 for the USB 2 version at the moment in the UK. So it's basically what the author uses but you connect both machines there and click a button to toggle between machines. On mine there's 4 ports only but maybe that's enough for most people - I'm only using half.

One minor benefit of TFA to me this morning: I have a 120mm case fan that has been vibrating/buzzing, but I've been delaying buying a replacement just due to not wanting to choose the wrong SKU (again).

Scrolling through the site, I see the recommendation of the Noctua NF-A12x25. Probably 3x what I paid for the fan being replaced, but it could be the only component in the case that stays during the next rebuild!

I just built a system, but the included case fans are too loud.

Just ordered a Be Quiet! 140 mm Shadow Wings 2 for the front and a 120 mm for the back.

From everything I've read, they're the quietest, and cheaper, but they might move less air due to being limited to 900 rpm.

https://www.bequiet.com/en/casefans/1625 https://smile.amazon.com/gp/aw/d/B07MCHLGC5

How much performance are you able to get out of your router7 setup? If it's unable to saturate the gigabit, do you know where the bottleneck is?

It effortlessly saturates a full gigabit, otherwise I wouldn’t stick with it :)

Do you have any experience with using it on "Crossover7" which I think is somehow the slightly less preferred offer from fiber7 that I have?

I do not have any experience with it. My expectation would be that the IP network quality itself is good (provided by init7), and hopefully the underlying platform is stable and well-provisioned enough.

My 2020 desk setup: a cushion, plus an empty book block and a board, both wooden, as support for my laptop. I haven't used a chair or a desk in ages.

I often get asked about my equipment too and wrote up my bill of materials. Surprised you're not using LED lights. Try them, you'll enjoy the upgrade. https://medium.com/p/bill-of-materials-for-my-home-office-f7...

The hardware section makes it sound like he's rebuilding his computer multiple times a year. Do people really find upgrading that often is worth the cost and the bother? What sorts of workloads do people deal with that having top of the line hardware even matters?

I rebuild once a year on average, but not every year (planning to skip this year, for example).

In general, yes, I find it worthwhile to invest money and effort into a faster experience, but I’m very sensitive to latency, so YMMV :)

There's a sweet spot somewhere in there.

I recently re-built my i7-3770 w/ SATA3 SSD into an Ryzen 3700x with NVMe and build times are just remarkably faster. I waited far too long.

Some folks like having shiny hardware; it's almost certainly partly a hobby rather than a need, barring specific niche professional use cases. I upgrade my workstation setup quite rarely because my work-provided laptop is what I use 90%+ of the time. Top of the line is one thing (Xeons and Threadrippers or EPYC), but pretty powerful, cost-effective CPUs like a 3900X or 9900K are another. Even though I do ML stuff occasionally, I still do it as a useless hobby, and it's not worth bumping up to a Threadripper or a crazy GPU, especially when I can use a cloud machine for a fraction of the price and, as a bonus, force myself to make it somewhat repeatable and deployable for someone else.

If one’s been building machines in the past year or so, we’ve just had the biggest performance gains in CPUs in literally 10 years with the latest AMD CPUs. I went a long time with an E3-1230 but got really bored honestly after 5 years and built an i7-4790k ITX machine which wasn’t as big of a change as going from 8GB of RAM to 16 GB or to my first SSD. I got a 3900X this past year and the substantial performance gains are not a big enough reason for me to upgrade yet again compared to something like the new Ampere GPUs for even hobby machine learning purposes like mine. In the future a bigger reason for me to build again would be based around space efficiency reasons where a small case could fit on top of my desk and drive 4K AAA games, encode HEVC video day and night, and all with a 600w SFX PSU.

I only have an old ThinkPad X230 to discourage me from spending lots of time on computers, but that does not work.

Interesting about switching from the 3900x back to Intel.

I recently built a desktop around the 3900X. I'm happy with the performance, but coming from a decade on MacBooks, I'm feeling the FOMO about my choice now :)

Cool, thanks for sharing. My work-from-home setup has a legal marijuana vaporizer within reaching distance to help deal with the bullshit from managers & other bumbos in our digital corp.

Huge fan of i3. It also allowed me to reduce down to one monitor. I use a MSI Optix 34” curved monitor.

I like your idea with the USB hub to switch between computers — I will probably adopt that.

That keyboard looks weird - maybe it's because I haven't used it myself. Has anyone used it? Can you share your experience?

I’ve used a Kinesis Advantage at work for 7 years now. Started after I began to get wrist discomfort once I started programming full-time. It’s been wonderful, and removal of the discomfort and pain is worth the $300 several times over. I bought one for home shortly after. It does take a bit to get used to typing on. It took a few days to get from 6 wpm to ~35 wpm, and about two weeks total to get to ~80 wpm. I can usually type a bit faster on a “standard” laptop keyboard, but 80 wpm is still plenty fast. I’m a Vim user, so I mapped the End key to Esc (keyboard has hardware mapping built-in).

I do experience the stuck-key issue referenced (annoying, but easily worked-around and infrequent), going to look into swapping the PCB as linked.

I wish there was a way to try this keyboard without committing to buying it for a month or so. It seems like there will be a learning curve and you might not even like it at the end of it.

There is a way! Kinesis has a 60 day return policy if you buy directly from them.

[0]: https://kinesis-ergo.com/support/returns/

Indeed. I was fortunate enough to have a coworker who had one and let me try it while they were on vacation.

I had a very hard time getting used to the matrix (“ortholinear?”) layout of the keys. I think this is because I don’t touch type “the right way.”

I’ve had greater success with the UHK in tenting configuration and as wide as my shoulders.

My buddy has one he's used for ages... 5-10 years at least. He swears by it. Definitely a learning curve, though.

About the slow package manager complaint: you should try out Void Linux! XBPS is blazingly fast compared to apt or pacman.

Thanks, I’m aware. Unfortunately, Void Linux runs runit as init system, and I want systemd.

yay, a neo user.

If you have a keyboard programmable with QMK you can make the upper layers work in hardware too though (you still need to decide to which OS layout you want to map it)

Yeah, I wanted to look into QMK for a while now, it’s on my list :)

Will add this detail as one thing to check out, thanks!
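For anyone curious what "making the upper layers work in hardware" with QMK looks like, here's a minimal keymap sketch. The key choices and layer names are hypothetical (not from the thread), and like any QMK keymap it only builds inside the QMK firmware tree for a specific board, so treat it as a config fragment:

```c
/* Hypothetical QMK keymap fragment: a base layer plus a symbol layer
 * that is active while the MO(SYM) key is held. LAYOUT(...) must match
 * the physical board's matrix; it is shortened here for illustration. */
#include QMK_KEYBOARD_H

enum layers { BASE, SYM };

const uint16_t PROGMEM keymaps[][MATRIX_ROWS][MATRIX_COLS] = {
    [BASE] = LAYOUT(
        KC_Q,    KC_W,    KC_E,    KC_R,
        KC_A,    KC_S,    KC_D,    MO(SYM)    /* hold for symbol layer */
    ),
    [SYM] = LAYOUT(
        KC_LPRN, KC_RPRN, KC_LBRC, KC_RBRC,
        KC_EXLM, KC_AT,   KC_HASH, KC_TRNS    /* transparent: falls through */
    ),
};
```

Because the layer switch happens in the keyboard's firmware, it works identically regardless of the host OS layout, which is the appeal over software remapping.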


If it's not interesting, move on.

I'll answer it literally: seeing what equipment other engineers use can give ideas for one's own setup. stapelberg is an effective engineer, so I'm glad he shared it.

It’s interesting because the author is a software artisan, like many of the people responding in this thread. Personally, I love reading about the tools used by other people in the same trade. The choice of tools highly influences one's productivity, which is probably why I find it fascinating to read about his choices.

for me, it is the most interesting thing on the front page right now.

How is this comment not dead/flagged?

I read this as “Steven Spielberg uses this: my 2020 desk setup” before I clicked through. I was so amazed that Spielberg used a tiling window manager before I went back and figured out my mistake!
