Dual 75" 4K TV Floor Computing (reddit.com)
335 points by walterbell on March 27, 2022 | 276 comments



I am tempted to do this, at least as a FU to the monitor industry. There is so little innovation in monitor tech! Why are monitors so expensive and old tech compared to TVs? I just want an affordable large monitor with 120Hz refresh rate and USB-C for coding, but options are quite limited. All monitor innovation goes into HDR and super high refresh rates, which I don’t care about. And prices seem artificially high.


Interesting. I find innovation in TVs to be the opposite of my desired direction. Smarts that make them slower, wifi, ads, slowness, forced firmware updates, slowness, unfathomable picture controls and automagic colour correction that's Gawd awful, unresponsiveness and slowness. So I'm more likely to use my monitor as a TV than to want to use a TV as a monitor. I'm also clinging to my 2008 46" LCD TV that just works, and am stunned by how often my father-in-law has to call for my help with his shiny 76" TV, which is showing a blue screen of death, or a mandatory update, or things have moved, or icons have changed, or their version of the Netflix / Disney / whatever app is borked or needs maintenance or is no longer supported or just looks different... and slower. Always ever slower... the slowness of response is astonishing. Reminds me of new cars where, if the last driver had the volume set to max, you can't kill the radio or volume until the car is done telling you about its disclaimers and boot-up sequence and pretty animation.

If you can't move in menus or mute instantly, then no thank you to innovation.


Never attach a "Smart TV" to WiFi. As a rule of thumb, they'll only get worse.

Once the manufacturer sells the TV, they don't give a fuck about you anymore, they're already paid! Though they would like to push updates to send you more ads and track you.

I've never attached my 2020 LG 4k 65" to any network and it works great. I did pay $50 to add a GoggleTV CrapCast thing, it works fine for watching movies.


I was doing something similar, except I went for an LG tv for Web OS in the hopes something good would come out of it being open source.

As luck would have it, https://rootmy.tv came out for Web OS. It's still in a very basic stage, but it's better than nothing!

I'm a HomeAssistant user, so I do want my TV connected to my network.


I don't want the TV itself to be connected to the network. It should be a dumb display device. The less brains it is supposed to have, the better. I've got an old Samsung 1080p HDTV that still hasn't needed upgrading all these years, and the more shit I hear about what is going on with 4k TVs, the less I want anything to do with them.

Now, the equipment that takes that display device and actually connects it to things like Netflix, Amazon Prime Video, Apple TV, etc.... that IS a device that I want connected to my network. It's called an Apple TV, and definitely is connected to my network. And it connects to the Samsung TV over HDMI.


Awesome! I had no idea this existed. Look forward to giving it a try.


Does anybody know if there are any resources for tampering with the hardware itself to remove networking capabilities from a smart TV? I'd like to just use it as a big monitor, but if there's still a possibility of automatic network activity without my consent then I'd rather just rip out that functionality entirely


Well, you could always investigate how to detach the wifi antennas, though it’s possible your Bluetooth will also stop working, and most TV remotes are Bluetooth these days.


Maybe broadcast a WiFi SSID that blackholes everything?


I don't understand... why don't you just not connect it to your network?


> I've never attached my 2020 LG 4k 65" to any network and it works great.

I guess that would work as long as you move to a place far away from humanity. It definitely won't work if a neighbour runs a WiFi hotspot without password protection.


What would happen if you configured the network manually and gave it a fake DNS server, or sent it over to a Pi-hole, for example? Wouldn't there be plenty of ways to capture the traffic and stop it from going where it wants to go? I would think the tech-savvy folks on this site would be able to figure something out if it is a big enough concern of theirs.
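
To make the "fake DNS" idea concrete, here's a toy sketch of a DNS sinkhole that answers every query with 0.0.0.0. This is only an illustration of the mechanism; Pi-hole/dnsmasq do it properly with blocklists, allowlists and logging, and the TTL/answer here are arbitrary choices.

```python
# Toy DNS sinkhole: answer every query with an A record pointing at 0.0.0.0.
# Point the TV's DNS (static network config) at the machine running this to
# see what it tries to resolve. Illustration only, not a Pi-hole replacement.
import socket

def build_response(query: bytes) -> bytes:
    # Walk the question name (length-prefixed labels ending in 0x00),
    # then keep QTYPE + QCLASS (4 bytes) so we can echo the question back.
    i = 12
    while query[i] != 0:
        i += query[i] + 1
    question = query[12:i + 5]
    header = (
        query[:2]                 # transaction ID copied from the query
        + b"\x81\x80"             # flags: standard response, recursion available
        + b"\x00\x01\x00\x01"     # 1 question, 1 answer
        + b"\x00\x00\x00\x00"     # no authority / additional records
    )
    answer = (
        b"\xc0\x0c"               # name: pointer to offset 12 (the question name)
        + b"\x00\x01\x00\x01"     # type A, class IN
        + b"\x00\x00\x00\x3c"     # TTL: 60 seconds
        + b"\x00\x04" + bytes(4)  # RDLENGTH 4, RDATA 0.0.0.0
    )
    return header + question + answer

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 53))        # binding to port 53 needs root privileges
while True:
    data, addr = sock.recvfrom(512)
    if len(data) > 12:
        print(addr[0], "asked for something; replying with 0.0.0.0")
        sock.sendto(build_response(data), addr)
```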


I have not come across a working password free residential wifi hotspot in almost a decade - because routers have had passwords by default.


I live in an apartment building downtown adjacent to other buildings. They are all around me.


My observation is that open wifi networks are much more prevalent in dense living situations like apartments, condos, townhomes, and shared work spaces.


Aren't the newer smart TVs equipped with a cell modem so they can phone home without wifi, or they stop working after not being updated for x days?


Can you name some models? Between that and user-space WireGuard, I think there might be a bleak future ahead for ad-blocking.


There is no evidence of this, and we would know. It's not easy to hide wireless signals.


No, but there are definitely ones that will try to randomly connect to any open wifi networks if left unattended.


Since you claim "definitely" can you point to one?

I hear this claim a lot but have never actually found even a single one that does.


I don't think there are any? This is always wild speculation about the hellscape to come in these threads, but as far as I know nothing like this actually exists. My TV I bought last year has never been connected to WiFi and never will be.


At least in Europe, having a cell modem would ruin the company. Who would pay phone bills? Sure as hell not me.


> who would pay phone bills

Advertisers or data brokers? They’re already subsidizing the purchase price of TVs.


Neither of these things are true.


Disconnect the TV from wifi and enable Game Mode to bypass most of the post-processing. Disable HDR in the OS. On some TV models there is an additional trick: rename the input port to "PC" and it reduces input lag even more. Use a good web app that guides you through some basic calibration, especially brightness/contrast/gamma. Those three things can be changed both in your display driver's control panel and on the TV, so fiddling with both to get a "perfect" image goes a long way. Once you have the configs you like, take some pics or save them in a note somewhere in case you need to restore them.

I play Steam games on a 77" OLED like this, and the only reason I don't use it for normal day-to-day development is that I don't have a suitable recliner keyboard/mouse setup. Got the TV for $2,000 from Woot a year ago and it is the VRR model.

Why are monitors with similar capabilities at a third of the size so expensive?


Disabling HDR is akin to buying a sports car and throwing your hands in the air when you can't figure out how to start it outside of eco mode. The same goes for any tweaking in the driver's control panel instead of on the display. Ultimately, though, the truth is that when you go through all of that hassle (special input ports, special input names, advanced/hidden input settings, image adjustments) it still somehow comes out worse than it should in terms of input latency and image accuracy.

I always have 2 wishes:

- For TVs to have a mode that just displays the signal according to the reference mode of that signal (particularly for HDR) and doesn't fuck with it to make it "better".

- For a decent selection of monitors that come with larger panels.


I can't relate to any of this at all. My Smart TV (LG) is not slow or unresponsive. It doesn't show any ads and the picture control is easy to understand.


Well done picking a good TV. Unfortunately even other LG TVs are doing this: https://twitter.com/chriswelch/status/1369733357756686349

Many, many smart TVs send analytic data to marketing companies. Some even send full snapshots of the screen. https://archive.ph/DWTGC

> When tracking is active, some TVs record and send out everything that crosses the pixels on your screen. It doesn’t matter whether the source is cable, an app, your DVD player or streaming box.

> Many of the TV companies say they aren’t violating our privacy, because ACR data technically isn’t “personally identifiable information.” TVs, they say, are shared by an entire household.


Don’t connect it to WiFi and don’t let it update. My 2020 LG CX OLED TV randomly started to play ads after an update:

https://www.theverge.com/tldr/2021/3/10/22323790/lg-oled-tv-...


>"Interesting. I find innovation in TV's to be opposite of my desired direction. Smarts that make them slower, wifi, ads, slowness,forced firmware updates, slowness, unfathomable picture controls and auto magic colour correction that's Gawd awful, unresponsiveness and slowness."

Maybe just do not connect it to a network ever. At least I don't. I used a TV as a monitor for a while, but after some neck strain showed up I downsized to a 32" 4K monitor.


In one sense, yes, but if you live somewhere where energy is expensive why have not only my TV (~30W) on but also the PS5 (~100W+ + ~30W) or gaming PC (~200W+ + ~30W) just so I can watch Netflix?

I'll choose to use the TV's app every time.

I have a higher-end LG 4K OLED with WebOS which is extremely responsive compared to others I've had. It's also rooted to block any ad-crap, but you can do most of that at the DNS/network/router level if you have an issue.

Then there's the fact that the non-tech family members don't want to use more than one remote or fiddle with channels etc. My wife can press the Netflix button on the remote from TV-off and have instant Netflix, with the surround sound on too.

Younger me would have agreed with you, but as a parent and husband, in my mid 30s, I just want an easy life.


Just use a Roku; why use a video game console? The energy use is negligible and you only ever use one remote. The Roku works with most TV remotes, although since I rarely watch broadcast TV I typically use the much better Roku remote. My Switch automatically changes inputs when I turn it on, so I don't even need to switch inputs usually.


For me, the Roku's killer feature is its app that lets me instantly stream the audio to my phone/AirPods so I do not disturb anyone else while still watching TV. All the others require me to pair my headphones with the TV or some other painful process.


> Maybe just do not connect it to a network ever.

This is slowly but surely ceasing to be an option. Beyond automatically connecting to any open networks, some models already will stop working until you give them a network connection to perform a 'necessary' periodic update.


Wait, what TVs are doing this?

Automatically connecting to open networks sounds like a potential legal quagmire, since as I understand it I'm not legally allowed to just use my neighbor's wifi without permission.


Hopefully not all the models will stop working. But we will see. I am an old fart and hate all this sneaking in my backyard. I am fiercely independent and trying to stay this way as much as reasonably possible. Alexa has no place in my life. It can go fuck somebody else.


I think it would be shocking for today's people to see a TV from 20 years ago, before all that digital funkiness. Instant switching between channels.


I certainly don’t miss switching channels.

TV experience 20 years ago sucked. Now I can have all the content I could ever want immediately accessible via Plex, no need to fiddle with DVDs or gasp VHS tapes. Commercials? hah. Unskippable anti-piracy warnings? nope.


Imagine if they were exposed to a laser disk fast forward / skip. Gasp


Or an old copper phone line before cellular ruined call quality.


Maybe he got a large but shitty quality TV? I got an OLED 2 years ago and none of this rings a bell.


Possible. But this thread is a comparison of TVs vs monitors. If we are talking about a $5000 TV, that's no longer a relevant comparison.

(FWIW, though, my impression was that it was a mid-grade brand-name TV. Certainly not a no-name bottom tier.)


Software is too much. But OP is talking a lot about hardware.


Yes, where are the quantum dots, the local dimming, the mini leds that TVs have?


New monitor models have been released with quantum dot OLED, which is brighter than traditional OLED. This is not to be confused with QLED, which is simply LED, not OLED.


I think a better example would be the lack of motion handling: unlike computers, the bulk of content experienced on TVs expects to run at specific refresh rates. Innovations like OLED, which offer near-instantaneous response times, cause issues for that media, which was mastered expecting some pixel-dimming latency, causing motion like pan shots to appear jittery. TVs need innovation that focuses on feathering motion, since it's clear the TV and movie industry have no plans to bump frame rates.


OLED doesn't have any new issues here compared to LCD. Both sorts of TVs will often have "black frame" insertion options, but I've never found the flicker worthwhile.

Smooth panning would be fixed by content being shot at something higher than 24fps, like you say; but I've seen that same jitter in theater screens for decades, it wasn't something that was designed to depend on CRT tech or such (like videogame lightguns).

TVs have been doing tons of "smart" picture processing to try to smooth out motion since so much content is so low frame rate. It just often looks fake!


This is why I passed on an OLED when upgrading my TV. OLED is definitely the future, I just have little desire to beta test it with jittery panning. I’m sure they’ll sort it out.


My 2017 OLED looks fantastic.

Movie cinemas are dead to me now. I would never switch back to a shitty LCD. Or even a "good" LCD.

I'm super sensitive to jitter as well, so your information about OLEDs being jittery is wrong.


Thanks for the correction. I may wish I purchased a C1 after all!


One thing I noticed about my 46" LG 4K monitor, which was "cheap" at $600-ish (edit: looked it up: $720) 3-4 years ago, is that it really isn't designed for viewing at desk distances. On my desk, as far back as it'll go, the viewing angle puts the backlight not exactly behind the pixels, so the left/right have 4-6 pixels and the bottom has ~10-15 pixels that are unlit.

I use it for a bunch of terminals, so part of the left column of text and the entire bottom line, or my status bar, were unreadable. Thankfully my window manager had an unsupported feature that let me "pretend" that those areas didn't exist.

So what I learned is that TV-oriented panels aren't just directly usable on the desktop.

However, I was more recently able to pick up a couple 32" Dell 4K displays for $300-ish each, and they are glorious! That was on a big year-end sale.


I think that is mainly due to being direct-lit backlight. I'm currently using a 43" Sony X800H, which is edge-lit, and the pixels are fully lit at my 2 1/2 - 3 foot viewing distance.


For 32", wouldn't you need a 6-8K display to get Retina-quality text? I believe that's why 32" 4k displays aren't usable for coding, unless you're OK with low-resolution text.


Some of us get a larger monitor to see more text, others get a larger monitor to put further away and look at the same amount of text.

Once your eyes turn forty, see which group you're in!


I still have a month to go before 40; I'll see which group I belong to then.

However, right now, I can't tolerate screens with lower PPI than Retina.


So what monitor do you use then?


A 24" 4K Dell monitor. Not perfect, because it has a moiré pattern on white backgrounds caused by the anti-reflection coating.


So this part is pure trigonometry: how much further does a monitor like this have to sit on your desk, to have the same pixels-per-steradian as a 4k 27"?

Yeah. That's what I mean. Just a little further away, same pixel density. It's nice.
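
If you want to put rough numbers on the trigonometry (the sizes below are illustrative assumptions, and I'm ignoring small-angle corrections, under which distance simply scales with the diagonal for equal resolution):

```python
import math

def ppi(diag_in, w_px=3840, h_px=2160):
    """Pixels per inch of a 16:9 panel with the given diagonal."""
    return math.hypot(w_px, h_px) / diag_in

# Assumed sizes for illustration: a 27" 4K desk monitor vs. a 43" 4K "monitor-sized TV".
ref, big = 27, 43
ratio = ppi(ref) / ppi(big)          # equals big / ref, since resolution is the same
print(f'27" 4K: {ppi(ref):.0f} PPI, 43" 4K: {ppi(big):.0f} PPI')
print(f"Distance multiplier for equal angular pixel density: {ratio:.2f}x")
# e.g. a comfortable 60 cm from a 27" panel becomes roughly 60 * 1.59 ~= 95 cm.
```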


On the other hand, brightness decreases 1/distance^2. You would need a brighter panel to achieve the same brightness at a farther distance.


Ah interesting, thanks! On which "virtual resolution" do you set it? Or simply at 2X?


2x, anything else is too blurry.


That monitor has not been for sale for some time, I looked it up. IMHO you are lucky to have one.


>"32" 4k displays aren't usable for coding"

I use my 32" 4K for just that at 100% scaling and am totally happy. Fonts look fine to me. It does not feel low-res at all.


I have Dell displays with those specs and at that price point. Probably the same display. They are fine for coding if you are not acclimated to "retina" smoothness for fonts. Since they are so cheap, I bought a second one for my wife. A giant screen is just as useful for legal work as for coding, it turns out.

There are downsides. The colors aren't great. The blacks are just dreadfully light. Sometimes there are artifacts when I use the screen after it has been idle for a time, but the artifacts disappear quickly.


Do you use S3221QS with Mac?


The model on mine is S3221QS. I had looked earlier but couldn't find the order, but then I remembered I have them registered on the Dell support site. Looks like I paid $360 landed each for them.


>"if you are not acclimated to "retina" smoothness for fonts"

I am not using a magnifying glass, so for my eyes the smoothness is just fine without the "retina" prefix.

>"The colors aren't great. The blacks are just dreadfully light"

If you care so much about colors, get a real pro display. Just be prepared to take a second mortgage on your house for that.

For mere mortals (I mostly use 32" BenQ monitors) the colors are reasonably fine and so are the blacks. No artifacts. I do not see Apple as superior in this department.

Once again, if for some reason you need perfect colors / blacks / uniformity / whatnot, Apple with its "retina" is not the one to go with. Try Eizo, for example.


I dunno about Retina-quality text; that moniker isn't important to me. What is important to me is the amount of text I can get on the screen without it being all blocky. My previous Dell U3011 30" 2560x1600 display was a bit blocky. The 4K 32" is quite nice for my use.

I will say that the difficulties I had getting dual 4K working in my setup make me glad I didn't try for something like 8K. In my case, I have a recent Dell XPS 15, but not quite recent enough. My docking station can't do 4K output. The next newer XPS 15 can. In the end I found that using two USB-C to HDMI cables, directly connected to the laptop, will do it, so that's good enough for me. Went from the 1-cable docking to 3, but I'm ok with that. I don't tend to move my laptop much these days.


The Retina moniker isn't important. Text becomes non-blocky at about 200 PPI. A 32" 4k monitor has 138 PPI, which is extremely blocky unless you're using 400% zoom.
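
For anyone who wants to check the arithmetic: PPI is just the pixel diagonal divided by the physical diagonal. The sizes below are common examples rather than specific products (the 32" 6K entry is hypothetical):

```python
import math

def ppi(diag_in, w_px, h_px):
    # Pixels per inch of a panel, from its resolution and physical diagonal.
    return math.hypot(w_px, h_px) / diag_in

examples = [
    ('24" 4K', 24, 3840, 2160),
    ('27" 5K', 27, 5120, 2880),
    ('32" 4K', 32, 3840, 2160),
    ('32" 6K', 32, 6016, 3384),   # hypothetical 6K panel at this size
]
for name, d, w, h in examples:
    print(f"{name}: {ppi(d, w, h):.0f} PPI")
# 24" 4K: 184 PPI, 27" 5K: 218 PPI, 32" 4K: 138 PPI, 32" 6K: 216 PPI
```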


Outside of laptop displays, there are only a handful of monitors that are "Retina-quality" (>220 ppi). It is very doubtful that programming is difficult or impossible on all other monitors. Also, most computers cannot drive a 32" 220 ppi monitor.


That's true. It was really hard to find a 24" 4k monitor for my desktop. It's insane that people can tolerate lower PPIs after seeing a Retina-like screen. You can't go back.


32 inch 4K is perfectly fine. I recently switched to a Samsung CRG9 (I can only have one external display on this computer) and the resolution is pretty crappy compared to the 32 inch. But either one of them is perfectly fine for coding.


I don't trust people who say "it's fine" anymore. After a lot of tests with people I know, many young(!) acquaintances are not able to distinguish between 2K and 4K. People my age sometimes can't distinguish between HD and 4K on a monitor. And that's for static images. No one I know is able to notice the difference between motion interpolation on/off on a TV.

This is going to sound dismissive and entitled, but I've learned that people's eyesight and visual processing is extremely bad in general.


I won't try to convince you if you don't want to be convinced, and from your perspective I'm just a rando on the 'net. I get that.

But I have my 32-inch monitor above my CRG9 (I use the 4K screen for my work computer) and I somewhat regularly end up with the same text displayed on both at the same time. The CRG9's resolution is noticeably inferior to my 48-year-old eyes. Which, I have to tell you, don't have the acuity they once had. 20/20 the last time I tested, but I wouldn't be surprised if it's falling below that now. Getting old blows.

The difference isn't too bothersome unless I'm bouncing back and forth between them. I get used to the lower resolution of the CRG9 pretty quickly and forget about it. And of course it's totally fine for coding.


If someone can't distinguish 4k and 2k on a 32" display then the image they're looking at has horrible quality.

It's instantly visible as soon as you see any sharp edges / letters.

1440p ("2K") at 32" is about 92 dpi, which puts it at the same pixel density as full HD on 24". That's perfectly usable but not particularly sharp at normal viewing distances.


It helps if you use bitmap fonts. They look crisp and clean at small sizes and lower resolutions. They make other fonts look blurry by comparison. The trend of scaling seems to largely be because of people using those blurry fonts. I'd rather have crisp fonts and more screen real estate than have to deal with scaling and end up with as much space as before.


Do you have any suggestions for bitmap fonts?


Terminus is a classic. I think it's a few decades old. I typically use it at 9pt on low DPI displays, but have gone up to 14 or 16 on large displays viewed from far away like a TV. tewi (8pt?) is another nice one. Scientifica is decent. If you need big font sizes, one of the only options is spleen. It gets rather large. I think OpenBSD uses spleen.


I recently bought four 32" monitors (2 for work, 2 for home), and I wasn't remotely interested in 4K. QHD is perfect, and if you get monitors where that's the max, pretty cheap. (Every 4K monitor I've had, I ended up downscaling.)


I also use a 40-something inch desktop monitor and I see slight viewing angle problems on the sides. For my use it would actually be better if it were slightly curved.


IPS panels are much better than VA panels for that, but unfortunately have worse blacks.

OLED is the way to go, but it's a bit spendy at the moment.


> Why are monitors so expensive and old tech compared to TVs?

(“Smart”) TVs are subsidized by the push advertising and analytics crammed into their firmware.


(Reputable) source?


Bill Baxter (CTO, Vizio)

Smart TVs continue to make money for the manufacturer after the sale by providing data to viewer-measurement and consumer-research companies, and through all of those apps they integrate into the TV's smart functions and subsequent app usage.

“This is a cutthroat industry,” Baxter went on to say. “It’s pretty ruthless. The greater strategy is I really don’t need to make money off the TV. I need to cover my costs.”

http://cjni.com/smart-tvs-too-smart/



How about Vizio's Q4 earnings report? You can see that "Platform+" accounts for nearly twice as much profit as hardware in 2021.

https://investors.vizio.com/investor-relations/default.aspx


Because money is "funny" inside companies, and because there are no platform sales without hardware sales, I would think that the relevant metric from an external perspective is total revenue. (The change in the fraction of revenue over the years is probably interesting too.)


I use a LG CX 48 as a monitor.

Every time I switch it on I get an ad for Apple Music that I can't disable.


Lol where have you been? There must be a dozen or so articles on HN in the last year alone.


The thing that really bothers me is all these "AI" technologies in my new LG OLED TV that don't actually get used. All of them (to reduce "crushed" images, correct color, improve blurriness in motion scenes, etc.) are only used when not using the HDMI/PC hookup. My feeling is they're used when viewing things streamed on Netflix/Hulu/HBOMax/etc. But I spend most of my time using the TV as a computer monitor. I'm in a niche group, but this was my only option for large-format OLED.

I wish they did more with HDMI, even though HDMI is being phased out. I want to hook up my computer and have the computer gain an ethernet link from the TV, even though the TV is wirelessly connected to my router. The TV should be a "dock" that includes a 2nd monitor. I have a wireless controller receiver plugged into the USB port on the TV; I want that to pass its input through to my computer. I also want my computer to charge while being hooked up to my TV. I think the only answer is that this TV should have Type-C and do all those things as a dock, but it's frustrating that we're one step behind.

I wish I could watch a program, while showing my computer hdmi input picture-in-picture. Hell this thing has 4 HDMI inputs on the back, let me do each input to a quarter of the screen. Another niche use..

Don't even get me started on the ads. Doesn't make sense to keep the TV on when my computer is locked for long periods of time, so I generally come back, turn on the TV, and unlock the computer. First thing I see is fucking ads. Takes 2 clicks on the remote to dismiss but it colors my experience that they're always pushing another $30 or $40/month service when I first see things onscreen. They know you can't get this quality elsewhere so they're happy to push you ads. And the telemetry, my god. It's my $3k TV!

Most TVs are still waiting to support the next HDMI standard so you can do 120Hz & HDR simultaneously.

The two things that did impress me were this LG TV supporting both Miracast and AirPlay with /no/ hoops to jump through. It just worked. I do wish I could "cast" things to the TV in the Chromecast paradigm, where casting is like pushing a link to the TV and the TV navigates to that stream and plays it itself; no other device has to stream or push the video to the TV. That would be nice.

</rambling>


There is never going to be a single product that has a great display, a great casting solution, a great app solution, a great privacy solution, a great input solution, a great docking solution, a great selection of leading edge technology, a great content mixing solution, and whatever else you could want at a great price. Apart from such a thing inherently needing to cost an arm and a leg it needs to outdo every best of breed, it's just not possible.

On the other hand, what is possible is buying a great display (like a 120 Hz HDR OLED), buying a great dock (Thunderbolt 4 doing power, ethernet, and more), buying a great app/casting solution (like a Shield TV providing Android apps and Chromecast casting), buying a great multiview HDMI matrix, and so on, and connecting them all to the TV. With the advent of HDMI CEC you never have to manually adjust sources (excluding the multiview case where you want to see PiP versions of multiple sources at once, in which case there is no adjusting beyond adjusting what is PiP'd). This all comes with the upside that when you want a better display, or a new technology comes along, you don't have to replace all $$$$ of it at once.


I was mostly remarking on the lack of (significant) innovation. I'd say the best feature here has been airplay/miracast for drive-by screen sharing from my sister/mom/dad on their iPads or phones.

For what we pay, I wish they'd consider some of these niches rather than startup ads or an AI that looks at what you're viewing and suggests other channels mid-viewing. Creeeeeepy.


>I wish I could watch a program, while showing my computer hdmi input picture-in-picture. Hell this thing has 4 HDMI inputs on the back, let me do each input to a quarter of the screen. Another niche use..

I've seen some business monitors that will show a 2x2 grid of your inputs. I think you can get some sort of device/box to add this feature to existing screens.

I really wanted this feature a few years ago, but hadn't thought about it lately. I wanted to turn a 65" 4K TV into 4 1080p monitors, basically. When a PC is hooked to it, I run i3wm, but this doesn't handle cases like someone hooking up a game console or another PC and wanting to show both. Maybe with some capture cards you could display other inputs in mpv.

If we could put our own software on TVs or swap out the boards with more hackable ones, that would be cool. Just buy a dumb display with speakers and let the free software community handle the rest.


The new LG C2 OLEDS have PIP. It's demonstrated in this video https://youtu.be/FrVJLlzQcQw

Even my 2017 C7 has it, but it's pretty clunky - no overlay and only two small screens next to each other, and it's pretty slow to enter/leave that mode as well, maybe 3s of black screen before it enters or exits that mode (I think it's called multi-display).

I wouldn't use it, but it's there.

Personally I don't want the TV companies experimenting with more networking options, because I know they'd only use it for evil.


When looking for Monitors with these options, I found the term "PBP" (Picture-Beside-Picture) useful for searching.

LG has one, the 43UN700, which also has a serial port for controlling picture-input layout options and sound source. It has a mode for showing any 4 of its 6 inputs at the same time, and a bunch of layout options for any 2 of 6 inputs (Top/Bottom, Left/Right, Main/PiP). Sound output can only come from active sources, though. My only "software gripe" up to now. It comes with a ton of cables, but NOT the required RS232C (looks like an audio-jack) adapter. Hrmpf.


Ever since I got the new MacBook Pro, I've wanted a monitor that would be its screen but 27" of it, in 5K resolution. But apparently I'm asking for something impossible. Or maybe Apple announces a "Studio Display XDR" next year? Who knows.


I've had an UltraFine 5K for around 4 years now. Is that not what you're asking for?

edit: if you meant aesthetically, yeah, no, it's ugly plastic.


The new Studio Display is the same panel as the UltraFine but ever-so-slightly brighter, and with a better shell.


I'd like a higher refresh rate (120 hz scrolling feels so much better) and, if at all possible, same HDR capabilities, or at least truer blacks. And preferably also glossy finish.


Unfortunately, Apple uses Thunderbolt, and they're the only ones who do 5k/6k. We absolutely have the bandwidth for 5k120 over DisplayPort, but Apple insists on sticking to Thunderbolt.


Isn't Thunderbolt 4 supposed to support some ridiculous bandwidth too, like tens of gigabits or something?


5k120 is 57.08Gbps, Thunderbolt 4 is 40Gbps. Not enough without compression.
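
Back-of-the-envelope version of that bandwidth math (assuming 10 bits per channel RGB; the exact total depends on the blanking/timing standard, which is where a figure like 57.08 comes from versus the ~53 Gbps of raw pixel data):

```python
# Raw (uncompressed) video data rate for 5120x2880 at 120 Hz, 10 bits per channel.
w, h, hz, bits_per_channel, channels = 5120, 2880, 120, 10, 3

raw_gbps = w * h * hz * bits_per_channel * channels / 1e9
print(f"Active pixel data: {raw_gbps:.1f} Gbps")          # ~53.1 Gbps

# Real links also carry blanking intervals; with reduced-blanking timings the
# total is a bit higher still (hence ~57 Gbps), so either way it exceeds
# Thunderbolt 4's 40 Gbps without Display Stream Compression.
print("Fits in Thunderbolt 4 (40 Gbps)?", raw_gbps < 40)
```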


Of course this is a rumor, but if true, they can somehow make it work.

https://www.macrumors.com/2022/03/10/studio-display-pro-laun...


I thought thunderbolt used DP for video.


Thunderbolt doesn't use DP for video when in Thunderbolt mode.


I'm far from an industry insider, but I think it could be LCD panel factories needing to be set up for specific sizes in conjunction with mainstream TV sizes going up rapidly over the past ~15 years. That is, the most-updated factories are the ones chasing the TV size trend, and monitors tend to get stuck with the output from the stragglers.


> And prices seem artificially high

I switched companies last year and was gonna return the screen they gave me to use for home office and buy my own. I looked it up, and the screen had cost like 700 USD back in 2016. So I thought that for the same price, 5 years later, I would get a sweet upgrade.

But no, basically same specs. Same panels, perhaps upgraded a bit, but not much had happened. Prices were the same, perhaps because of the pandemic. Luckily my previous company ended up gifting me the screen.

My new company ended up giving me the "newer" version of the monitor at the office. Only difference I can see is that it now charges my laptop via usb-c. Neat, but not much innovation in those 5 years.


> affordable large monitor with 120Hz refresh rate

This year, I went through four different monitors to find one that works. Stay away from IPS panels, they all suffer from "IPS glow", which is visible when using high contrast colours (i.e., a bright window on a dark desktop background will blast a translucent white "overlay" above and below the window). "Smart" 4K TVs are untrustworthy, IMO (e.g., Samsung is known for spying/spyware and inopportune ads, making them a hard pass).

The ROG STRIX XG43UQ was the only display I could find that runs at 120 Hz, works with a KVM switch (IOGEAR 2-Port 4K DP), has a 16:9 aspect ratio, offers 4K resolution, uses a VESA 100 adapter, and is suitable for programming. The OS must be instructed to render using BGR instead of RGB, which Linux supports. The panel has some subtle horizontal glow in rare high-contrast situations, but it's nowhere as noticeable as IPS panels.

Depending on your definition of affordable, it runs for about $1,300.

https://rog.asus.com/ca-en/monitors/above-34-inches/rog-stri...


I can get a 2021 65-inch Samsung QLED TV for $1k and it has 100% DCI-P3 coverage compared to the monitor's 90%, has HDR and a 120Hz refresh rate, has usable speakers, has a good remote to control it, and is larger and cheaper. I don’t know the numbers, but I bet the contrast would also be better on the TV.


On any Mac you won't be able to get 4K 120Hz if that TV just has HDMI.

It only works on monitors/TVs that support DisplayPort.


Does the 4K 120Hz actually work properly? There are a lot of TVs that claim to support 4K 120Hz but actually display the signal at half vertical resolution as 3840x1080.


I assume you're looking for an LCD monitor... not OLED or something novel. I'm not sure why you would want 120Hz; that seems like the only obstacle to a competitive price. Most LCD materials have long response times (>5ms), so they tend to blur at high refresh rates. High refresh rates at high resolution are also a challenge and thus higher cost. If you're looking for 4K, it's also worth noting that most non-premium TVs will use 2-subpixel rendering (RGBG rather than RGBRGB) for higher brightness and lower cost, whereas most monitors will use 3. There are also economies of scale for much larger glass (>40") which aren't usually seen in monitors. I am surprised how relatively poor the color matching on many monitors is, however.


I really enjoy the 49” ultrawide 1000R format Samsung has been pioneering. It’s the size/resolution of two 27” 1440p monitors next to each other, has a buttery smooth refresh rate, and a curve that makes viewing more comfortable and natural.

The new quantum dot displays are also very innovative.


What do you like about the qDots vs "regular" pixels?


Samsung's newest QD displays are QD-OLED, so you get extremely fast response times and a very high contrast ratio, and the dots give high color saturation, so the color volume is very large. Brightness is also high (for an OLED).


They are just an improvement on OLED technology, with reduced burn-in risk and brighter pixels being the largest improvements.


This reminds me of my rant about televisions versus monitors on my 2014 blog entry "4K is for Programmers."

https://tiamat.tsotech.com/4k-is-for-programmers


The problem is that the monitors are only about gaming. In particular, I find the "1440" Y dimension resolution particularly infuriating.

Yeah, I know the reason is that modern games make all the graphics cards left over that aren't being used for crypto die in horrible flaming balls of heat when you actually ask them to ... you know ... actually use the pixels on your monitor. GASP! The Horror!

So, how can we fix this? I know! Let's make sure the monitors don't have enough pixels to cause the graphics cards grief. Brilliant!


1440p (and by extension 5k) have advantages from a productivity point of view too. In my experience, no OS really consistently supports fractional scaling, so if you want a good time you want to run at 1x or 2x. In terms of how much "stuff" you can fit onto a screen at a comfortable size on modern monitor sizes, that means you'll end up using 1x for 1080p/1440p and 2x for 4k/5k.

1440p@1x or 5k@2x let you fit more than 1080p@1x or 4k@2x while feeling plenty large enough with modern application design on a typical (24/27 inch) sized display.


I think things are changing; at CES this year QD-OLED launched on both TVs and monitors simultaneously with the AW3423DW, and there are some 42 inch OLED monitors derived from TV panels.


LG also seem to be launching a 32” 4K OLED monitor this year, but it’s really expensive.


My main monitor is Dell u3014, 60Hz. I had a Dell 3008 at work and replaced it with 3017 when it died, so I had this tech for over 10 years both at home and at work.

What am I missing with 60Hz compared to 120Hz? Honest question. When I look at friends' office setups with latest curved 4k (or now 8k) monitors I do not see anything that I like more than my current monitor, so while we are generally not limited in workstation or monitor options at work I see no reason to upgrade.


Honestly, for office worker/programming/professional use, high refresh rate doesn't make much difference. It's neat that windows are more legible while dragging and cursor movement looks smoother, but I wouldn't buy a high refresh rate if I wasn't also a gamer.


If you don't feel you are missing anything without 120Hz, then don't try it. Once you go back to 60Hz you feel it as soon as you move the mouse cursor; you can never be content at 60 again.


Hm, I think the industry was stale for quite some time, but lately it has picked up a lot. What exactly are you looking for? Today you have high refresh rates, 4K or 5K screens, (curved) ultrawides, variable refresh rates, low latencies. Comparing them with TVs is not entirely fair, as TVs usually are not great for displaying text unless you get a specific panel with proper chroma handling (no subsampling), and in terms of latency TVs are usually also pretty far behind monitors. The really good TVs are equally expensive.


I've seen trading firms do this. Why have 8 monitors when you can just have one giant screen? All you do with it is show some app, doesn't need to be special.


I am curious about USB-C vs. other connections. My MBP takes 10-20 seconds to wake up on an external USB-C monitor. Is that a macOS issue or a USB-C issue?


I haven't had this issue on Windows; the time to wake seems to be identical for USB-C vs. other connection methods.


The M1 Macs are pretty much instant; it might be an Intel issue.


Both my Intel & M1 Macs wake up instantly with an external USB-C monitor; it might be a monitor issue.


Your monitor.


The Dashews, a couple who design offshore boats (sail and power), after a few years of experimentation, ended up coming to a similar design for the navigation station on their FPB line of powerboats:

* https://setsail.com/rethinking_modern_nav-2-2/

May not be able to do this on the deck of a traditional sailboat where things are exposed to the elements, but in an enclosed wheelhouse it adds a lot of flexibility, especially with the modern NMEA 2000 data bus and various vendor 'black boxes' (or leverage a Raspberry Pi), e.g.:

* https://mvdirona.com/2016/09/maretron-n2kview-on-dirona/

Aside: the owner of the above linked Dirona is James Hamilton, VP & Distinguished Engineer at Amazon.


Looks great until the first big-wave weather, where you lose your footing and lean on/kick the big shiny TV, losing all the telemetry.


I'm skeptical how well that display on the left side of the nav-2-2 URL holds up for visibility under direct powerful sunlight. Even if it's a very expensive model.


Wow. I am blown away by the photos in that FPB link. I'm not a boat person but it's unlike any boat pilot house I have seen before. A great room indeed.


The Dashews seem to regularly think 'out of the box'. They took their decades of experience in sailboat design, especially with hull shapes, and applied it to a power boat (so they could continue to travel in old/er age):

* https://setsail.com/the-concept-explained/

* https://setsail.com/intro-to-fpb-program/

* https://setsail.com/category/fpb-78/

YT channel:

* https://www.youtube.com/user/DashewOffshore

A tour of one of their earlier sailboat designs that is/was for sale recently:

* https://www.youtube.com/watch?v=yrDxYkSI710


From the comments:

> I’m curious as to how this affected your productivity

Answer:

> Productivity is through the roof. Even when noodling about on small side projects I find you so quickly end up with so much things open that you're constantly flipping between, so to have multiple terminals, text editors, reference documentation, version control, etc, immediately accessible is, honestly, life changing.


Alternate take: I achieve constant-time access (mod key + digit) to all those things with 10 workspaces in i3.

I can't imagine having to see them all at once. Realistically, I'm only actually thinking about 1 or 2 of them at a time. If I need to check documentation, I'm no longer thinking about my build output. Slack, email, and calendar are disconnected from development tools. And if I need to do a quick context shift, I can switch over instantly.


I've never used i3 but have been intrigued for ages. I'm finding myself more and more "faking" a tiling window system on my work Mac. I generally open a separate zsh/tmux session in a vscode pane for each workspace, which works great. I'm now also adding a split window to the left for note taking, initially Gdocs but more recently just another vscode window for markdown files. I merge all my other coding workspaces into one window on the right, so I can have the same notes on the left but switch projects on the right with ease.

Curious if something like yabai would be even better but just haven't made the time to try it.


Have you tried Spectacle? https://www.spectacleapp.com

As a long time i3 user that switched to Mac, it gives me almost all the benefits while still integrating nicely with the general UI. It's also not nearly as steep of a learning curve and does not get in your way at all.


Thanks for recommending! I will check it out.

Edit: "Spectacle is no longer being actively maintained Download Spectacle"

I guess it still works for you though?


Spectacle runs just fine and is worth looking at. For me, it didn't really scratch the itch.


I find I'm a lot more productive if I can see everything I'm keeping in my head all at once. Virtual desktops are good for organization, but for me, don't substitute for a large display.


Workspaces don't work for me. I constantly need to see 2-3 windows from different apps together, and which 2-3 changes constantly, meaning that to see them together I'd have to move the windows across workspaces every 10-20 minutes.

Documentation + editor

Editor + Terminal

Editor + Issue Tracker

Terminal + Issue Tracker

Issue Tracker + Testing Result

etc. etc.

I don't "context switch". My context is "getting shit done" and for me it requires access to many many windows at the same time.


Check out dwm. It has a different concept from workspaces called 'tags' which is close to what you are describing.


FWIW, moving a window to a different workspace is also constant-time (shift+mod+digit). Maybe I'd spend another half second positioning it among the tiles (shift+mod+vim navigation keys), but only if I'm just starting a project.


I do the same and have triple displays placed with ergonomics in mind, so workspaces are like 3*n mostly.

There's no drawback for me except probably a terrible carbon footprint.


Sorry, what impact does i3 have on your carbon footprint?


Not i3 - the energy consumption of running three large monitors.


But the OP's point was that using workspaces in a tiling window manager removes the need for so much physical real estate. If you have a very fluent i3-based workflow, the limiting factor becomes your own human capacity to focus attention, at which point a single decently large monitor (that can hold, say, 4 windows comfortably per workspace) is all you'd ever need.

3 monitors is just a workaround for a workflow problem that i3 can solve.


And the post I clarified says:

> I do the same and have triple displays that is placed with ergonomics in mind,

meaning that the author uses both i3 and triple monitors.

I use a laptop and two 27" externals, not with i3 but with an auto-positioning Hammerspoon that has a similar effect.

When I'm on smaller or fewer displays I can't fit everything I want up simultaneously.

Granted, I relatively often have a mobile app codebase, a backend API codebase used by the mobile app, terminal sessions pertaining to each of those codebases, and documentation for the tools used in either or both projects up all at once.

My eyesight isn't great, so all of those have fairly large font sizes.

Not every project needs that much context. When I'm hacking on my Emacs config, a two-pane split is about all I can ask for (and one will do the job just fine).


That's also my philosophy (in sway). In my first workspace I keep all the terminals that belong to a specific task, ws 2 is for another task, ws 3 for web browsing, ws 4 for irc|mail|signal, etc. ... and most of the time I am in fullscreen so I won't get distracted (by things like polybar/waybar etc.) ... it really did wonders for my attention and productivity.


I definitely agree with that take when talking about running two 4k monitors at full resolution, but it doesn’t address the size at all. The more reasonable approach would be to use two ~38-42” monitors.


One 32 inch 4k monitor massively improves my ability to get stuff done. I'm tempted to get another one to see how it goes.


I used to love "smaller" fonts and whatnot. Unfortunately my 45 year old eyes can't do it anymore. These days I stick with 2 32" at QHD. I've always used my laptop open for things like the terminal etc, but I find that's a bit too small unless I'm sitting directly in front of it (coffee shop etc)


Kind of funny that the idea of constantly flipping between a bunch of things increases productivity. I’d imagine if this person had instead limited themselves to a single 15” laptop screen their productivity would’ve increased over their baseline because they’d be focused on one thing at a time.


Honestly, I doubt it - I watch files/diffs side-by-side constantly - at a 120-character line length with a 15-inch MBP there's just no way, even if I kill the explorer panel. I can just barely fit it when I zoom out one level from my comfortable font size and have the panel closed, but that's not a great experience for me.


My point was that committing to a change that you think will make you more productive is likely to make you feel more productive.

People don’t like to admit they’re wrong and really like to convince themselves they were right.


Don't see how it increases productivity compared to normally sized 4K screens? You can fit just as much into them?

Edit: Or you might be able to not scale text as much. But that just means the screen is too close, I feel.


Slight skepticism should be applied to any productivity claim. There was research to determine if changing X in the office boosted productivity. But in the end it was found that X didn't matter, within reason. Changing any X boosted productivity and the benefits often diminished with time. I thought this was called the Hawthorne Effect but I am no longer sure.


Hawthorne Effect is right, although it is disputed [1]. Interestingly, the interpretation that I remember is that productivity increased because the subjects were part of an experiment and being observed.

[1] https://en.wikipedia.org/wiki/Hawthorne_effect


I am not sure if I understand this claim. It looks like this setup has the same density of information as a typical 27in 4k monitor.


So, I look at this and it reminds me of a trivial experiment I did in a hotel room, which is sort of the opposite of what these people are doing.

Simply, I took my phone and held it up in front of me, a comfortable distance from my face, and covered the hotel TV with it.

At that point, it's easy to realize that the phone screen isn't necessarily too small, at least for "TV" watching, because, net, the screens are the "same size" in terms of consumed field of view.

In the end it was the same if you could watch it comfortably (which is a different problem).


It does make a difference on eye strain, since they need to focus to a point much closer.


You can also shift to more postures the farther back the display is and it will still take up roughly the same field of view no matter how you move, whereas with small monitors you can't both sit up straight and lean back without drastically changing pixels per degree of vision.


A constant focal point is not good for eye health either. I had considered converting my two-monitor setup to one traditional monitor and one projector that is >10 ft away. But projectors have too many disadvantages atm (e.g., high power consumption, brightness, etc.).


Perhaps an alternative would be to suspend a screen from a drone that is constantly moving around the room. It would combine eye muscle exercise with neck exercise. For extra points, the screen could be configured for me to chase after it, providing ever more opportunities for exercise.


Aside from the eye issues, with a phone size screen mounted externally, moving your head 2 in would be like moving your head 2ft, so you couldn't have any variation in your posture without the screen being uncomfortably off-centered. Think of a movie theater where you can move several seats or even rows from the center sweetspot seat and still have roughly the same view.


I’ve found that a 32” QHD monitor is the highest PPI my eyes can comfortably read. I tried to use a 4K TV as my monitor, but I needed something like 55” up close to read it comfortably. It was simply too large and I had to move my neck too much. Also, that was the day I learned some pixels are chevrons!


Windows supports fractional display scaling with perfectly crisp text and graphics. macOS doesn’t, but it can do integer scaling at least. (Edit: I’ve been corrected. Apparently macOS does do fractional scaling, but in a way that causes blur.) Text size and resolution don’t need to be bound together.


macOS has scale factors between 1x and 2x. It doesn’t tell you what the scale factor actually is as a number, just a couple of options between “More Space” and “Larger Text”.

IIRC the way it’s implemented is by rendering at 2x and downscaling, so sometimes it’s not as pixel-perfect as Windows. But the simpler implementation for software developers meant they had something like a 10-year head start compared to Windows, where software takes a long time to support new OS features.
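
A concrete illustration of that render-at-2x-then-downscale approach. The panel size and "looks like" options below are just a common 4K example; the exact list of scaled modes varies by display:

```python
# macOS "scaled" HiDPI modes: the UI is rendered into a 2x backing store at the
# chosen virtual size, then the whole framebuffer is resampled to the panel.
panel_w, panel_h = 3840, 2160                    # a 4K panel, as an example
looks_like = [(1920, 1080), (2560, 1440), (3008, 1692)]

for w, h in looks_like:
    backing_w, backing_h = 2 * w, 2 * h          # rendered at 2x the virtual size
    exact = (backing_w, backing_h) == (panel_w, panel_h)
    note = "pixel-exact" if exact else "resampled (slightly soft)"
    print(f"looks like {w}x{h}: renders {backing_w}x{backing_h}, "
          f"scaled to {panel_w}x{panel_h} -> {note}")
```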


APIs existed for fractional scale and assets in early Mac OS X versions (10.4 I think). I remember building out 1.25, 1.5 and 2x assets for an application at the time. These were never to be shipped to consumers for a few reasons.

There were intractable issues with window spanning across displays with different scale factors. Ultimately this was resolved by not allowing window spanning on the platform anymore.


> Ultimately this was resolved by not allowing window spanning on the platform anymore.

You can still window span on macOS. You just have to disable the feature that gives every display its own workspaces.

Not that you’d ever want to do that. At least I can’t think of a single good reason why anyone would want this.


This display spanning issue exists in Windows but I think Microsoft made the right trade off by just allowing that one thing to behave strangely. The other issue is pixel-based UIs that don’t fractional scale without blur, but at this point that doesn’t affect any software I use except little utility programs that I’m not staring at for very long anyway.


Wasn’t there an early exploration of using vector UI elements some time around then? I have a vague recollection of it being found as a partly implemented feature.


There are also 3rd party apps for finer control of resolutions and displays on macOS, such as SwitchResX


On my monitors and most recent OS versions it shows the resolution when you hover over the preview.


macOS fractional scaling causes massive system slowdowns; it's completely unusable for coding. Integer multiples are fast.


I don’t think this has any effect on speed even in post-2015 Intel Macs


I use Linux on a Retina MacBook @ 2880x1800 and fractional scaling works well in swaywm. They warn against it in the docs, but no problems so far (though it's only been a week).


So use scaling?

I find this a strange complaint, (not saying you’re wrong!) as to me more pixels should mean better rendering of elements, not dictate their size.


When I replaced all my monitors, I just went with QHD displays instead of 4K, even though I've always scaled in the past. So much cheaper.


Right around QHD is the sweet spot IMO. I use 3 30" Apple Cinema HD Displays @ 2560x1600 (my school was throwing them out), and they're perfect for everything: games, code, movies, whatever. Needs dual-link DVI though, so old GPUs only.


I was so sad when my old super-IPS Korean yamaguzi died. 2560x1600 30" but incredible clarity; modern monitors with higher PPI look so bad and fuzzy compared to it. Dual-link DVI for the win!


32" QHD is ~ 92ppi. Are you saying that you can't differentiate any greater than that, or you get actual physical discomfort above that? If you are talking about UI size - that can be adjusted.


I have the same, and from about 2 feet I really don't need more for anything, including occasional gaming (where the performance difference vs 4K is huge and the visible benefit for me is 0). If I am running out of space, that just means I am messy and doing too much in parallel, which is never a good idea. Something similar to the coding principle of having a method/function no longer than one screen.


My screen on my main PC is a 32" BenQ QHD display. I run it at 100% scaling. I don't like the idea of 4K displays at >100% scaling, as I'm just wasting money on pixels I don't actually use directly.

To be honest, the DPI on this monitor is a little too low... I have a QHD 27" on another machine, and that seems 'just right' (for my eyes).

If 4K monitors were the same price as QHD ones for the same screen size, then maybe I'd re-consider, but whilst they are sold at a premium, I'm not interested.


At this scale, I'm concerned about latency due to the speed of light travel time.


If you aren't gaming then latency concerns are really overblown.


They were obviously joking talking about light speed, but I strongly disagree. Even ignoring how it improves the feel of typing, I'm a mouse-using heretic and it makes a huge difference with mice.


It's just a joke, light only takes 3 nanoseconds to go a meter.


could be 2 nanoseconds if you built it in Rust.


I propose a new movement: RRIR instead of RIIR

Rewrite Reality In Rust!


I have four monitors connected to my Mac. The two central ones are ultrawides (34UB88-P) on top of each other (the lower one is the main window where the action happens; the one above is the secondary, where terminals and other monitoring processes live). The main is large enough for work and references to be next to each other.

The 4K on the right (run at HiDPI) is for various non-work chat programs, reference images, etc. The 4K on the left is for work chat and Finder windows, and the laptop display itself holds email.

It works well. The main advantage is being able to glance at chats without switching windows.


As a fellow multi-monitor geek, I’d love to see a photo of this setup!


How big is your desk?


It’s a normal size desk. The key is lots of vesa mount arms.


Do you think such a setup would be helpful for an elderly relative whose eyesight is getting worse? Even with glasses, the relative has trouble with a 27-inch desktop monitor with 2560 x 1440 pixels. Specifically, I was thinking of using an Ultra High Definition TV (49-inch with 3840 x 2160 pixels) but setting the resolution at 2560 x 1440. So it would be the same resolution but everything would be bigger. Would it help?


I used to work with a guy that was very visually impaired and he used a 40" monitor set to VGA resolution. He would still have to crane his head to the left & right sometimes to follow the mouse on screen.

Going to something like a 49 or 55" television could work but you need to check the field of view at the distance they'll have it set up. Get a piece of cardboard at your largest anticipated screen dimensions, prop it up at their normal viewing distance, then put some high contrast marks on the edges & in the corners and see if they can make them out without too much head rotation. If they can't then cut the cardboard down to the next smaller TV size and try again.

Once you have bought the right sized TV, then adjust the resolution so they can make out screen elements (the window close button is a good one to test against). Then adjust the mouse size, browser zoom, etc.
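
If you'd like a quick sanity check before cutting cardboard, the horizontal angle a screen covers is just 2*atan(width / (2*distance)). The sizes and viewing distance below are assumptions to adapt, and the "comfortable" range in the comment is only a rough rule of thumb:

```python
import math

def hfov_degrees(diag_in, distance_in, aspect=(16, 9)):
    """Horizontal angle a screen of the given diagonal subtends at a viewing distance."""
    aw, ah = aspect
    width = diag_in * aw / math.hypot(aw, ah)
    return math.degrees(2 * math.atan(width / (2 * distance_in)))

viewing_distance = 60  # inches (~1.5 m); measure the real chair-to-screen distance
for size in (43, 49, 55):
    print(f'{size}" at {viewing_distance}": {hfov_degrees(size, viewing_distance):.0f} degrees wide')
# Roughly 35-45 degrees tends to be comfortable without much head turning; much
# wider than that and the corners need the kind of head rotation described above.
```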


It can help for sure. The longer focal distance itself is a huge difference, but I'd test it out somehow (renting, borrowing, etc.) before investing in something like this.

>I was thinking of using an Ultra High Definition TV (49-inch with 3840 x 2160 pixels) but setting the resolution at 2560 x 1440. So it would be the same resolution but everything would be bigger.

I'd rather use OS-level UI scaling; you'll get a sharper, better picture.


> I'd rather use OS-level UI scaling; you'll get a sharper, better picture

I tried various OS scaling methods. The problem with "magnify or zoom in on the whole screen" is that it's very cumbersome for an elderly person, who easily gets lost about where they are on the screen. The problem with an OS setting that makes all text and icons bigger is that it doesn't apply to everything: many text labels, buttons, menus, etc. remain at a small font.

That's why I'm hoping a hardware-level solution (a much bigger display) is the answer.


Couldn't you get an equivalent focal length by wearing reading glasses?


Only to a limited degree. The problem is that the eye becomes inflexible with age, lowering its ability to modify its "lens power".

Lens power is defined as the inverse of the focal length. Let's say you have glasses that, combined with your eyes, create perfect focus at a range of 5 m. If you look at something 10 m away, you need to add (or subtract) the difference by adjusting your built-in lens.

The lens strength adjustment your eye needs for that is

dP = 1/10m - 1/5m = -0.1 m^-1. This is not much, and even old eyes can do it easily. And even if the eye cannot, the image is going to be pretty sharp at 10 m or any greater distance, or even at distances of about 3-5 m.

Now consider having your lenses adjusted to be perfect at 0.5 m. If you move the object you are looking at to 1 m away, you get:

dP = 1/1m - 1/0.5m = -1 m^-1

This is still easy for a young person, but for someone older it can be straining, or even impossible. Also, even slightly away from the perfect distance, perceived blurriness will be significant if the eye can't adjust.

Furthermore, to look at something much further away, the eye must adjust by dP = -2 m^-1. This is doable for a young person, but an older one will be moving around in a fog.

In other words, a screen that is 3 m+ away from you requires far less lens compensation from the eye (and thus far less strain) than a screen 50 cm away, even with perfect glasses, as long as you are not sitting perfectly still at that exact distance. Even the variation in distance between the center of the screen and its edges may be enough to cause problems.
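For anyone who wants to play with the numbers, here is a minimal Python sketch of the same lens-power arithmetic (the distances are just the illustrative ones from above):

  def accommodation_change(focus_dist_m: float, target_dist_m: float) -> float:
      """Lens-power change (diopters, i.e. 1/m) needed to refocus from the
      distance the glasses+eye combo is tuned for to the target distance."""
      return 1.0 / target_dist_m - 1.0 / focus_dist_m

  # Glasses tuned for 5 m, looking at something 10 m away: tiny adjustment.
  print(accommodation_change(5, 10))      # -0.1 D

  # Glasses tuned for 0.5 m (a desk monitor), looking 1 m away: 10x larger.
  print(accommodation_change(0.5, 1))     # -1.0 D

  # Same near-tuned glasses, looking far into the distance: about -2 D.
  print(accommodation_change(0.5, 1e9))   # ~ -2.0 D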


Thanks for taking the time to write this. I learned a lot.


Why not just set the desktop resolution to 1280x720? It's exactly half of QHD, so everything will be twice as large, with perfect 2x pixel scaling.

For a 4K screen, 1920x1080 is half of 4K, so the same point applies.
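For what it's worth, those "perfect 2x" modes are just the native resolution with both dimensions halved; a quick sketch (purely illustrative):

  # Integer (2x) scaling: each logical pixel maps to an exact 2x2 block of
  # physical pixels, so nothing gets softened by fractional scaling.
  native_modes = {"QHD": (2560, 1440), "4K UHD": (3840, 2160)}

  for name, (w, h) in native_modes.items():
      print(f"{name}: set the desktop to {w // 2}x{h // 2} for exact 2x scaling")
  # QHD -> 1280x720, 4K UHD -> 1920x1080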


Why change the resolution and not the OS scaling?


Depends on the OS.... not everything scales properly, especially older applications.


My grandmother used to have difficulty reading her email, even with double font size, low res, etc. Over the summer, I replaced her monitor with a 32-inch 1080p TV that I had lying around, and it makes a big difference.

I think your idea would definitely help.


No. If their eyes are unable to resolve the smaller pixels, it will offer no benefit. It would be better to get a large standard-resolution screen. However, I would avoid smart TVs, as they are extremely consumer-hostile and may blare ads over your relative's attempts to use the computer. A 27"+ monitor made ~10 years ago would be ideal.


> If their eyes are unable to resolve the smaller pixels it will offer no benefit.

Sorry, I don't understand. The pixels would be bigger in the idea I'm proposing.

> A 27"+ monitor made ~10 years ago would be ideal.

That's exactly what they have now. I'm suggesting replacing it with a 49-inch monitor (an Ultra HD TV) set to the same resolution. The pixels would therefore be bigger.


It depends on the distance they can best focus on. Then choose the display size accordingly. I.e. a huge display won’t work if they are near-sighted.


How big is the monitor? A 27" 1440p monitor is about 110 ppi, a 32" is about 92.
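For checking these numbers: PPI is just the diagonal pixel count divided by the diagonal size in inches. A quick sketch:

  import math

  def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
      """Pixels per inch of a panel with the given resolution and diagonal."""
      return math.hypot(width_px, height_px) / diagonal_in

  print(round(ppi(2560, 1440, 27)))   # ~109 (27" QHD)
  print(round(ppi(2560, 1440, 32)))   # ~92  (32" QHD)
  print(round(ppi(3840, 2160, 43)))   # ~102 (43" 4K, for comparison)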


The biggest challenge with a setup like this would seem to be mapping the user’s focus to the UI focus. Traditional pointing devices and window selection highlighting would be suboptimal.

But I expect these problems will be solved via gesture or eye tracking + new modes of interaction for AR/VR. In particular, curious what Apple’s going to end up doing for their headset.


I don’t know if we’re talking about the same problem, but I previously solved something similar by binding a key that modified mouse speed. So if I needed to cover large distances with the mouse, I'd hit the modifier and it would zip over super fast; let go of the key and it was back to normal speed.

I replaced my desktop/keyboard/mouse with an Xbox controller/rpi for a while in college just for kicks. Worked great.
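Not how the parent did it, but one way to approximate the modifier-key trick on an X11 desktop with libinput is a small key listener that toggles the pointer's acceleration. A rough sketch (the device name is hypothetical; check `xinput list`), assuming the pynput package and the xinput tool are available:

  import subprocess
  from pynput import keyboard

  DEVICE = "Example USB Mouse"   # hypothetical name; find yours with `xinput list`
  FAST, NORMAL = "1", "0"        # "libinput Accel Speed" accepts -1..1

  def set_speed(value: str) -> None:
      # Adjust the pointer acceleration for the given device.
      subprocess.run(
          ["xinput", "set-prop", DEVICE, "libinput Accel Speed", value],
          check=False,
      )

  def on_press(key):
      if key == keyboard.Key.caps_lock:   # hold to zip across the screens
          set_speed(FAST)

  def on_release(key):
      if key == keyboard.Key.caps_lock:   # release to return to normal speed
          set_speed(NORMAL)

  with keyboard.Listener(on_press=on_press, on_release=on_release) as listener:
      listener.join()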


This reminds me of a vivid dream I had in university, where the room I programmed in had source code displayed on the floor and I could scroll or walk around and browse it. The listing on the floor had that wide green-and-white band printer 'paper' look. I don't remember if it went all skeuomorphic with rendered perforated edges and holes. It also had other printer-type things that were just sitting there, not printing at the time. And for a reason I never understood, there was a large pixelated volumetric figure of my housemate Bill also in the room, like a 3/4 life-size Lego figure. As I interpreted it at the time, I didn't think the figure was generated in real time, so it couldn't instantly change into something else. It didn't feel virtual, and augmented reality wasn't a concept at the time. I suppose in theory it could be fully rendered in room-scale VR if my hands and arms looked and moved accurately enough.


I find curved displays much nicer to look at on the edges. I hope spherically curved monitors come out some day too.

I use a 32" 2160p 4K curved display (1500R). It's harder to find than the 1440p curved displays but they're out there.

Also, I find I prefer looking straight in front of me best and don't really use anything more than a single 4k display. I try to improve my app switching skills & tools before using more monitors.


This is a scaled-up version of my setup for my adult son, who is legally blind. We set up two of these as our "Zoom" stations at the beginning of the pandemic: a good chair/sofa up close to the TV, and a 55" UHD TV connected to a Lenovo laptop with a wireless keyboard and trackpad combo. My son's TV is also connected to the cable box, Wii, and DVD players (regular & Blu-ray).

These are not some sort of high-performance workstation, just a cheap and dirty solution. Everything was already in the house: laptops from 2016, TVs and keyboards from the last couple of years of Black Friday sales, and a few HDMI cables.

I saw someone else's setup with a dual recliner that had integrated fold-away TV tables that the keyboard and mouse sat on. It was a gaming setup.


I recently turned 50 and though I've had to wear glasses for long distance vision my entire life, my arm-length distance has only just succumbed to age. I just got a new prescription, but for the past year I've been bobbing my head back and forth to my screens like a chicken.

The solution is now bifocals, or multi-vision contacts which leave everything - both close and distant - slightly blurry. I can't say I'm in love with either solution, but that's life.

But hey! I like this idea! I could dial in my contacts prescription for 20/20 distance vision and just sit back in my La-Z-Boy. It might be a bit awkward going back to the office...


I’ve used 4K 39”/40” televisions (originally 3, now 1) for the past 8 years. I find that the panels look great, and they give me a huge working area. They’re also really reasonable - $250 - $350 for a good TV. They last a while.

Smaller 4K/5K panels (with more pixels per inch) are nice, but I never understood the push for density. I’d rather have more workspace with lower density.


Higher PPI makes things clearer and easier to read for me.


Hi-DPI panels give you more choice for UI density.


I use a 43" 4K LG monitor (not TV) as my every day work screen, positioned about five feet from my face. I have macOS set to scale it up about 25". I can't really say I'm more productive on it than just using my laptop, but I definitely notice less eye strain when I'm using the big monitor.


Speaking of monitors, I got one of those absurd 49" ultra widescreen displays a few months ago. The resolution is 5120x1440, so it's functionally equivalent to two 27" 1440p monitors placed side by side.

The text is naturally not as crisp as a 4K monitor, but the amount of screen real estate is great for development. Paired with a tiling window manager it gives so many options. I did the dual 27" 1440p thing for years, and I find this setup superior. With the dual monitor strategy you have to either deal with a huge bezel right in the center of your vision, or push the secondary monitor off to the side, requiring more neck movement to see what's on it. With this monitor, everything remains much more clearly in my field of view.


Biggest gripe for me with the curved displays is that 1440px in height is not really that much. Kinda blurry when one's used to a 4k display with 2160px over the same vertical distance.


Yeah, if you're used to 4K, the resolution will definitely feel like a step down. For my personal usage, I've found the horizontal space to be really beneficial for development. YMMV.


I would be less scared of one of those monitors if I could use it as two virtual monitors - that is, the OS would split it down the middle and treat each half as its own display, with separate wallpapers (if desired) and the keyboard and mouse shortcuts associated with tiling/managing dual monitors.


I believe all of them have a "picture by picture" mode where the left half shows one input and the right half shows another. You could achieve what you want with two cables, but I haven't felt the need for it. I use Microsoft PowerToys to tile a 16:9 area in the center and two half-width areas on the sides, and I am very happy with that setup. I am sure Linux window tilers have even better capability.
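On Linux/X11 there is also a software-only route: xrandr can carve one physical output into two logical monitors, which most window managers will then tile and alt-tab as separate displays. A rough sketch for a 5120x1440 ultrawide (the output name DP-1 and the millimetre sizes are assumptions):

  import subprocess

  # Split one 5120x1440 output into two logical 2560x1440 "monitors".
  # Geometry format: <width>/<mm-width>x<height>/<mm-height>+<x>+<y>
  OUTPUT = "DP-1"        # hypothetical; check `xrandr --listmonitors`
  W, H = 2560, 1440      # logical size of each half
  MMW, MMH = 600, 340    # approximate physical size of each half, in mm

  subprocess.run(["xrandr", "--setmonitor", "left",
                  f"{W}/{MMW}x{H}/{MMH}+0+0", OUTPUT], check=True)
  subprocess.run(["xrandr", "--setmonitor", "right",
                  f"{W}/{MMW}x{H}/{MMH}+{W}+0", "none"], check=True)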


The guy claims he uses the keyboard by hunching over the side table[1]. That is so comically anti-ergonomic (most people would get back pain within an hour or two, and be bedridden within a couple of days) that I think this is fake and posted on Reddit for karma/trolling purposes.

Consider also that to focus on another window requires the user to substantially turn his head. This would very quickly cause neck strain.

Our inclination is to believe things are real, but people do just go on the internet and lie. A lot.

[1] https://old.reddit.com/r/battlestations/comments/toecyt/dual...


There's no way anyone is doing any serious typing with such a setup. I can imagine it if keyboard use is minimal. Not sure what job that would be, but I'm sure there are some.


My thoughts as well. This looks like an ergonomic nightmare.


Attaching a split keyboard to the armrests, with a trackball or TrackPoint, would be a sweet spot.


Love it in theory, but the strain of moving your head to see the full dimensions of the screens would get old. Eyes are good at darting around a smaller screen up close, don’t know that the entire head and neck would appreciate that day in and day out.


This is only an issue if the pixel density of the screens is low (forcing you to blow windows up larger, which forces you to crane your neck around more to see an entire screen).

If the pixel density is high enough, you can pack several screens' worth of content within your eye-gazing FOV, and even sit close to the screens.


Sure but then there's tons of screen area you're not using. If you're only going to look at a 32" FOV on a 75" screen, why not just get a 32" screen?


I use 3x 27" 4K monitors in portrait mode side by side with my Macbook Pro screen at the end. This gives me about 10 normal screen sized areas.

I usually have the following things open in specific spots:

  - General Browser (mostly for email and weather)
  - Calendar
  - Terminal
  - Text Editor
  - Desktop Mode Browser for App in Development
  - Detached Debug Console for the above
  - Mobile View Browser for App in Development with attached Debug Console
  - Browser for documentation / reference
  - To Do List
  - Slack


I'll present a different data point. I also used to like multiple monitor setups, using up to three monitors. And then I realized that I'd much rather have a single good quality screen, because there is less fussing around with things. A 27" iMac 5k screen got me enough screen real estate to be happy.

These days I am looking for single-monitor solutions, at 27" — anything larger and you need to turn your head.

(incidentally, 5K vs 4K makes a huge practical difference for me: in my full-screen Emacs I can comfortably fit three columns of code on a 5K screen, but on 4K this is problematic)
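Rough arithmetic behind the three-column point, assuming 2x scaling and roughly 8 effective pixels per character at a typical code font size (both of these are my assumptions, not measurements):

  # How many 80-column code windows fit side by side at 2x scaling?
  def code_columns(native_width_px: int, char_px: int = 8, cols: int = 80) -> int:
      effective_width = native_width_px // 2      # logical points at 2x
      return effective_width // (cols * char_px)

  print(code_columns(5120))   # 5K -> 4, so three columns fit with room to spare
  print(code_columns(3840))   # 4K -> 3, but with zero pixels left for fringes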


Honestly, the graphics quality is the reason I have 3 4K displays. I run them at 2x scaling and they are not fatiguing to use. 5K would be better though, as 1080 pixels wide feels a little tight on some websites.

A single 8K display would probably be better, but well-reviewed and reasonably priced 8K monitors are not really there yet.


Have you tried 2x 27” screens vertically, ideally models with minimal bezel?

I’ve found that it doesn’t increase head turning because it’s not much wider than a single screen, but I can fit twice as much stuff.

Also I can read a much longer file of code at once, which is more useful than you’d expect..


That's been my setup for the last decade (currently a pair of U2720Q monitors). macOS unfortunately does a worse job with "font smoothing" when using the monitors vertically, but it's fine.

The main downside is that since most people (obviously) use their displays in landscape mode, lots of websites and applications are pretty antagonistic to a tall and thin display. (Spreadsheets are often quite bad, but there are lots of places where people eat up horizontal space for padding or side bars that are suddenly super annoying).

There's a bit of a push to larger, ultrawide 5k2k monitors (e.g., Dell's U4021Q or the delayed-by-a-year-not-actually-available LG 40WP95C), but a pair of high-resolution 27s still gives a lot more pixel density. These bigger ones also tend to have lower brightness, presumably due to power and thermal targets per display.


Yes.

That setup required even more fussing around with window setups. I also tried one monitor horizontally and one vertically, slightly better. But overall every two monitor setup has a problem in that my best viewing area (right in front of my face) is occupied by the seam between two monitors.

As for viewing more code, I also thought it would make sense, but I'm no longer sure. I ended up using my entire vertical monitor for a single buffer/file only, because splitting it vertically resulted in buffers that were too narrow, and splitting it horizontally defeated the purpose of using a vertical monitor. I then realized that I'd rather work with a horizontally-oriented monitor and multiple columns, which give me more useful context from multiple files.


I use 3x 27" 5K displays (an iMac Pro and two LG ultrafine 5Ks with the same panel) in an H: iMac in normal landscape, with the two side displays in portrait. It's perfect.

They are all on arms that somewhat encircle my position so it is no issue to code for extended periods on either of the "side" displays.


Yes, three displays might make sense — definitely much more than two. But I found that I don't like to turn my head too much, so I'm not sure if that setup would work for me.


Nice! What scaling do you run them at? I like avoiding fractional scaling; I have two 24" and run them at 2x.

The next step would be 27" 5K for a similar DPI, someday : )


They are 2x. I would prefer 5K over 4K, as 1080px wide is a little narrow for some websites these days, but I have not found a suitably good (and reasonably priced) monitor for that yet.

Also, my old 2018 MBP was at its limit driving them all. My new M1 Max has no problem with them (and would be fine with 3x 5K displays as well), so it's looking more possible now.


I tried comparing a 75" 4K TV to a 32" 4K monitor, and the biggest issue is that you have to sit a bit closer to the 75" TV than typical living-room use would dictate, which means moving furniture, etc.

A much easier trick is getting nice monitor arms that let you position your 32" 4K monitors better, it's often the monitor stand that prevents proper positioning. In my case one monitor is hovering a whole 1" back from my keyboard which would be impossible with the stock stand.
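The distance question is really about visual angle; a quick sketch of the trade-off (16:9 aspect assumed, and the viewing distances are just illustrative):

  import math

  def horizontal_fov_deg(diagonal_in: float, distance_in: float) -> float:
      # Horizontal visual angle of a 16:9 screen at a given viewing distance.
      width = diagonal_in * 16 / math.hypot(16, 9)
      return math.degrees(2 * math.atan(width / (2 * distance_in)))

  print(round(horizontal_fov_deg(32, 28)))    # 32" at ~28" away: ~53 degrees
  print(round(horizontal_fov_deg(75, 66)))    # a 75" matches that at ~5.5 ft
  print(round(horizontal_fov_deg(75, 108)))   # at a typical 9 ft: only ~34 degrees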


Looks cool, but I found that too many monitors means more light blaring into my eye sockets, and I found it caused trouble sleeping normally as well as possibly affecting my vision.

I had a 75" TV but the amount of light that thing emitted hurt my eyes if I watched it too much.

I went back from four monitors to just two: a 32" 4K one and a 1080p one for screen sharing in meetings with people with normal-size monitors. And a smaller TV :)


I guess the idea behind this share is to get people's thoughts. To that end, I'm totally in favor of this honestly. Only reason I won't do something like this is that it's very space intensive and won't fit in my apartment. But if I could I would. Why not? My eyes aren't getting any better with age, and even my costly "good" office chair gets uncomfortable after a couple hours.


The reason this can be superior to desk computing is that it gives you more vertical space. Desk computing removes the lower half of your vertical space because of the desk.

Imagine if, in the future, the desk surface were also a screen: we could put supporting apps like a calendar, notebook, to-do list, etc. on the desk. That would be really nice. In his setup, he could literally add another screen by laying another TV on the floor.


Try using a tablet or any display laid flat on a desk; it's terrible posture and very painful after a while.


You are probably right. But tilting the screen at a small angle may help if it's not the main screen.


I use a single 65" 8k monitor about 4ft away (behind my actual desk). While this affords quite a bit of space, text rendering at small sizes (even with the ridiculous resolution), is simply not ideal. If I were only getting a screen for productivity / dev purposes, I would definitely seek a large monitor instead of a giant TV.


I use a 65" monitor at about 12 feet away for comms, and a 27" monitor close up for code. I'm looking forward to football stadium sized monitors in VR.


Having tried multiple sizes and resolutions, personally I've found 27" 2560x1440 (2K) to be the sweet spot for desktop computing and gaming.

Very happy with https://www.amazon.com/gp/aw/d/B08LCNWQWL


This is a godsend for all of the people that love to read what's on their co-workers' screens.


Half crazy, half awesome. 100% tempted!


This abomination is what happens when you let peasants WFH for way too long.

Seriously though, I have thought about pointing a projector at the ceiling so I could code lying down in bed with zero stress on my back.


I guess it depends on what you do with it, but the cognitive load of whatever situation would need all that real estate... unsettles me.


How is this different from having smaller monitors, closer to the user?

Does it come down to the distances at which the user can comfortably focus their eyes?


I have a 4k 28" monitor which I can't use at its highest resolution with 100% scaling and a 4k 40" TV that is a few inches further away which I can comfortably use at its highest resolution with 100% scaling. I'm a big fan of the setup I have and suspect I would be equally happy with a slightly larger display (maybe up to 55") a bit further away. Maybe the 28" screen would be more usable with computer glasses. With the 40" screen, I can see fine with my regular glasses.

75" seems a bit extreme, as it would require a few more feet of distance between me and the screen to make the screen fit into my field of view. I could see it as beneficial and quite usable in a situation where you need high resolution with many people viewing it.


For many people (especially as you get older), the longer focal distance will be easier on the eyes.


Pretty much. That and how much your view changes when you move your head.

I’ve played with virtual monitors in VR. Everyone loves the idea of having 80 foot virtual monitors. But, all that really does is make it so you can’t move your head in close to examine the low-rez text like you can with smaller virtual monitors up close.


If I had the budget, the space, and my wife's approval, I think the scorpion desk would be better and a lot more ergonomic.

https://www.amazon.com/Imperator-Scorpion-Gaming-Computer-Of...



Hah, just looking at this gives me a headache. When I last tried such experiments (5+ years ago), I found that none of the display panels I had access to played nice with sub-pixel font rendering. Also, now that I'm used to smaller 120 Hz panels, I can never go back to 60/75 Hz.


I think I've recently been fiddling with my monitor when my real issue is the angle of my arms and my posture due to my chair, even though it manifests as being tense and unable to focus on the monitor.


I use a 4k 49" tv on my desk as a monitor. Let's me snap 4 screens to the corners. The neck strain isn't great, though sometimes I will underscan the screen and that fixes the neck strain.


I use a 40” screen about 80 cm away, and it’s almost the perfect size. I think it would be even better with a curve, because text right on a corner (like a terminal prompt) is a bit of a stretch.


Regretfully, curved UHD 40" displays aren't sold anymore. The MMD Philips BDM4037UW was available from Dec '16 till summer '19.


Too bad, that hits really close — my monitor is a Philips BDM4065UC.


The Philips BDM4065UC has a better speaker, but its corners are difficult to see, in my experience.

That’s why I prefer the BDM4037UW and add an external speaker.


The guy would have been better off spending some money on nice furniture, imho.


And 8k is coming....

I'll probably get a big 50-55" 8k for the main and try to put my 1-2 40" 4ks on the flanks.

If you're not twitch gaming, TVs are such a better bang for the buck.


Wonder how many years before every wall is a screen that will be installed like wallpaper.


Good luck sharing your screen on a remote call with that.


That's a really tiny chair and keyboard.


No.

Just.

No.



