Hacker News
The Best Monitor for Programming: A Cheap 40" 4K TV (masonsimon.com)
285 points by masonicb00m on Nov 27, 2017 | 213 comments

If you enjoy a low density of 110 pixels per inch, then this is for you.

I'm much happier with the high-DPI displays in my ThinkPad Yoga 460 WQHD and MacBook Pro Retina 15".

I don't like pixels, and I don't want to see them any more. I want to see the text and graphics I'm working on, not pixels.

So when I wanted a second monitor to go with those machines, I got a 24" 4K display. At 187 pixels per inch, it's a reasonably close match to the 210-220 pixels per inch on the laptops.

I use this monitor in portrait mode next to the laptop display. A portrait mode monitor combined with the laptop monitor is great. I've got a wide screen when I want it, and a tall screen when I want that. It's ideal for reading documentation and especially PDF files, because an entire page fits on the portrait screen. Larger monitors are not very practical in portrait mode.

I'm not concerned with cramming the maximum amount of stuff on my screen. I run the monitors at 225% scaling in Windows, and the default Retina scaling in macOS. So text is about the same size it would be on a lower-DPI monitor, just much sharper and crisper.

After I got my first taste of high DPI with that MacBook, I swore I'd never go back to a low DPI display. It is so nice to not have to look at pixels any more.

I also use a 40" 4K TV. The vast majority of what I do on my computers is text-based, which means that after the text is easily readable, I don't get any real gain from increasing the pixel density of the same shapes, but I get a lot of gain from having more text and types of textual things on the screen at once (as the article author apparently does, as well).

Until recently, I used a 40" 4K monitor. For whatever reason, TVs handle input switching much, much more smoothly than monitors do: if I have two machines hooked up to different inputs of a monitor and reboot the one I'm working on, the monitor will often spontaneously choose to show me the other input. A TV in the same situation will happily show "no signal" and stay where you told it to stay. I've only tried two brands of monitor (Planar and Philips), admittedly, but they behave identically in ignoring the manually-set input if some other input seems more interesting. :/ I had been using a Philips 40" 4K monitor for a couple of years, and the frustration with input switching drove me to the same Sceptre TV from the article, which is frankly worse in every picture-quality-related metric than the Philips, but doesn't instantly switch inputs when the signal is interrupted, or blank and re-blank the screen 2-12 times when it hasn't been on for a while.

Well, that went a bit off track.

I know it is a new/budget brand, but I've now purchased 3 different TCL 43S405's (work, home, wife's work). It is a 43" 4k monitor, with low lag, low motion blur and 60hz refresh rate. I've been extremely happy with them (once you turn on game mode and turn off HDR). $350 before black friday, they're $300 at Target right now.

Also, if you want a fantastic site to compare tvs/monitors apples to apples, you can't beat rtings.com. They grade every device on the same rubric so it is easy to compare and contrast. You can even tell it which qualities you care about most (or what use case) and they rate them accordingly.


Wow - 15% off of that as well, and 5% with Red Card - picked it up for around $260. Great tip.

If you were good with 1920x1080 at 24 inches, you get the same pixel density with a 4K display at 48 inches. I've been looking for a 50" curved TV but might have to settle for a 55", which will be fine for my slightly older eyes. I grew up typing my code on an analog TV, so I'm good with any modern monitor. Having said that, two 24" monitors side by side is nice. Turning one of them into portrait mode is sometimes nice. Having one giant display where I can lay all my stuff out like a real desktop would be awesome.

I have a 24" FullHD monitor next to a MBP 15". The picture on the former is so poor compared to the latter that I only use it for reading email and web browsing, and do all the code work on the 15" high-DPI screen.

Under Linux, the situation is not as bad, since you can persuade Xft to optimize font rasterization for readability. OS X, though, makes the choice toward glyph shape fidelity, even at lower point sizes, which results in horrible anti-aliasing blur.

Possibly related: some Macs render terribly blurry text when the monitor color profile is not to their liking--look for the "sRGB" toggle, and toggle it.

Especially when using HDMI, a MacBook often thinks the display is a TV. You might want to google patch-edid.

Wow. I wonder if it's a part of common lore which I was somehow oblivious of.

I use a curved 55" UHD from Samsung as a TV/monitor; you need to keep your distance from it, as up close the picture gets a bit weird: first, you need to turn your whole head because it's too much for eye movement alone; second, there are wide-FOV distortions, like when you put your head close to a normal monitor. The dynamic range is much lower than EIZO/LG pro monitors, and the colors are washed out. Generally it's fine for a SteamBox, for playing games, and as a real-world preview of how a movie/photo would look on an actual consumer device, but you won't get anywhere near the quality a proper pro monitor gives you (even when you enforce 4:4:4 mode instead of the default 4:2:2). I have 3x 4K displays (30.5", 28" and 55"); I can't go below 30" now...

I got the Samsung 49" 8500. Highly recommend for coding.

That doesn't come in a curve does it? Since I want to layout the desktop similar to a multi-monitor setup I figure the curve will help to angle the sides in. Do you find it awkward to look at windows far off to the side with a flat monitor that size?

8500 has a curved screen. It is definitely an improvement over a flat screen for looking to the side.

But I would want to sit twice as far away from the 48" screen, so that much pixel density isn't so necessary. (Actually, I have trouble actually picturing working on a 48" screen, the distance would be awkward...)

48 inches is a far off distance from the screen, I think you'd typically be in the 20"-30" range

He wasn't referring to the distance from the screen, but rather the size of the screen -- if you go from 24 inches to 48 inches you quadruple the area, and if you also quadruple the number of pixels, there are the same number of pixels per unit area. So if a larger screen turns out to be too large, you can just use a subset of it and emulate any smaller-sized screen without having larger pixels.
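The density argument above is easy to check with arithmetic; here's a minimal pixels-per-inch sketch (the `ppi` helper name is mine):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: pixel diagonal divided by physical diagonal."""
    diag_px = math.hypot(width_px, height_px)
    return diag_px / diagonal_in

# 1080p at 24" and 4K at 48" have identical density (~92 PPI):
print(round(ppi(1920, 1080, 24)))  # 92
print(round(ppi(3840, 2160, 48)))  # 92
```

Doubling the diagonal while doubling both pixel dimensions leaves PPI unchanged, which is why a 48" 4K screen behaves like four tiled 24" 1080p screens.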

Exactly. It should be like having multiple monitors. I would never run an app at full screen on that.

Ah, gotcha. I misinterpreted there.

I tried 24"/1200p, 27"/1440p, 27"/4K/150% scaling, 32"/4K/125% scaling, and now settled on 43"/4K/0 scaling.

The DPI looks like 1080p on a 21" from years ago, but the real estate is enormous and I certainly don't mind it. Sure, pictures/movies on a 40" are not as pixel-rich as on a 32" 4K or 27" 4K, but you'd have to mount multiple of those to match the real estate of a 43" monitor.

It's true that scaling makes text larger without reducing DPI, so the picture looks much better on a 27"/32" than on the 43". But without scaling, my aging eyes can't handle anything greater than 110 PPI. https://www.sven.de/dpi/

Just put the TV further away, it has a similar effect to increasing the DPI, but also it reduces the amount of optical accommodation your eyes need to do (ie eye muscles stretching your lens flatter), which reduces eye strain and must surely reduce progression of myopia. 14 hour days with no eye cramp. I think 4k is starting to approach the upper useful limit of detail you would want on a rectangular monitor that you look at without moving your head, it's a nice bump over 1080p. 1080p upscaled to 4k also looks better than 'raw' 1080p.

That would work well if you can always use the TV/monitor at a farther distance.

This wouldn't be an option for me. Although I do usually work at my home office, I also need to be able to use a laptop on its own when I'm traveling or just want to get out for a while. So that determines the 20" focus distance I need.

With single vision prescription glasses tuned for that distance, my eyes don't have to do any of the optical accommodation you mention. My glasses are set up to let my eyes be relaxed at 20"; there's no strain of my eye muscles trying to adjust to a different distance.

Yes, you need the space.

There was a site I was reading some time ago that mentioned that with glasses, the geometry of your eyes converging on a nearby point (ie slightly 'cross-eyed') doesn't correspond with the eye muscles being relaxed for focusing at 'infinity'. The 'control systems' are likely coupled. I've used glasses with smaller screens, and I've noticed my eyes behaving a bit strangely (it perhaps seemed harder to focus) when alternating between glasses/no glasses. No idea if there are any other consequences. Maybe it's fine.

I use a 28" 4K and have used a 40" 4K TV for programming in the past.

The 40" had total crap quality but still had some advantages in terms of usable screen real estate.

There are a bunch of 40"+ real monitors now though: Dell P4317Q, AOC C4008VU8, LG 43UD79-B, Philips BDM4037UW - they are all using the same panels as TVs, I assume, but hopefully the higher quality ones, and their electronics will be designed for computer use.

I wish my vision was so good. I can’t see any difference so I’ll take the low dpi option.

I don't want to repeat the same comment too many times in this thread, so I'll just mention it one more time here. If you can't see the difference between a low DPI and high DPI display, it's time for a vision check and a good pair of prescription single vision computer glasses.

This is like a reading prescription, but adjusted for a farther distance. Reading glasses are typically adjusted for a 16" focal distance. That's much closer than you would use a computer. If you use a laptop on its own without an external monitor, that's usually about 20".

I took my ThinkPad to my optometrist the first time I did this, so we could be sure of the correct distance. When I saw how crisp and clear my screen was with the prescription, compared to the blurry mess I'd been suffering with... if I may indulge in a pun, it brought tears to my eyes.
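The reason a "computer distance" prescription differs from ordinary reading glasses follows from the thin-lens relation (lens power in diopters is the reciprocal of the focal distance in meters); a hypothetical sketch, with the `diopters` helper name being mine:

```python
def diopters(distance_in):
    """Lens power (diopters) needed to focus at a given distance, thin-lens approximation."""
    return 1.0 / (distance_in * 0.0254)  # diopters = 1 / focal length in meters

print(round(diopters(16), 2))  # typical reading distance: ~2.46 D
print(round(diopters(20), 2))  # typical laptop distance:  ~1.97 D
```

That half-diopter gap is why off-the-shelf reading glasses tuned for 16" leave a screen at 20" slightly out of focus; an actual prescription of course depends on your eyes, not just this geometry.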

I can think of another possibility, which is that in order to compare a 1080p display and a 4k display properly, you need to ensure that both displays are reasonably calibrated for quality. TVs out of the box are tuned in a mode I think of as "claw your eyes out" mode, with horrid preprocessing and sharpening and all kinds of other things designed to make the TV pop out in the display room, with no regard what it does to the underlying video signal or your eyes over time. Two displays in different resolutions, both left in this mode, may look roughly equally crappy when you're sitting at home feeding it a normal video signal, rather than a crafted demo. The article alludes to this when it mentions turning down the sharpening, but typically the "mode" the TV ships in initially will do other things too, like pop the saturation up and such.

The first thing you'll want to do with a new TV is to take it out of that mode. The "correct" mode depends on the TV and some internet searching can help you out, but putting it in the cinema/movie mode is at least a first good guess.

Finally, of course, you need to feed the 4K monitor a good 4K signal, or the 4K monitor isn't going to look any better. And as the article mentions, that's sometimes easier said than done, because HDMI can't always handle it.

If after all that, you can't tell the difference at all between 4K and 1080P at point blank range on a monitor setup like that, you may have eye issues, yes, absolutely. And you may still find that you don't much care about 4K. I've got 4K on my laptop, which is nice in a couple of ways, but in general I'm really rather "meh" on it for development work. I've got two 1080P monitors at work and prefer that to a 4K setup. But I also don't care about pixelization very much compared to a lot of people, and never have. I used to read books on my 160x160 monochrome Palm Pilot, and I guess everything looks good after spending my formative years on a Commodore 64 with a 1980s vintage television. But I do at least see the difference and can easily understand why others have other opinions.

Reading books on a Palm Pilot... that brings back memories! The Sony Clie was a great little gadget.

Going to try this! Thanks for the tip.

I can certainly see the difference between low- and hi-DPI, but thanks to subpixel rendering (which unfortunately is going out of style because of hi-DPI displays), it doesn't really bother me - call me old-fashioned, but I'm more bothered by the waste of pixels. 2560x1440 is a pretty good resolution for a monitor, and nowadays it's crammed into the 5" display of a phone, just so that it looks pretty. Ok, unless you use it for VR, then even more DPI would be nice...

I thought the same as my vision isn't exactly great, but you can tell the difference and I wouldn't go back. At least between my old ThinkPad T440 (1440x900) and my 15" rMBP that is.

I went with Acer’s xr382cqk

Enough screen real estate that you don’t feel pressured to add another monitor, high enough DPI to look good, low enough to not need scaling, connects to USB C MBP with one wire for video, charging, and a USB 3.0 hub.

IMO the perfect monitor.

Yep got one too - refurb (buydig via ebay) for $799.

Plays nice with my PC and 2015 macbook pro.

Very happy.

Why would you be against adding another monitor? When I have two monitors, I benefit from angling them so that each one is perpendicular to my face as I turn to look at it. The larger the screen, the lower the percentage of that screen that is perpendicular.

I mentioned this in a comment below:

>It’s a 38” 21:9

>I had a 34 inch with a second 27”, but a single 38” hits the sweet spot of not needing another monitor.

For a short while I had the second screen with the 38” screen but it was just overkill (the 38” alone is enough to cover a large portion off the usable area of my desk, so the 27” mostly just sat out of view)

This is why curved computer monitors make sense, and curved TVs don't (multiple viewers in most?/many cases).

A quick DDG for "xr382cqk" says that's a 34 inch, 21:9 aspect ratio curved display.

It’s a 38” 21:9

I had a 34 inch with a second 27” monitor, but a single 38” hits the sweet spot of not needing another monitor

Ah, it seems the copy on https://www.acer.com/ac/en/US/content/model/UM.TX2AA.001 is incorrect - the specs below do say 37.5", I didn't notice the discrepancy before.

Funny because for coding, I'm almost happier running turbo pascal UIs ..

Turbo Pascal was an outstanding product.

yes, despite being vga and much pixelated

I still haven't found anything that can render code as nicely as a good bitmapped font on a non-HiDPI display can look. I do sometimes code on a retina MacBook Pro, and it's OK, but it's nowhere near as nice to look at as a good non-smoothed coding font on my 110 ppi desktop display.

Obviously, different people will have different tastes here.

Exactly! So glad to see the HighDPI fans gaining acceptance. 110PPI looks like garbage and needs to be relegated to the dust bin.

Why do you want to throw out other people's monitors when they're working fine for them? (Source: I also have a 40" 4K monitor at home, both for coding/reading and gaming.)

Wait until you're a bit older. 110PPI and retina display will both look equally good (or bad) to you.

Is 65 old enough? ;-)

Of course if I take off my glasses, then every monitor looks the same. But with proper correction, my vision is better than 20/20.

With my prescription computer glasses, the difference between low DPI and high DPI is like night and day. I can see every pixel on a low DPI monitor, while high DPI monitors look beautiful to me.

Please, everyone: do yourself a huge favor and get your vision checked and get some single vision prescription glasses if they are called for. It's hard to overstate the difference this will make in your quality of life.

Edit: Not sure why you were downvoted for a reasonable comment. It's true that your vision will get worse as you get older. Even if nothing else goes wrong, you lose your ability to focus your eyes dynamically. The lenses harden up and lock into a specific focal distance.

However, many of these vision changes are correctable. Progressive lenses are a great solution for "out and about" use. You can look up and down to see far and near things. They aren't very good for computer use, and that's where the single vision lenses come in.

It is a minor nuisance to switch back and forth between the computer glasses and the progressives, but being able to see clearly makes it all worth it.

Yes, not quite sure whose ire I raised, but I'm in the older quadrant myself and I do need my computer glasses to work.

No, this is a huge myth. I'm getting older (42) and my eyesight is getting worse, and I can still tell the difference between 110 and 220. Heck, I could easily tell the difference between 150 PPI and 220 (a Hitachi 28" 4K vs. an iMac retina 5K).

Coding without seeing pixels in font rendering is just bliss, it is too hard to go back.

This sort of long-sightedness is usually correctable.

Though I’m dreading the day I have to get varifocals.

I tried varifocals once, gave up on them immediately. They just didn't work for me, I was always trying to see something through the wrong part of the glasses. Now I have two pairs, one for normal distance and one for computer work.

:) we might have different vision. I can’t resolve individual pixels with this setup unless I put my face near the screen.

Everyone has different vision, of course! If you don't mind my asking, what is your viewing distance to the screen?

One big consideration for me: I'm 65, so my eyes are not able to focus back and forth to different distances like a younger person.

My solution is a pair of single vision prescription lenses, adjusted to the distance to my monitor(s). I need to be able to use the laptop all on its own, so that determines the focus distance: about 20".

Because of that, I also place any secondary monitor at about the same distance from my eyes, otherwise it would be out of focus. I can't go changing glasses to look at a different monitor. :-)

A 24" 4K display works nicely for this, and as I mentioned it's small enough to work really well in portrait mode.

So I have two recommendations for any programmer. If you haven't used a portrait mode monitor, try it out alongside your laptop display. It's a very practical setup.

But more importantly, if you find yourself gravitating toward lower-DPI monitors, it may be time for a vision check and a good pair of single vision prescription glasses adjusted for your monitor distance (again, about 20" if you ever need to use a laptop by itself). I put off doing this for years - I always thought of myself as the kind of person who "didn't need glasses."

When I finally got the prescription lenses, it was a real eye-opener - pun intended!

Yes, for me 1920x1080 on say a 20-inch is the absolute minimum with these LED monitors that are all the rage now. Even the old CRTs looked better with a lower pixel density; I wonder why

The CRTs looked better because there was more "bleed" from pixel to pixel. A great example: if you were in the iOS ecosystem when the iPhone 4 came out, every outdated iOS app looked terrible. The 4 had the same screen size but double the PPI of the previous generations. An outdated app looked better on a pre-4 phone than it did on a 4, because the bigger pixels of the earlier phones bled into each other more, while the 4 created crisper edges that were incredibly jarring.

The TV mounted on blocks and the Hackintosh suggest the author is willing to compromise and using a TV as a monitor confirms it. Don't get me wrong, I also run a Hackintosh and have had monitors stood on boxes, crates and books over the years, but when it comes to displays I really don't think you should compromise.

I have converted many professionals away from TVs, because when you can actually compare them to real quality monitors, it's usually a no-brainer to abandon the cheap TV. No trick, no incentive for me, just the better choice. It is also why every employee in our company has a pair of IPS displays.

I've looked at 4K TVs as options for a monitor multiple times, from the early Seiki TVs that hit at the start of the pricing change to what's been on offer for Black Friday. Don't do it to yourself unless you spend most of your time gaming/consuming content on the same system, or are just using it to monitor things.

If you are going to be sat in front of displays for 10,000 hours a year working/focusing (like many of us I am sure) then investing in something like a pair of Dell U3415W, LG UM95 or LG 31MU97Z-B will give you a much nicer experience. Totally appreciate that isn't a $300 solution, but as professionals we should invest in (or demand from your employer) good tools.

To those of you who are using 4K TVs, it is obviously hard to get in front of these monitors and see the difference, but if you can you should - you've already made the choice to go for something better than a cheap 22" 1080P monitor, there are a lot of options to create a great environment to do your best work.

The article is just pure marketing anyway for referrals, so make of that what you will.

You didn't actually say _why_ monitors are better for the author's use case: programming. "A much nicer experience" - how? in what way? is this quantifiable in any way?

Usually a computer monitor is higher DPI, and more adjustable for use on a desk.

This is a very limited sample size but I own and use a 50" 4K monitor from Wasabi Mango and a 50" 4k tv from Samsung. The off-brand monitor's text is quite a bit clearer. I'm not sure if that's just a settings issue, but I have spent quite a bit of time playing with the TV's settings and the text is consistently less clear than the monitor's.

There are flicker-free models (no PWM at all, or at least no PWM at mid and high brightness), and you can find a model with a minimum brightness low enough for comfortable programming work (unfortunately that's usually not simple, because vendors are mostly hesitant to specify it).

> If you are going to be sat in front of displays for 10,000 hours a year working/focusing

Presumably on a time machine so you can go back and repeat 52 of those days.

Or put another way: there's only ~8760 hours in a year total. Most people (~ 40 hours a week types) work around 2000 hours a year.

Tough crowd.

I dislike multi-monitor setups. Dunno if it’s the bezels or what. A single unbroken surface feels better to me.

For a VC company’s web forum, some HNers sure find making money offensive :D

I'm the opposite, I prefer multi monitor. I like that separation between the things I'm doing, e.g. code and documentation, games and chat.

Right on the spot for me. Having my tasks divided between monitors (and always knowing which monitor to look at for program x) is great. I feel like it helps my workflow a lot :-)

Same boat for me. I also have 1 of my monitors in portrait mode that I don't think I could give up by switching to a single ultra wide

Same here.

I tend to tile my windows and use the different displays for the different 'flavours' of my work.

Same here. Give me one good big monitor with multiple software desktops. (Windows 10 finally implemented them too, though half-heartedly)

No distractions from another screen, just focus on one single task. Also, no turning my head all the time.

Same here. I'm the only person at my company working off a single (27" 4k) display. I'm using Windows 10 so I can switch desktops with ctrl-win-left/right arrow. It's pretty much the first time I've been truly happy with my setup.

You shouldn't treat multiple monitors as a single display.

I've had a UM95 for several years and just bought a cheap 4K 40" tv (Samsung MU7000) to replace it as my primary. They have equivalent pixel density and the tv has nicer colors and contrast with less backlight bleed (VA panel instead of IPS) and more real estate. The only advantages the UM95 has are the displayport input and the screen being more matte.

I code on a pair of Samsung 4K 55" Curved TVs, and enjoy them immensely. What am I missing out on with your IPS advice? I don't do graphical/visual work, and you didn't mention anything specific about why a high-end monitor is better. I am also not sure what use case you're championing, nor what your employees use their expensive monitors for.

Would be curious what your use cases are, and what differences you've experienced vs 4K TVs as monitors. I moved up from 6 x 27" 1080P monitors and like this arrangement far better. No fussing with multiple video cards, fewer bezels, and I gained the equivalent of two 1080P screens into the bargain. All for $1300, 1/4 of the prior setup's cost.

>I code on a pair of Samsung 4K 55" Curved TVs

I would love to see a photo of this setup. It sounds simultaneously awesome and ridiculous!

Seconded, all I can imagine is that he's sitting like 6 feet away from 2 TVs next to each other in a setup that would be great for racing games.

I'd also like to see a photo of this setup, and I'd like to understand how you work on this.

I have a 27" 4K monitor at 150% scaling and it feels 'big enough', I was considering getting a second but then I'd have the bezel in the middle, so three would probably be better, but then I'd have to turn my head a lot to see the outer screens, so it seemed a bit pointless.

Maybe I'm an outlier, but I prefer to have applications full screen and just focus on that one thing. I also don't work with videos playing in the background as I find that very distracting, music is ok as I can zone out (and then realise for the last hour I've had headphones on with nothing playing), so I don't need to see a whole Spotify playlist, artwork and recommendations when I'm writing code.

If I'm doing web development, I'll have Vim in a terminal full screen, and when I need to test something live I'll switch to Chrome - my screen is big enough that I can have the page and debugger open side by side without it feeling squashed. I have tests and the compiler running in another tmux tab, so the terminal bell will alert me if something needs my attention there.


:) I basically divide the screens into 8 1080P windows. A "basic workday" for me is usually:

1. Chrome with email, hangouts, and other administrata
2. A second Chrome with web dev
3. A third 1080P with the web inspector for the above
4. An SSH window to the nodejs server
5. pgAdmin to postgres
6. VSCode
7. Xencenter if I'm juggling VMs around
8. Choice of: another Chrome for goofing off, or a bunch of smaller windows with git bash, filesystem windows, or other little distractions like Spotify

It's hard to tell from this photo (lol messy desk) but I sit about 80cm from the screens. There are some downsides:

I'm too close to the TVs for gaming -- they're too bright even at minimum brightness setting. One "all white" explosion cutscene and my eyes will hurt. I'm in the habit of closing my eyes when I sense impending white-out.

The "curved" monitors are only subtly curved, and I still fly them in a "Vee" configuration. I could've saved $200 by just going with 2 flat monitors. It seemed cooler at the time than it is.

I see others talking about fighting firmware and whatnot, but these MU650Ds from Costco (which I'm told are identical to MU6500s, but re-numbered to avoid price-matching...shrug?) just worked out of the box. Setting the "device" to PC HDMI and the TV took it from there. I dinked with the color settings to taste.

It's weird: my laptop is an XPS13 with the 3200x1800 screen, and when I try to do ANY splitscreen work, I find myself edging closer or squinting. I don't wear glasses, but I think I must just be a "not a HiDPI" guy. Square inches on screens are cheap and cheerful.

I haven't found it hard to focus on a "tile" to work.

It's nice to step through the node debugger by flicking VSCode fullscreen, hunting and pecking through the 100-line-high code, all on the right side while the SSH/Chrome/Chrome debugger/pgsql windows sit on the left and I can watch them whirr with each step. Works for me anyway.

$0.02 :)

Nice setup. I personally like to have three monitors, so the center one is in the middle and I don't have to turn my head.

ATM I have a fairly old 32" 1080p TV as my middle screen.

It becomes hard to find the desk space for three monitors when you get to a 40" middle screen though. I prefer having multiple monitors than a single larger monitor. I haven't decided how I'm going to upgrade yet.

Our employees sit 50cm from their 27" 1440p monitors, at 1.2M/4ft desks, and look at text for 5-6 hours a day. In this context TVs are not suitable and would never be considered for our office environment. Some are free to choose what they want and have 4K monitors; everything is Dell Ultrasharp for simplicity. There is a mix of portrait and landscape in use. I'd suggest quality 4K over cheap 4K, but we don't provide that for all our employees as it is unnecessary. I also deal with a lot of other businesses, and it has been very uncommon for me to see 4K TVs in the workplace here in the UK.

I've had IPS monitors for 15 years and maybe it's in my head at this point compared to high end TVs and TN panels, but I wouldn't use anything else nor would I subject those I expect to drive revenue for me to do so. It's a key tool that is often an afterthought.

End of the day, whatever works for you, but I have seen first hand people with huge TVs on a desk thinking they are getting some kind of gain, and then they look up and down all day long or squint to see what's on it. A lot of people have to be forced to adhere to good ergonomics and healthy use of computers.

Hardware is cheap compared to what we pay for software, SaaS, pension and perks for our staff so it's something we are good on.

/ramble over

A 4K 40” screen has similar dot pitch and is smaller than a pair of 27” 1440p screens. The main advantage is it has no bezels so it’s more flexible.

Also, many TVs are 60Hz IPS displays, but you have to do some research to find ones that display in 4:4:4 color. After that the main difference between the two is the firmware on the device, some things like VESA power saving work differently.

TVs can be cheaper because they have larger economies of scale than monitors. It doesn’t have much to do with the underlying technology.

It seems like you don’t know as much as you think you do about display technology.
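The 4:4:4 caveat above is essentially a bandwidth question; here's a rough back-of-the-envelope sketch that ignores blanking intervals and link encoding overhead (the `data_rate_gbps` helper name is mine):

```python
def data_rate_gbps(w, h, hz, bpc=8, chroma="4:4:4"):
    """Approximate uncompressed video payload rate in Gbit/s."""
    # Samples per pixel: 4:4:4 carries full chroma (3 channels);
    # 4:2:2 and 4:2:0 subsample the two chroma channels.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return w * h * hz * bpc * samples / 1e9

print(round(data_rate_gbps(3840, 2160, 60), 1))                  # ~11.9 Gbit/s
print(round(data_rate_gbps(3840, 2160, 60, chroma="4:2:0"), 1))  # ~6.0 Gbit/s
```

Roughly speaking, 4K60 at full 4:4:4 needs an HDMI 2.0-class link, while 4:2:0 fits within older HDMI 1.4-era bandwidth, which is why many TVs quietly subsample chroma at 4K60 and why text looks fuzzy until you force 4:4:4.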

I tried the Samsung 40" 4K monitor route and the worst part about it (other than being a tad too large) is dealing with the "smart tv" software. Immensely annoying.

What was annoying about it?

Cool, I'll continue to see us purchase quality, proven, monitors as we expand and require more hardware rather than start dropping large consumer TVs on peoples desks.

27" @ 1080p? 55" @ 4K ?

These all sound ridiculously low dpi - mid 80s.

My 17" pre-retina MBP does 1920x1200 - 130-something dpi and that's so old Apple don't sell parts for it any more.

How on earth are you using these or the previous ones, that everything, and particularly text, isn't just atrocious to look at?

Oh how I wish the 16:10 aspect ratio hadn't been abandoned. Every time I get an urge to upgrade my home monitor, I remember it's 1920x1200 and I'll never find another like it.

Last time I bought a laptop, all the reasonably priced ones had 1366x768 displays. I hate it! The pixels are very visible. Finally picked up a 14" 1920x1080 Chromebook for web use and I couldn't be happier, it's so much nicer looking.

> Oh how I wish the 16:10 aspect ratio hadn't been abandoned. Every time I get an urge to upgrade my home monitor, I remember it's 1920x1200 and I'll never find another like it.

A kilobuck will get you a 30" 2560x1600 Dell UP3017.

Oh I know they're still out there. But as you've noticed, the price makes them totally impractical. The lack of selection also means you're less likely to find the exact monitor to make you happy.

16:10 is still the best aspect ratio to me as well. I have a hard time upgrading my 24" Dell 1920x1200 monitor. I refuse to accept the limited vertical space of 16:9. Sure, it's just one step of aspect ratio, but to me it makes a huge difference.

I'd guess from the fact that he's using two 55" displays he's sitting a good distance away from them.

Distance from the display is as important as DPI when discussing pixel clarity.

It does say "I code".. how far away do you have to be for text to appear crisp at 80dpi? And how big do you need the font size to make it readable?
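
Using the common 1-arcminute visual acuity rule of thumb (an assumption; real-world legibility also depends on font rendering and eyesight), you can estimate the distance beyond which individual pixels stop being resolvable:

```python
import math

def crisp_distance_inches(dpi, acuity_arcmin=1.0):
    """Distance at which one pixel subtends `acuity_arcmin` of visual
    angle (~the 20/20 limit), i.e. where pixels blend together."""
    pixel_inches = 1.0 / dpi
    return pixel_inches / math.tan(math.radians(acuity_arcmin / 60.0))

print(round(crisp_distance_inches(80)))   # ~43" for an 80 dpi screen
print(round(crisp_distance_inches(110)))  # ~31" for a 40" 4K at ~110 dpi
```

By this estimate an 80 dpi screen only looks "crisp" from well over a meter away, which supports the skepticism here.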

27 hours per day is a lot of screen time.

lol yes it is, I meant 2000 - no idea why I wrote 10,000

I've told you a million times not to exaggerate!

Don't buy a cheap TV for programming if you're not constrained by money. You can get 40" monitors for a bit more, and you won't need to fight the firmware and also you can get a better panel. Also, PWM is worse if you're sitting close. Uniformity of backlight (bleed) also matters.

Currently the display you want to get at around 40" is the Philips BDM4037UW, which has no PWM, relatively low input lag, and low backlight bleed.

There are some cheap 43" 4K monitors (Philips and LG), but they're lower quality e.g. LGs have PWM.

Just as an alternative - I have one of these[0] iiyama panels after hearing bad things about the Phillips.

Have to say I'm very happy with it.

No ghosting, really excellent colour (once calibrated), minimal backlight bleed (it's not an IPS panel so has less bleed anyway) and 4k at 60hz.

Been using it for just over a year now. The only issue is that pure black on pure white can streak a little when scrolling: very large, thick pure-black text on a pure-white background will leave a very short trail. Normal text doesn't do this; it has to be large areas of black on white. It hasn't actually annoyed me much, since in my use it happened only very rarely on some websites, but it is a limitation.

Not sure it's best for gaming (though I've played Mass Effect and Rocket League with no issues) but for productivity work it's been great.

Pretty cheap too!

[0] https://www.amazon.co.uk/iiyama-X4071UHSU-B1-ProLite-MVA-Mon...

Reviews on Amazon suggest the Philips BDM4037UW isn't a good choice due to significant ghosting issues:


Big caveat emptor on this one.

I have it on my desk right now and ghosting is really, really bad. Static contrast ratio is best in class, second only to oled monitors.

If you are looking for large office/work monitor only or you can borrow one with no strings attached, go for it. Otherwise stay away.

Yep, I specifically meant it for programming / office work.

I jumped on the Philips BDM40 train immediately at launch for £600 a few years back, and the ghosting is an absolute nightmare.

A huge boon for productivity having come from multiple 27" 1440p monitors before, and the BDM40 is something I might recommend for somebody looking to get in and get in cheap.

I'll definitely be splurging for something better next time however.

LG has supposedly increased the PWM frequency of the 43UD79-B from 120 Hz to 480 Hz through a firmware update, which should eliminate the flickering (http://minkara.carview.co.jp/userid/2475954/blog/40499741/).

There’s also the latency between the input coming over HDMI and actual rendering on screen. Some of the cheaper TVs are in the 70-150ms range, and you will notice that when your mouse keeps moving after your hand stops.

I second the author's recommendation. I work with iMacs (5K) all day and I can tell you there were a number of times I simply wished I had gotten an iMac Pro with a bigger screen instead. However, one thing to watch out for when getting these cheap TVs is the color reproduction. Most of them will enhance the colors to make the picture appear vivid and pleasing to the eyes. So, if you're thinking of doing stuff like logo design or Photoshop, you're going to find it very hard.

Good point. I mainly do programming so color fidelity wasn’t a big concern.

I've been using a Philips 4K for a couple years now: previously the 40" and now the 43" (the 40" died in a terrible VR accident).


It's 4:4:4 chroma at 60Hz, so it's great for gaming too, and it has a lower response time than you'll get with TVs.

Highly recommended and it's hard to use anything smaller now.

Without using display scaling I wouldn't go lower than 40", and I think closer to 50" may even be ideal in terms of keeping the scale comparable with other peripheral monitors. A 48" curved display would probably be perfect.

Many, too many, BDM4065UC monitors suffer from "flashing" black and then, more than occasionally, coming back at 30Hz instead of 60Hz (DP 1.2 connection). Fixing that requires one to three monitor power cycles, and when the display is in a "bad mood" it can happen 3-4 times per hour.

Source: I own and use a BDM4065UC which I'm about to return, and I've been able to test another (previously owned) one whose owner had a totally different setup from mine (my workstation is a 2015 Retina MacBook Pro; his was a custom-built desktop with an Nvidia card). Many other people on various forums complain about this, and Philips support seems to know about the issue perfectly well.

I have been interested in the BDM4350UC and/or BDM4037UW as well, but both have been reported to suffer from extreme ghosting/persistence; I'm now looking to buy an LG 43UD79-B.

What I agree on is: a large monitor makes a difference. Not in the workflow per se, but in how much eye strain, headache, back pain (from slouching to get closer to the screen) and so on you get at the end of the day - which is almost nonexistent with such large displays. Even if it's $700 for the LG, that's money well spent.

I've experienced something similar with the first BDM40: it drops DP1.2 support and needs the OSD opened to revert to DP1.1 and then back to DP1.2 to resolve it, in addition to extreme ghosting.

Please update if you do end up investing in a new replacement, the LG especially.

Huh, I guess I'm lucky then. The only problem with my BDM4065UC is that at the upper left corner, the leftmost column of pixels is obscured by something (as if the panel was bent slightly backwards at the edge).

Possibly it's a problem from a batch of them, but the batch seems rather large. I don't like that myself, because the BDM4065UC would otherwise be a great monitor for programming work at that price point.

That one looks nice. Good port selection. Thanks for the link!

Terrible VR accident... punched it while playing Superhot? ;)

GORN. but yes.

I actually prefer having dual monitors over one big monitor. I dedicate certain tasks to certain monitors, and that divide helps me quickly find what I need.

It's also nice to have the physical separation for some reason I can't really describe. My brother has a similar setup to the one in the article, and I just never really liked using it.

Just personal preference I'm sure.

(I also fear my eyesight might be too bad for a big monitor in 4K, with text being rather small)

For my setup, I prefer having (bright) IPS monitors. Not too large, I'm fine with my 24" 1080p monitors.

Why are 4K TVs cheaper than 4K monitors (same size) ?

Something to do with marketing and how the two businesses differ (TVs B2C, monitors B2B)?

Or are there actual major differencies in the tech? Like what panel is used or something like that?

Economies of scale - many more TVs are sold than computer monitors.

Add to that the different features - higher DPI, refresh rate, contrast ratios, etc. Cutting edge features such as NVidia G-Sync and the similar feature from AMD don't come cheap.


Not sure if this is still true, but historically, TVs have had dramatically lower image quality than comparable monitors.

Seems very similar to "4K is for programmers" which was written when 4k monitors/TVs had just been released.


I, personally, own one of those TVs (Seiki 39") and both love and hate it. I've always tended to buy TVs for day-to-day computing because you generally get more cutting-edge features for better prices (my first flat screen was a 27" Westinghouse TV; first 4K, the Seiki; etc.).

Economies of scale factor in heavily - more TVs will be sold than computer monitors, leading to better prices. Which is why I could get a Seiki 4K for $400 when similarly sized 4K "desktop" monitors were going for $1000+.

The most annoying thing has been "turn off monitor, computer considers it unplugged". So you go from 2 monitors to 1 - and all the icons rearrange. Computer goes to sleep? Monitor turns off... disconnects... Annoying.

That's the price of saving $600+ on 4K (at the time).

I've recently upgraded to a smaller "computer" monitor that's better suited for gaming and desktop use... and the Seiki is now my computer room Plex Player.

Funny, I was just pondering this yesterday: whether to "upgrade" to the LG, Phillips or Dell 40+" IPS monitors, or whether to stay with my Samsung 40" TVs (UN40JU6400, 2015 models--later models have annoying misfeatures), which I have in all 3 home offices.

These Samsung TVs, once configured properly (Source > HDMI 1 > Tools, select device type as PC, turn off HDMI UHD Color (which blurs), turn on Dynamic Color as well), are excellent, and at the 28" viewing distance I use, are absolutely crisp (no pixels visible) in Retina mode under macOS.

And I'm using it over HDMI 30Hz with absolutely no flicker, and no (discernible) mouse lag.

The 2015 models don't lose their mind when the Mac goes to sleep, unlike the later models, which require re-configuring each time (the TV "forgets" the HDMI port).

Can't recommend them more, especially since you can get these older models refurbished for $400.

FWIW I have the 6290 (2016 model) and haven't run into any annoying misfeatures. Crisp and smooth at 60Hz, UHD color mode. Was stuck at 30Hz and perceptibly laggy until I forced it to 4:4:4 in the nVidia control panel.

Everything online tells me the display lag is unacceptable, but I haven't noticed it personally even when FPS gaming. Maybe it's the conditioning from gaming on an underpowered PC growing up :)

The UHD, HDR and Movie Modes for most of these tv/monitors seem to modify the picture a little too much. These are usually the first to get disabled when I'm using them for desktop use.

For a compromise, there are now 32" 4k monitors meant for computer use for around $300 if you look for the black friday and cyber monday deals: https://www.reddit.com/r/buildapcsales/comments/7ehzic/monit...

40" is roughly 1.5625 times in area of a than 32" though...

That's a VA panel. Is that really acceptable?

What's wrong with VA panels? Beats a TN panel any day.

> Beats a TN panel any day.

Yes, TN is even worse. One must have an IPS panel! :-)

(I hear, though, that TN, and maybe VA, might be preferred by some for high-performance gaming situations, due to a lower latency.)

The article states:

"You need to elevate this beast vertically otherwise you’ll be craning your neck down to look at it."

A 40 inch TV is at least 50cm high. You should not be looking up - it's very bad for your neck. Looking down is no problem. The upper edge of the monitor should be at eye level.

I read that this is because we evolved as hunter gatherers, walking and scanning the ground ahead for possible food. Our optimum standing posture is looking straight ahead or slightly down

I use a 40". If the middle of my monitor is just about eye level, I don't find myself moving my neck vertically very much -- eye movement suffices for vertical travel.

If you have the opportunity, though, monitor arms are one of the best ergonomic investments you can make. Being able to reposition the monitor with a touch for whatever position or attitude you want helps reduce fatigue enormously.

I was briefly using 43" screen on desktop, but switched to smaller. The screen was too high for me. Parts of it went mostly unused, since it was inconvenient to "look up".

Now I'm using a 32" 4K and this is the sweet spot for me. Not too big, but large enough to use without scaling (I'm running Windows).

I wonder how that TV performs for "monitor" tasks like automatic input switching.

I have a Sony x720e for my macbook for work and my PC for light gaming. It has graphics and gaming modes that have low input lags, but there is a really annoying quirk where when you turn the tv on it has normal (high) input lag until you switch to another mode then back to graphics mode. This means every time you turn on your PC (or plug in your laptop) you have to:

1. Turn on the TV

2. If necessary, switch inputs

3. Options -> Scene Select -> up arrow, down arrow

1 and 2 are annoying since monitors do that automatically but the TV doesn't. 3 is really annoying but hopefully is fixed in a firmware update...

Other than that the screen looks great.

I'm hoping for time to hack on an arduino with an IR LED to automate those steps, haha.

Offtopic, but this is very interesting. I have this sort of problem as well, but with color rendering. Not on a Sony, but on a Philips (50PUT6400). Makes me think that they might use a common platform for their entry-level TVs.

It doesn’t do automatic input switching. That’s a definite drawback. For the price though, I’m not complaining.

I've been using a 55-inch 4K TV [1] for the past year. It's a matter of preference, but I prefer more usable space over DPI. I have found that at this resolution pixels are small enough that they do not bother me. It takes some tweaking, but if set up right, text quality is indistinguishable from a "real" monitor. It's the rough equivalent of four 27-inch 1080p monitors arranged in a square.

To get an idea of the size, here is a picture of me sitting next to it. [2]

[1] https://www.amazon.com/gp/product/B01EV2094Y

[2] https://imgur.com/a/9uBCG

I'm curious about the mentions of Hackintosh. As I understand it, running macOS on non-Apple hardware (the essence of a Hackintosh) is a violation of the EULA.

I'm not here to argue the pros and cons, and I am not a lawyer, but I'm intrigued about the author saying -- and I paraphrase -- "I'm doing this on a computer that's breaking the licence agreement". That's an admission. I'm not judging, it's an observation. Hackintosh admissions on a personal site where they can be traced back to a person seem a bit risky to me.

I have no knowledge of Apple legal circling. Perhaps I'm just paranoid.

EULAs in general are somewhat of a farce. Most companies use them as scare tactics but wouldn't want to risk them being scrutinized in court, especially when the only damages they could point to would be lost revenue in small-claims territory.

My guess is that Apple's lawyers are busy enough without worrying about individuals violating their EULA without harming their business (e.g. by selling the Hackintoshes). As long as you're not Psystar I think you'll be fine.

Orz mode on. :) Items in * are poorly translated and not to be taken literally.

(edit, lol, fail. Didn't realize asterisks cause italics. Sigh)

From my experience with Hackintosh, you're downloading a canned VMware image of a pirated OS. I'm not exactly sure when they would say you agreed to anything, since you only interact with VMware Player (or VBox) and the site hosting the VMware images, plus a few of the BIOS patches that are needed to make it all work. You never interact with Apple in any way.

The guy who keeps packaging it is running a risk of legal action when Apple gets their fill of PCs running MacOS, though. In the meantime, though, it allowed me to shelve my Mac Mini for iOS development and keep everything in one PC, albeit one virtualized environment.

Being able to copy/paste between PC and Mac is fantastic for workflow.

I am a big fan.

I probably get burned by something deep in the Apple Developer Program EULA where I agreed not to do unintended things with Apple software. :)

I have to think that most people who build Hackintoshes are already Mac customers. I'd guess a typical use case is somebody that already owns a MacBook Pro and wants a decent companion desktop which Apple doesn't really sell at this point.

It's still a violation of the EULA, but I think most violations would be substantially different from, for example, using a cracked version of Windows, in that most Hackintoshers have probably already given Apple $2000 for another computer.

I'm much happier with 4 small screens than a big one. 4 screens means I can do 4 x full screens rather than having to arrange my windows in a big sandbox. I also prefer having them in a long row, rather than 2x2, I put the code on one side and emails on the other, so that I don't see notifications when I code...

With a single 4K you can put three 1280-wide windows across, which IMHO would be good for web, code, email. You also have the option of stretching vertically - sometimes portrait orientation is better. If you don't like arranging windows in a sandbox, you may want to either have a login script to open your common stuff and place it, or try a tiling window manager.

In these cases, a tiling window manager would help.

I think best is a relative term.

For me, as long as my code is readable, it doesn't particularly matter. I'm not doing graphics 95% of the time and when I do graphics, it only really becomes necessary to have a high contrast, high pixel density screen. I run dual Samsung 2770HD TVs at home as my monitors, they sit 2ft from my face and they're perfectly adequate for me and the price was right. I guess if you're anal about picture perfection, then you're going to have a very different opinion of what is "best" and mine are certainly a loooong way from the "best."

When I watch a TV show or movie though, I don't get too hung up on the picture quality to just watch the movie and enjoy the story line. Of course, I grew up with a black and white TV in my room until I inherited the colour one that we had to tune with a dial and you got some snow interfering with the picture if the aerial wasn't just right. Oddly, it didn't spoil my enjoyment of the show or movie that I was watching even slightly. The story line is way more important than the picture quality.

For coding, the story line is the same... it lives in my head. The screen is really just a means to share that. I get no less enjoyment coding on a screen with a slightly lower pixel density than I do coding on the latest retina. Even after I've sat in front of one that's truly high quality, it doesn't take long to adjust back to a lower quality screen.

Since people seem to be lukewarm to the TV suggested by the OP, I did an Amazon search for 4K TVs under $600:


Anyone have any experience with the top TVs on this list? The top entry from TCL seems to be the leader, by far, in terms of average customer review, but the price point seems almost too good to be true: $379 for the 55" version, though that may have been a sale price, and the TV is currently out of stock.


I have the 55" TCL and it's great! Reminds me of the early Vizio days where they had to be priced closer to production cost in order to compete with bigger players. My one con is that the HDR doesn't seem perfect. The blacks don't seem "true black" and based on the comparison lower on the page I wonder if this is because it's not "Dolby Vision HDR"

I have the 43" TCL 4K. I highly recommend turning off HDR when using it as a monitor. Up close it creates this ghosting effect where black text on a gray background gets a light halo kind of thing, ugh, it looked horrible. Once you turn it off it is much nicer to look at.

Interesting that you mentioned HDR because I had assumed it wasn't a feature (seems like most TVs that have it mention it in the title). Is it now a standard feature for HD TV?

(I haven't bought a TV since... ever, now that I think about it)

This site does TV reviews with a category for monitors; this is their current best pick for 43".


definitely not as cheap, but still way cheaper than a comparable monitor.

And that tv is actually on sale for a pretty good price at bestbuy right now (and I’m sure everywhere else).

Samsung MU7000 40" is also excellent.

yep! it's the standard now for new dev boxes where i work. the samsung mu6300 40" is the best option from our experience. how to config:


stay away from the LG TVs. they don't have true 4k RGB.

I had a 28/27" 4k monitor, but like the author I found it far too small (the pixel size) to use comfortably. Unfortunately I was working with legacy applications that didn't support scaling and couldn't adjust the DPI. I have no doubt it would be much nicer at 125/150%.

One thing that is certain is that people have different preferences. Certainly some people would enjoy the tiny pixels of the 27" 4k monitor at 100%.

My current setup is 2x 24" 1920x1200 monitors. I have considered 3x 24", 1x ultrawide or 2x ultrawide, but I'm not convinced I'd get the same comfort. I think that 3x 24" or 2x ultrawides would require too much head rotation to be comfortable.

I prefer the vertical break in the screen since it makes snapping windows easier. I'm starting to think that I might prefer a bit more real estate, as I often collapse the vertical file menu in my editor for a little extra room.

Went from 3x 1920x1080 monitors to a 43" 4K IPS display a little over a year ago, and it has been great once I got used to it. Tiling 6 apps using Divvy works nicely.

Lately I have found myself wishing there was a wider format 5760 x 2160 screen of about 37-43" for 6 full HD screens-worth. But can't have everything I guess.

Agreed -- I find that a single large high-res display works better for my purposes. 32" at 4K is exactly right for three side-by-side 1280xY windows, which means that browsers will display full desktop-sized sites. Text isn't too small with the 32", so no scaling required. Divvy makes window management much faster, with little mouse interaction required.

I have two 27" 2560x1440 displays at work, but I find it counterproductive to look side-to-side constantly to see what's on the other screen.

I use a 3840x1600 monitor, not quite there resolution-wise, but in exchange you don't need to use DPI scaling.

Interestingly enough, over the summer at Microsoft, they were actually switching over to 4K TVs. It wasn't that bad, tbh. Definitely different. The Surface Books couldn't handle it very well (they didn't have GPUs), but the workstations did great.

I just see an article pushing an affiliate link for a pretty bad TV, let alone the best.

3 affiliate links actually. It’s probably not the absolute best, but nuance makes for tedious writing. The general pattern of a 40+ inch 4K display is definitely worth a shot. If you find a better one for price/performance, I’d love to hear.

I've been using a 100cm/39" Seiki 4K for about 3 years and wish for a physically smaller 4K screen. Using eyeglasses with progressive lenses, I have to move my head too much with this screen, and after long periods of use I find that I have neck strain.

I love the 4K pixels but 100cm/39" is simply too much surface area for a monitor used for primarily small text. My next monitor will probably be a 69cm/27" 4K and hopefully OLED if they fix the burn-in issues that make OLED unsuitable for computer use.

Interesting. I used the 39" Seiki for a couple of years and moved up to a 49" for my next monitor, but mostly just so I could push it further back on my desk.

I've been using a 44" 4K TV for around two years as my main monitor and it's been great. The only very minor pain point is having to manually turn the TV off and on via the remote in the right order so that Windows 10 doesn't reset my window layout. The Vizio TV I use has 1 of its 5 HDMI ports set aside to not route through its image processor, which means text stays very clear.

My biggest concern is that TVs generally don't have a good refresh-rate relative to monitors. Have you noticed any lag/annoyances in that department? Do you have a link to the TV you use?

I just got myself a TCL 43S405 43" as a monitor. It supports Chroma 4:4:4 and I'm really happy with it so far. Target has it on sale for 299 right now: https://www.target.com/p/tcl-43-4k-hdr-120hz-cmi-roku-smart-...

I've been thinking of going to 4k for a while, just been waiting for prices to drop :) I'm currently using 2 24" dell 1080p monitors, and I have to admit the low DPI is getting to me and I want more screen real estate in general -- I can never have enough code listings open at one time :).

OTOH, upgrading my home setup (which I use for side projects, etc.) will only make my work setup seem worse - they gave me 2 22" 1680x1050 monitors. I've only been here for 2 weeks and haven't been given any actual work to do yet, so I don't know how annoying that low resolution is going to be (but I'm sure it will be). Not nearly as annoying as the loudly buzzing/rattling vent and the headache-inducing blazing fluorescent tubes directly over my head, I'm sure (though, combined with the gray carpet, the gray walls, the gray ceiling, and the lack of any and all natural light, I must say, they've really nailed the office environment from "Joe vs the Volcano")...

I have been using 3 inexpensive 4K 40” TVs for my home computer for a year now. I love the insane amount of real estate and the accessible price (<$1K is great).

I’d love it if someone made extremely large format displays (100”+ at 3:1 aspect ratio) for computer use. Not sure how large the market would be, but one giant display is nicer than 3 separate ones.

I’m all for crisp text however I can get it and avoiding any eye strain and squinting. I’ve got an AW3418DW 34” ultrawide to avoid needing two monitors with a bezel in the middle, and I also use a 24” 4K in portrait mode on the side for reference material (IPS, because viewing angle matters when you go into portrait mode). Ergonomics and good habits matter a lot more beyond this point, though.

My rule has historically been to spend more on peripherals like chairs, desks, etc. than on the computer, because my body is a lot more expensive than my computer and accommodating my body’s needs comes first. With that said, I use a $60 medical stool per a spine surgeon’s recommendation, and I have an $80 IKEA desk supporting $6000 of hardware, because I don’t get much more utility beyond a certain point, and because I move so frequently that a heavy desk would be a bigger problem for me than a help.

I have been using a 4K 40" TV as a monitor for maybe two years now, it's great! Windows tiling managers work really well on a large monitor. I keep thinking a curved TV would be better suited though.

Also, don't forget to turn on the 'game mode' if the TV has one, which significantly reduces the input lag.

Here is screenshot of a layout I use https://camo.githubusercontent.com/79de6fc01f5370345d4b9c4c9...

The above uses XMonad and a custom layout available here https://github.com/chrissound/XMonadLayouts if anyone is interested.

I actually did this for a couple of years but recently switched back to a dual setup. The dual setup is just better for web development IMO. One screen for chrome dev tools or the website and the other screen for sublime.

Also the 4K TV I was using only had a 30hz refresh rate, really bad for eye fatigue.

In the future, people will be able to use VR for programming work.


This is a modern project which is attempting to pull this off in Linux.

It isn’t clear that headset VR will beat eventual direct brain connections or the singularity where programming is no longer a human activity.

Quick question for those that use a 40" monitor: how far away do you sit from it?

Let me measure it... 77cm (30.315") from the center. 42" TV.

About 20" face to panel.

That only goes if you're willing to fight the firmware all the time. Bonus, crap colours if it's a cheap TV. Doesn't matter for programming, but if you occasionally do anything else besides shuffling text around...

I thought this would be the case, but I have been really pleasantly surprised by the quality. I watched some 4k60 videos on YouTube on this thing and thought they looked great. I have not done any gaming with it though. A photographer or graphic artist could probably find fault, but for me, it is very satisfactory.

So I said, until I had an older cheap monitor and an IPS monitor side by side ;)

Tech specs are an easy way to compare stuff, but they don't always say everything.

Why only 40 inch - get whatever fits on your desk. A 4K monitor will not be painfully pixelated up close. If you coded through the past decade, you will likely see an improvement over some of your recent monitors. At the end of the day, it's about getting the work done, not a DPI competition. Although you may use only the center for an IDE, the sides are perfect for e-mail, calendar, IM, a device emulator, Stack Overflow, or other source files for reference. A quick side glance is way better than moving windows back and forth. I wish I had attached side panels at an angle for more space.

There is some really good discussion in this thread about the ergonomics of monitors. As a sysadmin, one thing I've learned is that ergonomics of workspaces, at home and at the office, are far too often overlooked or compromised on, so please, for your own future health, take a moment to review how your workspaces affect you physically. That means monitor setup, keyboard and mouse position, chair height and back support, etc.

For those who have been neglecting these things, a good stretching routine and daily posture work is probably the best starting point.

Old guy here who disagrees. My neck/eyes get tired moving over that much real estate. For me, workspaces work better. I find it helps to use the same number and place stuff the same way. I use 5.

Been doing that for years and I can find stuff fast.

And I find my 13 inch laptop too large :) I loved the 11 inch Macbook but the res was too low; 1920x1200 as I have now is nice, but 11 or 10 inch screen would be better.

Don't sit too close to the TV, you'll hurt your eyes.

As long as you are at least 1cm away, your eyes should be fine.

As long as your eyelashes don't touch the display, it's ok. Some people have cut their long eyelashes to be able to get a larger field of view in Virtual Reality...

I've tried so many monitor setups for programming and I always ran out of screen space. Went from 1 24" 1080p to 3, and finally landed on 6, and I now feel like I have enough room. A side benefit of having 6 monitors is that it looks damn cool. Reference: https://i.imgur.com/rjO3bLw.jpg

What are you putting on 6 (!?) screens when you code??

Does look cool though.

Looks pretty sweet. How useful is it though?

The stands on a 4K TV suck compared to a monitor. There is no height or tilt adjustment. They also generally have higher input lag.

I've been using a 4k 49" TV as a monitor for a few weeks now. It was just a lazy stand in for my defective 27" monitor at first, but it's pretty convenient. Lower DPI is the only thing that isn't too great, but now that I have this setup, I think I was overrating DPI. I'm thinking I'll just stick with this.

I like that with a monitor that big I can push it to the very back of my desk, freeing up a good bit of desk space.

Me too! I feel like it's a really versatile setup, and definitely de-cluttered my desk at home.

I actually prefer sharp pixels when programming. Seeing the pixels doesn't bother me, but blurred pixels do.

TVs have very different panel matrices compared to monitors (and mobile displays too) in physical pixel layout. TV matrices tend to have a much larger "border width" between subpixels, with the colored cells more prominently spread out, which pushes the closest optimal viewing distance further away. I personally find text on TVs very tiring for the eyes.

If you're ok with switching to linux, or are already using it, a tiling wm is easily a better investment (5 mins to learn, orders of magnitude more space and time efficient) than buying a big monitor with low dpi etc.

A bad 720p laptop with a tiling wm has more available screen real estate than a 4k monster regular setup.

... and tiling WMs make multiple monitor setups more usable as well. Xmonad is probably the best as it allows any virtual workspace to end up on any monitor more or less instantaneously. You can effortlessly juggle a whole whack of virtual workspaces to three monitors. You quickly end up with any possible combination of things you could possibly want in a virtual workspace and you don't have to touch the windows themselves. You can do everything by flipping between the workspaces with everything exactly as you want it and no pointlessly wasted screen space for stuff like window decorations.

It has gotten to the point where I feel I am watching something quaint and retro when I watch someone manually moving windows around on a screen.

... how?

Simple: most other window management paradigms are terrible. The author agrees:

> On a laptop screen or normal-sized desktop monitor, I end up putting these different windows on different macOS spaces, but then half the time I swipe the wrong direction to switch spaces and end up going to the wrong space. By the time I get to the correct space my short-term memory is blown and I’ve forgotten what I was doing.

I've never had this issue on a tiler. You get 10 (maybe 20) screens accessible with a single key combo. You can thoughtlessly toggle back and forth between any two as fast as you can blink. "Swiping" between a few desktops sounds terrible by comparison.

I do embedded development, PCB CAD, some graphic design, and manage a fleet of servers, all from a single 10" laptop screen thanks to tiling.

Unless you were asking how a 720p screen can have more real estate than a 4K screen? 4K is 3840x2160 or 8.3M pixels. 1280x720 is 0.92M pixels. So with 10 virtual desktops the 720p screen gives you 9.2M pixels. Slightly more than a 4K screen.
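The arithmetic holds up; a quick sketch to check it (assuming 10 virtual desktops, as above):

```python
# Compare total addressable pixels: one 4K screen vs. a 720p screen
# driven by a tiling WM with 10 virtual desktops.
uhd = 3840 * 2160      # one 4K desktop: 8,294,400 pixels
hd = 1280 * 720        # one 720p desktop: 921,600 pixels
desktops = 10          # tiling WMs commonly map workspaces 1-10 to number keys

print(f"4K:        {uhd:>10,} pixels")
print(f"720p x {desktops}: {hd * desktops:>10,} pixels")
print(f"720p setup wins by {hd * desktops - uhd:,} pixels")
```

Of course, only one 720p desktop is visible at a time, which is exactly the point being made: the pixels are available, not simultaneous.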

You also get the benefit of not having to move your eyes or head from one corner of the screen to the other. What you are working on is always front and center.

Does anyone use a projector for this? I have occasionally used a MacBook monitor and a 10' projector + screen (made of Formica countertop) as a dual-screen setup, but I have always wanted to rig up some kind of multiple mini-projector office setup with dual or triple screens.

You could also get the Pimax 8k VR headset and position your virtual screens.

Thanks. The thing I like about the projector is that I notice much less eye strain.

I have a 50" Sharp (well, made by Hisense) TV I got for C$469.

99.8% sRGB coverage, 81.3% DCI-P3, and very little input lag. It's also a Smart TV. So far, I'm happy with it. I run 100% scaling and don't see any pixels when I sit far enough away, so I do enjoy it.
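For context, pixel density at 4K drops off quickly with panel size. A rough sketch, assuming flat 16:9 panels (the sizes are just ones mentioned in this thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# 4K (3840x2160) at common sizes discussed here
for size in (27, 40, 50, 55):
    print(f'{size}" 4K: {ppi(3840, 2160, size):.0f} PPI')
```

A 50" 4K panel works out to roughly 88 PPI, which is why the pixels disappear only once you sit far enough back.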

Televisions aren't optimized for interactive use. They often have a very high latency path from input to display, often over 10x that of the same panel used in a monitor. Depending on the application, this might be a showstopper.

I tried this and 40" was just too large. I suppose it could work if you get yourself an extra-wide desk, but with the regular IKEA desk and height-adjustable Multitable I used, it was simply too close to use comfortably.

I'm holding out for my Pimax 8k X and infinite size 4k monitor.

Monitor legs on each side drive me crazy, because that's where my mouse wants to go. There's no mouse in this picture.

There’s an Apple trackpad in the middle between the split keyboard.

Thanks, I thought that was a paper notepad.

40'' is too much. It just makes it painful to push your mouse and windows around.

Get Divvy and never waste your time moving and resizing windows.

Hmm. I’ve been using WindowSpace for years for snapping and centering, but it doesn’t support customization beyond sizing sides and corners. I will check out divvy.

I use a gaming mouse, it is very quick and precise. I can move 1 pixel if I need or move to the other side of the huge screen with a single movement.

I used a 27" monitor and even that felt large to me; how about 40"?

Anyone spot any UK cyber monday deals with this in mind?

Anyone know what that keyboard is? I need that.

Meanwhile most users are using a mobile device ...

To each his own, I guess. In my case, I find advantages to both so much that I routinely switch between them.

My main workstation is connected to a 55" Vizio HDTV (a model that supports Chroma 4:4:4 and represents text of various colors very well) that I picked up last year during a post-Thanksgiving sale. While I love it, I do the lion's share of my development using a 1080p display connected to a laptop. Is the 4K screen superior? Absolutely -- in almost every way -- except for portability.

A few years ago I ditched my multiple 1080p monitors for a single 4K display (which had a total resolution about equal to my multi-monitor arrangement). The transition was tricky at first, but coupled with a Visual Studio extension that aided in navigation/visual identification of important code points (something I wrote for myself), I found that later downgrading to the single 1080p laptop display took far less time to get used to than I expected.

The one drawback turned out to be a positive thing in the long run: by not being able to see so much code on-screen at all times, it forced me to write code more carefully. Where I could have gotten away with not splitting a class into smaller pieces before, because I could see the whole thing on-screen, I couldn't do that any longer and remain productive. Sticking with small methods that "do one thing" and small classes that have a single purpose is a best practice, and one I was pretty religious about before; however, there are times when it's necessary to "get the code out the door" and corners get cut, mostly out of a false sense that writing it dirty will be faster[0].

There's one place where the 4K screen is vastly superior, though -- debugging. You can get your code, the locals/watch and the application all visible on one screen.

This, too, turned out to be a double-edged sword. It was easy to skip seemingly unimportant unit tests when I knew my debugging environment was pristine. But the reality (for me at least) is that the moment I have to hop into the debugger and check locals/watch to see what's going on, I can expect to waste an hour fixing a problem 80% of the time.

Here's the thing, though: writing tests to cover my code rarely takes more than an hour or two, and the simple act of doing so causes me to rethink the problem from a testing standpoint. I'll often find the bug before I've ever run the test to validate that it passes -- and about 1/3 of the time, that bug was subtle. Less often, but still frequently, I'll discover during the act of writing the test that the code could be refactored in a more logical manner, which results in better code. These facts mean I write tests regardless of having a great debugging environment, and I'm obsessive about it -- covering code that seems obvious[1] -- so I end up rarely needing the benefits that a 4K screen offers.

Being handicapped by the smaller screen on my laptop does come with advantages beyond just those introduced by the constrained environment. I'm not tethered to a room or location to code. Sometimes walking away or changing my physical environment ends up allowing me to avoid an actual "break" -- just moving from the office to the kitchen (when I worked at home) or from my desk to the office-kitchen couch (now that I'm an office dweller) causes enough of a shift to get me thinking correctly again.

All of that aside, one area where the 4K screen shines is anything to do with multimedia, 3D and image editing. Being able to do detail work, zoomed in, on a screen that accurately reproduces color -- and at a fraction of the cost of 4K monitors -- is simply fantastic. I also find the brightness of the 4K TV I purchased to be excellent -- the IPS monitors that I own tend to default to a brightness setting that is almost excessive (yes, adjustable, I know). I purchased a TV that was evenly backlit, and once I got everything configured properly, I don't mess with it. The size and amount of light it outputs is pleasant; it's large enough that I can use it with the lights turned off in my office (if I have a headache) and it keeps the room reasonably lit while not blasting so much light out of the display as to be obnoxious. Despite it not being IPS, being large, and me sitting only a couple of feet from it, I didn't notice any issues with edges looking dimmer due to the viewing angle -- though I mostly ignore the edges because I rarely have anything maximized on that screen. It's just not necessary when you have all of that real estate -- you tend to simply move less important things off to an edge and put what you're working on in the middle in a smallish window.

[0] And it sometimes is in the short-run, but if that code has any importance to the program, you pay for it.

[1] I'm not "Test Driven" -- I write tests "second" because I started writing code quite a bit before "Unit Testing" was a common practice and I find the act of doing "Tests First" to feel backward. Writing tests afterward seems to have the same benefits as doing it the other way around, though it sometimes drives code changes. I've been doing it this way for too long to switch and be as productive as I am. But I cover everything -- I once wrote tests to cover a configuration lookup class that was basically a bunch of retrievals from a dictionary. A coworker chided me for it until I informed him that I found four bugs (and the best part is that said coworker was the one who wrote the buggy code). Because of the way the configuration routine worked -- cascading from a set of values in the database with fallbacks to values in a file and then finally constant values, all of which would always be set correctly from top to bottom in our development environment, the issues I unearthed would have easily gone unnoticed until a customer discovered them.
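The cascading lookup described in [1] is easy to sketch. Everything below is hypothetical (the names, layers, and values are illustrative, not the original code), but it shows why tests that exercise each fallback layer catch bugs that a fully populated top layer masks:

```python
# Hypothetical sketch of a cascading config lookup: database values
# override file values, which override built-in constants. Bugs hide
# here because a fully populated database layer (typical in dev
# environments) means the fallback paths never run.
DEFAULTS = {"timeout": 30, "retries": 3}

def lookup(key, db_values, file_values):
    for layer in (db_values, file_values, DEFAULTS):
        if key in layer:
            return layer[key]
    raise KeyError(key)

# Tests must exercise each layer, not just the happy path:
assert lookup("timeout", {"timeout": 10}, {}) == 10   # database wins
assert lookup("timeout", {}, {"timeout": 20}) == 20   # file fallback
assert lookup("timeout", {}, {}) == 30                # constant fallback
```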

I was just thinking about this, since my girlfriend bought a 50" 4k TV for under $250 on black friday. That's about the same size and resolution as my current array of 27" monitors, without having to use four outlets and every available video-out on my desktop.
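That comparison holds up: a 2x2 grid of 1080p panels has exactly the pixel grid of one 4K screen, and four 27" 16:9 panels tiled 2x2 come out to about a 54" diagonal (the 2x2 arrangement is an assumption, and bezels are ignored):

```python
import math

# A 2x2 grid of 1080p monitors matches one 4K panel pixel-for-pixel.
grid_w, grid_h = 2 * 1920, 2 * 1080
assert (grid_w, grid_h) == (3840, 2160)

# Diagonal of a 2x2 grid of 27" 16:9 panels, ignoring bezels.
unit = 27 / math.hypot(16, 9)        # inches per 16:9 "unit"
panel_w, panel_h = 16 * unit, 9 * unit
diag = math.hypot(2 * panel_w, 2 * panel_h)
print(f"2x2 grid diagonal: {diag:.0f} inches")  # prints 54 inches
```

Doubling both dimensions scales the diagonal by exactly 2, so the grid's diagonal is simply twice a single panel's.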

I personally prefer 2 or 3 monitors rather than one large monitor. One monitor for code, another for debugging, another for servers, VMs, testing, etc.
