I'm much happier with the high-DPI displays in my ThinkPad Yoga 460 WQHD and MacBook Pro Retina 15".
I don't like pixels, and I don't want to see them any more. I want to see the text and graphics I'm working on, not pixels.
So when I wanted a second monitor to go with those machines, I got a 24" 4K display. At 187 pixels per inch, it's a reasonably close match to the 210-220 pixels per inch on the laptops.
I use this monitor in portrait mode next to the laptop display. A portrait mode monitor combined with the laptop monitor is great. I've got a wide screen when I want it, and a tall screen when I want that. It's ideal for reading documentation and especially PDF files, because an entire page fits on the portrait screen. Larger monitors are not very practical in portrait mode.
I'm not concerned with cramming the maximum amount of stuff on my screen. I run the monitors at 225% scaling in Windows, and the default Retina scaling in macOS. So text is about the same size it would be on a lower-DPI monitor, just much sharper and crisper.
After I got my first taste of high DPI with that MacBook, I swore I'd never go back to a low DPI display. It is so nice to not have to look at pixels any more.
Until recently, I used a 40" 4K monitor. For whatever reason, TVs handle input switching much, much more smoothly than monitors do: if I have two machines hooked up to different inputs of a monitor and reboot the one I'm working on, the monitor will often spontaneously switch to showing me the other input. A TV in the same situation will happily show "no signal" and stay where you told it to stay. Admittedly, I've only tried two brands of monitor (Planar and Philips), but they behave identically, ignoring the manually-set input whenever some other input seems more interesting. :/ I had been using a Philips 40" 4K monitor for a couple of years, and the input-switching frustration drove me to the same Sceptre TV from the article, which is frankly worse than the Philips in every picture-quality metric, but doesn't instantly switch inputs when the signal is interrupted, or blank and re-blank the screen 2-12 times when it hasn't been on for a while.
Well, that went a bit off track.
Also, if you want a fantastic site to compare tvs/monitors apples to apples, you can't beat rtings.com. They grade every device on the same rubric so it is easy to compare and contrast. You can even tell it which qualities you care about most (or what use case) and they rate them accordingly.
Under Linux, the situation is not as bad, since you can persuade Xft to optimize font rasterization for readability. macOS makes the choice toward glyph-shape fidelity instead, even at lower point sizes, which results in horrible anti-aliasing blur.
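For reference, on X11 those knobs are exposed through Xft resources (or the equivalent fontconfig settings). A sketch of settings that favor readability (strong hinting) over glyph fidelity, assuming a stock `~/.Xresources` setup:

```
! ~/.Xresources - bias rasterization toward crisp, grid-fit glyphs
Xft.antialias: 1
Xft.hinting:   1
Xft.hintstyle: hintfull
Xft.rgba:      rgb
Xft.lcdfilter: lcddefault
```

Reload with `xrdb -merge ~/.Xresources`; `hintslight` is the usual compromise if `hintfull` distorts shapes too much for your taste.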
The DPI is about what 1080p on a 21" gave you years ago, but the real estate is enormous and I certainly don't mind it. Sure, pictures/movies on the 40" aren't as pixel-rich as on a 32" 4K or 27" 4K, but you'd have to mount multiple of those to match the real estate of a 43" monitor.
It's true that scaling makes text larger without reducing effective DPI, and the picture does look much better on a 27"/32" than on the 43". But without scaling, my aging eyes can't handle anything much above 110 PPI. https://www.sven.de/dpi/
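The calculator linked above implements simple geometry: diagonal pixel count divided by diagonal size. A quick sketch of the same math:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 4K (3840x2160) panel at various diagonals:
print(round(ppi(3840, 2160, 40)))  # ~110 PPI, right at the threshold above
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
```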
This wouldn't be an option for me. Although I do usually work at my home office, I also need to be able to use a laptop on its own when I'm traveling or just want to get out for a while. So that determines the 20" focus distance I need.
With single vision prescription glasses tuned for that distance, my eyes don't have to do any of the optical accommodation you mention. My glasses are set up to let my eyes be relaxed at 20"; there's no strain of my eye muscles trying to adjust to a different distance.
There was a site I was reading some time ago that mentioned that with glasses, the geometry of your eyes converging on a nearby point (i.e. slightly 'cross-eyed') doesn't correspond with the eye muscles being relaxed for focusing at 'infinity'. The 'control systems' are likely coupled. I've used glasses with smaller screens, and I've noticed my eyes behaving a bit strangely (it seemed harder to focus) when alternating between glasses/no glasses. No idea if there are any other consequences. Maybe it's fine.
The 40" had total crap quality but still had some advantages in terms of usable screen real estate.
There are a bunch of 40"+ real monitors now, though: Dell P4317Q, AOC C4008VU8, LG 43UD79-B, Philips BDM4037UW. I assume they all use the same panels as TVs, but hopefully the higher-quality ones, and their electronics will be designed for computer use.
This is like a reading prescription, but adjusted for a farther distance. Reading glasses are typically adjusted for a 16" focal distance. That's much closer than you would use a computer. If you use a laptop on its own without an external monitor, that's usually about 20".
I took my ThinkPad to my optometrist the first time I did this, so we could be sure of the correct distance. When I saw how crisp and clear my screen was with the prescription, compared to the blurry mess I'd been suffering with... if I may indulge in a pun, it brought tears to my eyes.
The first thing you'll want to do with a new TV is to take it out of that mode. The "correct" mode depends on the TV, and some internet searching can help you out, but putting it in cinema/movie mode is at least a good first guess.
Finally, of course, you need to feed the 4K monitor a good 4K signal, or the 4K monitor isn't going to look any better. And as the article mentions, that's sometimes easier said than done, because HDMI can't always handle it.
If after all that, you can't tell the difference at all between 4K and 1080P at point blank range on a monitor setup like that, you may have eye issues, yes, absolutely. And you may still find that you don't much care about 4K. I've got 4K on my laptop, which is nice in a couple of ways, but in general I'm really rather "meh" on it for development work. I've got two 1080P monitors at work and prefer that to a 4K setup. But I also don't care about pixelization very much compared to a lot of people, and never have. I used to read books on my 160x160 monochrome Palm Pilot, and I guess everything looks good after spending my formative years on a Commodore 64 with a 1980s vintage television. But I do at least see the difference and can easily understand why others have other opinions.
Enough screen real estate that you don’t feel pressured to add another monitor, high enough DPI to look good, low enough to not need scaling, connects to USB C MBP with one wire for video, charging, and a USB 3.0 hub.
IMO the perfect monitor.
Plays nice with my PC and 2015 macbook pro.
>It’s a 38” 21:9
>I had a 34 inch with a second 27”, but a single 38” hits the sweet spot of not needing another monitor.
For a short while I had the second screen with the 38” screen but it was just overkill (the 38” alone is enough to cover a large portion off the usable area of my desk, so the 27” mostly just sat out of view)
I had a 34 inch with a second 27” monitor, but a single 38” hits the sweet spot of not needing another monitor
Obviously, different people will have different tastes here.
Of course if I take off my glasses, then every monitor looks the same. But with proper correction, my vision is better than 20/20.
With my prescription computer glasses, the difference between low DPI and high DPI is like night and day. I can see every pixel on a low DPI monitor, while high DPI monitors look beautiful to me.
Please, everyone: do yourself a huge favor and get your vision checked and get some single vision prescription glasses if they are called for. It's hard to overstate the difference this will make in your quality of life.
Edit: Not sure why you were downvoted for a reasonable comment. It's true that your vision will get worse as you get older. Even if nothing else goes wrong, you lose your ability to focus your eyes dynamically. The lenses harden up and lock into a specific focal distance.
However, many of these vision changes are correctable. Progressive lenses are a great solution for "out and about" use. You can look up and down to see far and near things. They aren't very good for computer use, and that's where the single vision lenses come in.
It is a minor nuisance to switch back and forth between the computer glasses and the progressives, but being able to see clearly makes it all worth it.
Coding without seeing pixels in font rendering is just bliss; it's too hard to go back.
Though I’m dreading the day I have to get varifocals.
One big consideration for me: I'm 65, so my eyes are not able to focus back and forth to different distances like a younger person.
My solution is a pair of single vision prescription lenses, adjusted to the distance to my monitor(s). I need to be able to use the laptop all on its own, so that determines the focus distance: about 20".
Because of that, I also place any secondary monitor at about the same distance from my eyes, otherwise it would be out of focus. I can't go changing glasses to look at a different monitor. :-)
A 24" 4K display works nicely for this, and as I mentioned it's small enough to work really well in portrait mode.
So I have two recommendations for any programmer. If you haven't used a portrait mode monitor, try it out alongside your laptop display. It's a very practical setup.
But more importantly, if you find yourself gravitating toward lower-DPI monitors, it may be time for a vision check and a good pair of single vision prescription glasses adjusted for your monitor distance (again, about 20" if you ever need to use a laptop by itself). I put off doing this for years - I always thought of myself as the kind of person who "didn't need glasses."
When I finally got the prescription lenses, it was a real eye-opener - pun intended!
I have converted many professionals away from TVs, because when you can put real quality monitors next to them for comparison, it's usually a no-brainer to abandon the cheap TV. No trick, no incentive for me, just the better choice. It's also why every employee in our company has a pair of IPS displays.
I've looked at 4K TVs as options for a monitor multiple times, from the early Seiki TVs that hit at the start of the pricing change to what's been on offer for Black Friday. Don't do it to yourself unless you spend most of your time gaming/consuming content on the same system, or are just using it to monitor things.
If you are going to be sat in front of displays for 10,000 hours a year working/focusing (like many of us, I am sure), then investing in something like a pair of Dell U3415W, LG UM95 or LG 31MU97Z-B will give you a much nicer experience. Totally appreciate that isn't a $300 solution, but as professionals we should invest in good tools (or demand them from our employers).
To those of you who are using 4K TVs, it is obviously hard to get in front of these monitors and see the difference, but if you can you should - you've already made the choice to go for something better than a cheap 22" 1080P monitor, there are a lot of options to create a great environment to do your best work.
The article is just pure marketing anyway for referrals, so make of that what you will.
Presumably on a time machine so you can go back and repeat 52 of those days.
Or put another way: there's only ~8760 hours in a year total. Most people (~ 40 hours a week types) work around 2000 hours a year.
I dislike multi-monitor setups. Dunno if it’s the bezels or what. A single unbroken surface feels better to me.
For a VC company’s web forum, some HNers sure find making money offensive :D
I tend to tile my windows and use the different displays for the different 'flavours' of my work.
No distractions from another screen, just focus on one single task. Also, no turning my head all the time.
Would be curious what your use cases are, and what differences you've experienced vs 4K TVs as monitors. I moved up from 6 x 27" 1080P monitors and like this arrangement far better. No fussing with multiple video cards, fewer bezels, and I gained the equivalent of 2 1080P screens into the bargain. All for $1300, 1/4 of the prior setup's cost.
I would love to see a photo of this setup. It sounds simultaneously awesome and ridiculous!
I have a 27" 4K monitor at 150% scaling and it feels 'big enough', I was considering getting a second but then I'd have the bezel in the middle, so three would probably be better, but then I'd have to turn my head a lot to see the outer screens, so it seemed a bit pointless.
Maybe I'm an outlier, but I prefer to have applications full screen and just focus on that one thing. I also don't work with videos playing in the background as I find that very distracting, music is ok as I can zone out (and then realise for the last hour I've had headphones on with nothing playing), so I don't need to see a whole Spotify playlist, artwork and recommendations when I'm writing code.
If I'm doing web development, I'll have Vim in a terminal full screen, and when I need to test something live I'll switch to Chrome - my screen is big enough that I can have the page and debugger open side by side without it feeling squashed. I have tests and the compiler running in another tmux tab, so the terminal bell will alert me if something needs my attention there.
:) I basically divide the screens into 8 1080P windows. A "basic workday" for me is usually:
1. Chrome with Email, hangouts, and other administrata
2. A second Chrome with web dev.
3. A third 1080P with the web inspector for above.
4. An SSH window to the nodejs server
5. pgAdmin to postgres
6. XenCenter if I'm juggling VMs around
7. Choice of: another Chrome for goofing off, or a bunch of smaller windows with Git Bash, filesystem windows, or other little distractions like Spotify.
It's hard to tell from this photo (lol messy desk) but I sit about 80cm from the screens. There are some downsides:
I'm too close to the TVs for gaming -- they're too bright even at minimum brightness setting. One "all white" explosion cutscene and my eyes will hurt. I'm in the habit of closing my eyes when I sense impending white-out.
The "curved" monitors are only subtly curved, and I still fly them in a "Vee" configuration. I could've saved $200 by just going with 2 flat monitors. It seemed cooler at the time than it is.
I see others talking about fighting firmware and whatnot, but these MU650Ds from Costco (which I'm told are identical to MU6500s, but re-numbered to avoid price-matching...shrug?) just worked out of the box. Setting the "device" to PC HDMI and the TV took it from there. I dinked with the color settings to taste.
It's weird: my laptop is an XPS 13 with the 3200x1800 screen, and when I try to do ANY split-screen work, I find myself edging closer or squinting. I don't wear glasses, but I think I must just be a "not a HiDPI" guy. Square inches on screens are cheap and cheerful.
I haven't found it hard to focus on a "tile" to work.
It's nice to step through the node debugger by flicking VSCode fullscreen, hunting and pecking through the 100-line-high code, all on the right side while the SSH/Chrome/Chrome debugger/pgsql windows sit on the left and I can watch them whirr with each step. Works for me anyway.
ATM I have a fairly old 32" 1080p TV as my middle screen.
It becomes hard to find the desk space for three monitors when you get to a 40" middle screen though. I prefer having multiple monitors than a single larger monitor. I haven't decided how I'm going to upgrade yet.
I've had IPS monitors for 15 years, and maybe it's in my head at this point compared to high-end TVs and TN panels, but I wouldn't use anything else, nor would I subject anyone I expect to drive revenue for me to anything else. It's a key tool that is often an afterthought.
At the end of the day, whatever works for you. But I have seen first-hand people put huge TVs on a desk thinking they are getting some kind of gain, and then they look up and down all day long or squint to see what's on it. A lot of people have to be forced to adhere to good ergonomics and healthy use of computers.
Hardware is cheap compared to what we pay for software, SaaS, pension and perks for our staff so it's something we are good on.
Also, many TVs are 60Hz IPS displays, but you have to do some research to find ones that display 4:4:4 chroma. After that, the main difference between the two is the firmware on the device; some things like VESA power saving work differently.
TVs can be cheaper because they have larger economies of scale than monitors. It doesn’t have much to do with the underlying technology.
It seems like you don’t know as much as you think you do about display technology.
These all sound like ridiculously low DPI - mid 80s.
My 17" pre-retina MBP does 1920x1200 - 130-something dpi and that's so old Apple don't sell parts for it any more.
How on earth are you using these or the previous ones, that everything, and particularly text, isn't just atrocious to look at?
Last time I bought a laptop, all the reasonably priced ones had 1366x768 displays. I hate it! The pixels are very visible. Finally picked up a 14" 1920x1080 Chromebook for web use and I couldn't be happier, it's so much nicer looking.
A kilobuck will get you a 30" 2560x1600 Dell UP3017.
Distance from the display is as important as DPI when discussing pixel clarity.
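One way to fold the two together is pixels per degree of visual angle, which is what actually determines whether you can resolve individual pixels. A sketch (the ~60 PPD figure often quoted for 20/20 acuity is my assumption, not from this thread):

```python
import math

def pixels_per_degree(ppi: float, distance_in: float) -> float:
    """Pixels subtended by one degree of visual angle at a given viewing distance."""
    return 2 * distance_in * math.tan(math.radians(0.5)) * ppi

# A ~187 PPI 24" 4K panel viewed from 20 inches:
print(round(pixels_per_degree(187, 20)))  # ~65 PPD, past the 20/20 threshold
```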
Currently the display you want to get around 40" is the Philips BDM4037UW, which has no PWM, relatively low input lag and backlight bleed is low.
There are some cheap 43" 4K monitors (Philips and LG), but they're lower quality e.g. LGs have PWM.
Have to say I'm very happy with it.
No ghosting, really excellent colour (once calibrated), minimal backlight bleed (it's not an IPS panel so has less bleed anyway) and 4k at 60hz.
Been using it for just over a year now. The only thing it does is that pure blacks on pure whites can streak a little when scrolling. So very large and thick pure black text on a pure white background will leave a very short trail. Normal text doesn't do this; it has to be large areas of black on white. It hasn't actually annoyed me much, as in my use it happened only very, very rarely on some websites, but it's a limitation.
Not sure it's best for gaming (though I've played Mass Effect and Rocket League with no issues) but for productivity work it's been great.
Pretty cheap too!
I have it on my desk right now and ghosting is really, really bad. Static contrast ratio, on the other hand, is nearly best in class, second only to OLED monitors.
If you are looking for large office/work monitor only or you can borrow one with no strings attached, go for it. Otherwise stay away.
A huge boon for productivity having come from multiple 27" 1440p monitors before, and the BDM40 is something I might recommend for somebody looking to get in and get in cheap.
I'll definitely be splurging for something better next time however.
It's 4:4:4 chroma at 60Hz, so it's great for gaming too, and it has a lower response time than you'll get with TVs.
Highly recommended and it's hard to use anything smaller now.
Without using display scaling I wouldn't go lower than 40", and I think maybe even closer to 50" is ideal in terms of keeping the scale comparable with other peripheral monitors. A 48" curved display would probably be perfect.
Source: I own and use a BDM4065UC, which I'm about to return, and I've been able to test another (previously owned) one whose owner had a totally different setup from mine (my workstation is a 2015 retina MacBook Pro; his was a custom-built desktop with an Nvidia card). Many other people on various forums complain about this, and Philips support seems to know about the issue perfectly well.
I have been interested in the BDM4350UC and/or BDM4037UC as well, but both have been reported to suffer from extreme ghosting/persistence; I'm now looking to buy an LG 43UD79-B.
What I agree with is: a large monitor makes a difference. Not in the workflow per se, but in how much eye strain, headache, back pain (from slouching to get closer to the screen) and so on you end up with at the end of the day, which is almost nonexistent with such large displays. Even if it's $700 for the LG, that's money well spent.
Please update if you do end up investing in a new replacement, the LG especially.
Terrible VR accident... punched it while playing Superhot? ;)
It's also nice to have the physical separation for some reason I can't really describe. My brother has a similar setup to the one in the article, and I just never really liked using it.
Just personal preference I'm sure.
(I also fear my eyesight might be too bad to have a big monitor in 4k, with things being rather small text)
For my setup, I prefer having (bright) IPS monitors. Not too large, I'm fine with my 24" 1080p monitors.
Something to do with marketing and how the two businesses differ (TVs B2C, monitors B2B)?
Or are there actual major differences in the tech? Like what panel is used, or something like that?
Add to that the different features - higher DPI, refresh rate, contrast ratios, etc. Cutting edge features such as NVidia G-Sync and the similar feature from AMD don't come cheap.
I, personally, own one of those TVs (Seiki 39") and both love and hate it. I've always tended to buy TVs for day-to-day computing because you generally get more cutting-edge features for better prices (my first flat screen was a 27" Westinghouse TV; first 4K, the Seiki; etc.).
Economies of scale factor in heavily - more TVs are sold than computer monitors, leading to better prices. Which is why I could get a Seiki 4K for $400 when similarly sized 4K "desktop" monitors were going for $1000+.
The most annoying thing has been "turn off monitor, computer considers it unplugged". So you go from 2 monitors to 1, and all the icons rearrange. Computer goes to sleep? Monitor turns off... disconnects... Annoying.
The price of saving $600+ for 4k (At the time).
I've recently upgraded to a smaller "computer" monitor that's better suited for gaming and desktop use... and the Seiki is now my computer room Plex Player.
These Samsung TVs, once configured properly (Source > HDMI 1 > Tools, select device type as PC, turn off HDMI UHD Color (which blurs), turn on Dynamic Color as well), are excellent, and at the 28" viewing distance I use, are absolutely crisp (no pixels visible) in Retina mode under macOS.
And I'm using it over HDMI 30Hz with absolutely no flicker, and no (discernible) mouse lag.
The 2015 models don't lose their mind when the Mac goes to sleep, unlike the later models, which require re-configuring each time (the TV "forgets" the HDMI port).
Can't recommend them more, especially since you can get these older models refurbished for $400.
Everything online tells me the display lag is unacceptable, but I haven't noticed it personally even when FPS gaming. Maybe it's the conditioning from gaming on an underpowered PC growing up :)
40" is roughly 1.56 times the area of a 32" though...
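Since screens of the same aspect ratio scale in area with the square of the diagonal, the ratio falls straight out:

```python
# Area scales with the square of the diagonal for a fixed aspect ratio
ratio = (40 / 32) ** 2
print(ratio)  # 1.5625
```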
Yes, TN is even worse. One must have an IPS panel! :-)
(I hear, though, that TN, and maybe VA, might be preferred by some for high-performance gaming situations, due to a lower latency.)
"You need to elevate this beast vertically otherwise you’ll be craning your neck down to look at it."
A 40 inch TV is at least 50cm high. You should not be looking up - it's very bad for your neck. Looking down is no problem. The upper edge of the monitor should be at eye level.
If you have the opportunity, though, monitor arms are one of the best ergonomic investments you can make. Being able to reposition the monitor with a touch for whatever position or attitude you want helps reduce fatigue enormously.
Now I'm using a 32" 4K, and this is the sweet spot for me. Not too big, but large enough to use without scaling (I'm running Windows).
I have a Sony x720e for my macbook for work and my PC for light gaming. It has graphics and gaming modes that have low input lags, but there is a really annoying quirk where when you turn the tv on it has normal (high) input lag until you switch to another mode then back to graphics mode. This means every time you turn on your PC (or plug in your laptop) you have to:
1. Turn on the TV
2. If necessary, switch inputs
3. Options -> Scene Select -> up arrow, down arrow
1 and 2 are annoying since monitors do that automatically but the TV doesn't. 3 is really annoying but hopefully is fixed in a firmware update...
Other than that the screen looks great.
I'm hoping for time to hack on an arduino with an IR LED to automate those steps, haha.
To get an idea of the size, here is a picture of me sitting next to it. 
I'm not here to argue the pros and cons, and I am not a lawyer, but I'm intrigued about the author saying -- and I paraphrase -- "I'm doing this on a computer that's breaking the licence agreement". That's an admission. I'm not judging, it's an observation. Hackintosh admissions on a personal site where they can be traced back to a person seem a bit risky to me.
I have no knowledge of Apple legal circling. Perhaps I'm just paranoid.
(edit, lol, fail. Didn't realize asterisks cause italics. Sigh)
From my experience with Hackintosh, you're downloading a canned VMware image of a pirated OS. I'm not exactly sure when they would say you agreed to anything when you downloaded it, since you only interact with VMware Player (or VBox), the site hosting the VMware images, and a few of the BIOS patches needed to make it all work. You never interact with Apple in any way.
The guy who keeps packaging it is running a risk of legal action when Apple gets their fill of PCs running macOS, though. In the meantime, it allowed me to shelve my Mac Mini for iOS development and keep everything in one PC, albeit one virtualized environment.
Being able to copy/paste between PC and Mac is fantastic for workflow.
I am a big fan.
I probably get burned by something deep in the Apple Developer Program EULA where I agreed not to do unintended things with Apple software. :)
It's still a violation of the EULA, but I think most violations would be substantially different from, for example, using a cracked version of Windows, in that most Hackintoshers have probably already given Apple $2000 for another computer.
And there are 4k monitors that can work with 4 inputs at the same time:
For me, as long as my code is readable, it doesn't particularly matter. I'm not doing graphics 95% of the time and when I do graphics, it only really becomes necessary to have a high contrast, high pixel density screen. I run dual Samsung 2770HD TVs at home as my monitors, they sit 2ft from my face and they're perfectly adequate for me and the price was right. I guess if you're anal about picture perfection, then you're going to have a very different opinion of what is "best" and mine are certainly a loooong way from the "best."
When I watch a TV show or movie, though, I don't get so hung up on the picture quality that I can't just watch the movie and enjoy the story line. Of course, I grew up with a black-and-white TV in my room until I inherited the colour one that we had to tune with a dial, and you got some snow interfering with the picture if the aerial wasn't just right. Oddly, it didn't spoil my enjoyment of the show or movie I was watching even slightly. The story line is way more important than the picture quality.
For coding, the story line is the same... it lives in my head. The screen is really just a means to share that. I get no less enjoyment coding on a screen with a slightly lower pixel density than I do coding on the latest retina. Even after I've sat in front of one that's truly high quality, it doesn't take long to adjust back to a lower quality screen.
Anyone have any experience with the top TVs on this list? The top entry from TCL seems to be the leader, by far, in terms of average customer review, but the price point seems almost too good to be true: $379 for the 55" version, though that may have been a sale price, and the TV is currently out of stock.
(I haven't bought a TV since... ever, now that I think about it)
Definitely not as cheap, but still way cheaper than a comparable monitor.
Stay away from the LG TVs; they don't have true 4K RGB.
One thing that is certain is that people have different preferences. Certainly some people would enjoy the tiny pixels of the 27" 4k monitor at 100%.
My current setup is 2x 24" 1920x1200 monitors. I have considered 3x 24", 1x ultrawide, or 2x ultrawide, but I'm not convinced I'd get the same comfort. I think that 3x 24" or 2x ultrawides would require too much head rotation to be comfortable.
I prefer the vertical break in the screen since it makes snapping windows easier. I'm starting to think that I might prefer a bit more real estate, as I often collapse the vertical file menu in my editor for a little extra room.
Lately I have found myself wishing there was a wider format 5760 x 2160 screen of about 37-43" for 6 full HD screens-worth. But can't have everything I guess.
I have two 27" 2560x1440 displays at work, but I find it counterproductive to look side-to-side constantly to see what's on the other screen.
I love the 4K pixels but 100cm/39" is simply too much surface area for a monitor used for primarily small text. My next monitor will probably be a 69cm/27" 4K and hopefully OLED if they fix the burn-in issues that make OLED unsuitable for computer use.
OTOH, upgrading my home setup (which I use for side projects, etc.) will only make my work setup seem worse - they gave me 2 22" 1680x1050 monitors. I've only been here for 2 weeks and haven't been given any actual work to do yet, so I don't know how annoying that low resolution is going to be (but I'm sure it will be). Not nearly as annoying as the loudly buzzing/rattling vent and the headache-inducing blazing fluorescent tubes directly over my head, I'm sure (though, combined with the gray carpet, the gray walls, the gray ceiling, and the lack of any and all natural light, I must say, they've really nailed the office environment from "Joe vs the Volcano")...
I’d love it if someone made extremely large format displays (100”+ at 3:1 aspect ratio) for computer use. Not sure how large the market would be, but one giant display is nicer than 3 separate ones.
My rule has historically been to spend more on the peripherals like chairs, desk, etc. than a computer because my body is a lot more expensive than my computer and accommodating my body’s needs comes first. With that said, I use a $60 medical stool per spine surgeon’s recommendation and I have a $80 IKEA desk to support $6000 of hardware because I don’t get much more utility beyond a certain point and because I move so frequently a heavy desk would be a bigger problem for me than helpful.
Also, don't forget to turn on a possible 'gaming mode' (on the TV) which significantly improves the input lag.
Here is screenshot of a layout I use https://camo.githubusercontent.com/79de6fc01f5370345d4b9c4c9...
The above uses XMonad and a custom layout available here https://github.com/chrissound/XMonadLayouts if anyone is interested.
Also, the 4K TV I was using only had a 30Hz refresh rate, which was really bad for eye fatigue.
This is a modern project which is attempting to pull this off in Linux.
Tech specs are an easy way to compare stuff, but they don't always say everything.
For those who have been neglecting these things, a good stretching routine and daily posture work is probably the best starting point.
Been doing that for years and I can find stuff fast.
Does look cool though.
A bad 720p laptop with a tiling wm has more available screen real estate than a 4k monster regular setup.
It has gotten to the point where I feel I am watching something quaint and retro when I watch someone manually moving windows around on a screen.
> On a laptop screen or normal-sized desktop monitor, I end up putting these different windows on different macOS spaces, but then half the time I swipe the wrong direction to switch spaces and end up going to the wrong space. By the time I get to the correct space my short-term memory is blown and I’ve forgotten what I was doing.
I've never had this issue on a tiler. You get 10 (maybe 20) screens accessible with a single key combo. You can thoughtlessly toggle back and forth between any two as fast as you can blink. "Swiping" between a few desktops sounds terrible by comparison.
I do embedded development, PCB CAD, some graphic design, and manage a fleet of servers, all from a single 10" laptop screen thanks to tiling.
Unless you were asking how a 720p screen can have more real estate than a 4K screen? 4K is 3840x2160 or 8.3M pixels. 1280x720 is 0.92M pixels. So with 10 virtual desktops the 720p screen gives you 9.2M pixels. Slightly more than a 4K screen.
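The arithmetic above can be checked with a quick sketch (the desktop count of 10 is just the common tiling-WM default of one workspace per number key):

```python
# Compare total addressable pixels: one 4K screen vs. a 720p screen
# spread across 10 virtual desktops.
four_k = 3840 * 2160        # 8,294,400 pixels on a single desktop
seven_twenty = 1280 * 720   # 921,600 pixels per desktop
desktops = 10               # one workspace per number key, a common default

print(f"4K:        {four_k:,} pixels")
print(f"720p x {desktops}: {seven_twenty * desktops:,} pixels")
# 9,216,000 vs. 8,294,400 -- the 720p laptop edges ahead.
```

Of course only one desktop is visible at a time, which is the parent comment's point: the trade is glanceability for instant key-combo switching.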
You also get the benefit of not having to move your eyes or head from one corner of the screen to the other. What you are working on is always front and center.
99.8% sRGB coverage, 81.3% DCI-P3, and very little input lag. It's also a smart TV. So far, I'm happy with it. I run 100% scaling and don't see any pixels when I sit far enough away, so I do enjoy it.
My main workstation is connected to a 55" Vizio HDTV (a model that supports chroma 4:4:4 and renders text of various colors very well) that I picked up last year during a post-Thanksgiving sale. While I love it, I do the lion's share of my development using a 1080p display connected to a laptop. Is the 4K screen superior? Absolutely -- in almost every way -- except for portability.
I ditched multiple 1080p monitors for a single 4K display a few years ago (its total resolution was about equal to my multi-monitor arrangement). That transition was tricky at first. By contrast, when I later downgraded to the single 1080p laptop display -- helped along by a Visual Studio extension I wrote for myself that aids navigation and visual identification of important code points -- it took far less time to get used to than I expected.
The one drawback turned out to be a positive thing in the long run: by not being able to see so much code on-screen at all times, it forced me to write code more carefully. Where before I could have gotten away with not splitting a class into smaller pieces, because I could see the whole thing on-screen, I couldn't do that any longer and remain productive. Sticking with small methods that "do one thing" and small classes that have a single purpose is a best practice, and one I was pretty religious about before; still, there are times when it's necessary to "get the code out the door" and corners get cut, mostly out of a false sense that writing it dirty will be faster.
There's one place where the 4K screen is vastly superior, though -- debugging. You can get your code, the locals/watch, and the application visible on one screen. This, too, turned out to be a double-edged sword. It was easy to skip seemingly unimportant unit tests when I knew my debugging environment was pristine. But the reality (for me at least) is that the moment I have to hop into the debugger and check locals/watch to see what's going on, I can expect to waste an hour fixing a problem 80% of the time. Here's the thing, though: writing tests to cover my code rarely takes more than an hour or two, and the simple act of doing so causes me to rethink the problem from a testing standpoint. I'll often find the bug before I've ever run the test to validate that it passes -- and about 1/3 of the time, that bug is subtle. Less often, but still frequently, I'll discover during the act of writing the test that the code could be refactored in a more logical manner, which results in better code. All of this means I write tests regardless of having a great debugging environment, and I'm obsessive about it -- covering code that seems obvious -- so I end up rarely needing the benefits that a 4K screen offers.
Being handicapped by the smaller screen on my laptop does come with advantages beyond just those introduced by the constrained environment. I'm not tethered to a room or location to code. Sometimes walking away or changing my physical environment ends up allowing me to avoid an actual "break" -- just moving from the office to the kitchen (when I worked at home) or from my desk to the office-kitchen couch (now that I'm an office dweller) causes enough of a shift to get me thinking correctly again.
All of that aside, one area where the 4K screen shines is anything to do with multimedia, 3D, and image editing. Being able to do detail work, zoomed in on a screen that accurately reproduces color -- and at a cost that's a fraction of 4K monitors -- is simply fantastic. I also find the brightness of the 4K TV I purchased to be excellent; the IPS monitors I own tend to default to a brightness setting that is almost excessive (yes, adjustable, I know). I purchased a TV that was evenly backlit, and once I got everything configured properly, I don't mess with it. The size and amount of light it outputs is pleasant: it's large enough that I can use it with the lights turned off in my office (if I have a headache), and it keeps the room reasonably lit while not blasting out so much light as to be obnoxious. Despite it not being IPS, it being large, and my sitting a couple of feet from it, I haven't noticed any issues with edges looking dimmer due to viewing angle -- though I mostly ignore the edges, because I rarely have anything maximized on that screen. It's just not necessary when you have all of that real estate; you tend to simply move less important things off to an edge and put what you're working on in the middle in a smallish window.
And it sometimes is, in the short run, but if that code has any importance to the program, you pay for it.
I'm not "Test Driven" -- I write tests "second," because I started writing code quite a bit before "Unit Testing" was a common practice, and I find the act of doing "Tests First" to feel backward. Writing tests afterward seems to have the same benefits as doing it the other way around, though it sometimes drives code changes. I've been doing it this way for too long to switch and stay as productive as I am. But I cover everything -- I once wrote tests to cover a configuration lookup class that was basically a bunch of retrievals from a dictionary. A coworker chided me for it until I informed him that I'd found four bugs (and the best part is that said coworker was the one who wrote the buggy code). Because the configuration routine cascaded from a set of values in the database, falling back to values in a file and finally to constant values -- all of which would always be set correctly from top to bottom in our development environment -- the issues I unearthed would have easily gone unnoticed until a customer discovered them.
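A cascading lookup like the one described is easy to get subtly wrong precisely because every source is populated in development. Here's a minimal sketch of the pattern (all names and values are hypothetical, not the poster's actual code), with assertions that exercise the fallback paths that would otherwise stay hidden:

```python
# Hypothetical cascading config lookup: database values win, then file
# values, then built-in constants.
_DEFAULTS = {"timeout": 30, "retries": 3}

def lookup(key, db_values, file_values):
    """Return the first value found, cascading db -> file -> defaults."""
    for source in (db_values, file_values, _DEFAULTS):
        if key in source:   # membership test, not truthiness: 0 is a value
            return source[key]
    raise KeyError(key)

# Tests that only exercise the fallback paths -- the cases that stay
# invisible when every source is fully populated in development.
assert lookup("timeout", {}, {"timeout": 10}) == 10              # file fallback
assert lookup("retries", {}, {}) == 3                            # constant fallback
assert lookup("timeout", {"timeout": 5}, {"timeout": 10}) == 5   # db wins
```

A classic bug in this pattern is testing each source with `if source.get(key):` instead of `if key in source`, which silently skips legitimate falsy values like `0` or `""` -- exactly the kind of subtle issue that only a fallback-path test surfaces.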