Dell 38 inch UltraSharp monitor (anandtech.com)
207 points by dgelks on July 22, 2017 | 290 comments



Off-topic but I wish 16:10 hadn't died with the move to 4k. I have an old 1200p dell ultrasharp in 16:10 and it's sublime compared to my other 16:9. You'd think it doesn't make much of a difference, but it really does. Especially when you have it vertically oriented, but otherwise too.


Golden ratio strikes again!

Historians a millennium from now will probably look back at us as heathens for deviating from the perfect rectangle when constructing our screens.


Hopefully in a millennium mathematicians will have finally convinced the world that the golden ratio really isn't special.


It's special. Just because people overload it with spurious claims doesn't alter its mathematical elegance. And seeing as how it falls out naturally from geometric figures like pentagons and pentagrams, people are likely to stay interested in it for as long as humans have 5 fingers.


Nor does it make sense for a monitor. I can't remember ever wanting to arrange a bunch of windows as a square, a smaller square, a smaller square, etc.


You must not use a tiling window manager.


I must not want my wm to give me a geometric series of ever shrinking squares.


What else does it default to, if you open a new window?


Does/did make sense. European paper sizes (A*) are based on the golden section (so the short side of A0 equals the long side of A1, the area of A1 is half the area of A0, and so on).


Not quite. ISO 216 paper sizes are designed so that if you split them in half you get the same ratio (1:sqrt(2), or 1:1.414). The golden section is such that if you take a square out from one side, you get the same ratio (1:1.618).
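
For anyone who wants to see the two properties side by side, here is a minimal Python check of the two defining relations (nothing monitor-specific here):

  import math

  # ISO 216 (A-series): a 1 x sqrt(2) sheet cut in half across its long side
  # becomes a sqrt(2)/2 x 1 sheet -- still the same 1:sqrt(2) ratio.
  iso = math.sqrt(2)
  print(1 / (iso / 2))      # 1.4142... == sqrt(2)

  # Golden rectangle: remove a 1 x 1 square from a 1 x phi rectangle and the
  # leftover 1 x (phi - 1) strip still has the same 1:phi ratio.
  phi = (1 + math.sqrt(5)) / 2
  print(1 / (phi - 1))      # 1.6180... == phi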


Hopefully historians in the future will be sufficiently advanced that, when they look back on us, 16:10-ratio displays won't rate among the top 500 things they are horrified by.


It's less of a problem with 2560xN than 1920xN, in my view. I also prefer 16:10 to 16:9, but 2560x1440 has enough pixels vertically not to be disagreeable, and 1440x2560 is wide enough not to be cramped. I'd never buy a 1080p monitor, but I've found 1440p a most worthwhile upgrade from 1200p.

So perhaps sit it out and wait for 5k? ;)


I can't stand 2560x1440 -- I swear by my 30" 2560x1600 (16:10) displays at home and at the office. The 16:9 ratio is just too skinny for anything but movies.


Yes, I've found 30" 2560x1600 (16:10) displays, specifically Dell U3014 are the current ideal. I expect in a few years I will cave in and purchase a pair of Eizo FlexScan EV3237-BK 32" 4K screens, but for now 30" 2560x1600 is a good compromise between cost and practicality.


16:9 2560 monitors are too wide, and certainly don't work portrait. I end up running them with black bars on the side to get a comfortable layout.


Well, it definitely takes all sorts, because I use 2 side by side in portrait mode and I'm quite happy with it ;)

You're right about the length of the longer axis, though... I've rarely found it all that useful to have one view fill the entire screen. So I generally either have two 1440x1280 windows/panels/etc. on a screen, or one 1440x~1700 and one 1440x~860. That works pretty well. Many window layout tools will let you quickly set up this sort of arrangement.


I can relate. Open up a page like this [0] in a maximized window on a very wide screen and try to read. I ended up writing a Chrome Extension [1] to narrow/center such pages.

[0]: https://www.haskell.org/tutorial/patterns.html

[1]: https://chrome.google.com/webstore/detail/center-pages/ooebg...


Why not unmaximize the window?


I don't like to see the noisy background. I also find it fiddly to do it manually. After I click unmaximize I will probably have to adjust each side of the window, which takes much longer and isn't really convenient (IMO).


Why not chop off part of the screen?


Because it's a lot easier to undo unmaximizing a window


Also, I miss my 4:3 in portrait mode, was perfect for having code and docs open at the same time: 1200x1600 FTW.


I had a dream that monitor makers released a "4kPro" resolution of 4000x2500 (16:10), but then I woke up and was sad.


Same here.

I've been turning down monitor "upgrades" at work for years because the "upgraded" monitors are 16:9 and would be replacing working 16:10 monitors.


So, for example, would you refuse an upgrade from 1920x1200 to 2560x1440? Would you refuse an upgrade from 2560x1600 to 3840x1600?

IMO bitching about aspect ratios seems petty when new monitors offer so much more space than older ones.


I did exactly this, upgraded from 24-inch 1920x1200 to 27-inch 2560x1440. The aspect ratio change hasn't bothered me, precisely because of the increase in space. Much easier to have two files side-by-side now, for example.


If that upgrade was on offer, I'd consider it. Especially if it was a larger display.

Unfortunately, the only upgrades I've been offered are 2x 24" 1920x1200 -> 2x 24" 1920x1080.


I'm still lamenting the death of 4:3


Here's a modern 16:10 monitor: Dell UltraSharp 30 Monitor with PremierColor - UP3017 http://www.dell.com/en-us/work/shop/accessories/apd/210-ajgt


That doesn't strike me as modern. The resolution, size, and frame rate are on par with an Apple Cinema Display from 2004 (though the color is better):

http://www.everymac.com/monitors/apple/studio_cinema/specs/a...


So the thing that a lot of people miss is that unless you basically 4x the pixel density and go anti-aliased... it doesn't actually help to increase the density as far as reading goes.

The thing I've learned is that what I really want is a specific pixels-per-inch in the real world, or a specific font size in inches in the real world. I sit comfortably back and the Dell 30 lets me easily set my text at about 1/8" or 1/4" tall, which is super easy to read relaxed in my chair. If I went for an increased-resolution display, the font would just get smaller until I get to 4x the density... and I'm not really gaining anything until then either, when AA smoothing is a win.

But also, since I jack my font up even more than usual for my distance... I'm already effectively getting about 4x AA, so a 4x resolution bump would put me at 16x AA, which I don't think is needed. I'd rather blanket my field of view with lots of large text than have one very nice high-density display in the center that makes me lean forward and strain my eyes.
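
To put rough numbers on that (a back-of-the-envelope sketch, assuming the ~100 PPI of a 30" 2560x1600 panel and the font heights mentioned above):

  import math

  def ppi(w_px, h_px, diag_in):
      """Pixels per inch from resolution and diagonal size."""
      return math.hypot(w_px, h_px) / diag_in

  dell30 = ppi(2560, 1600, 30)
  print(round(dell30))                      # ~101 PPI

  # Pixels available for a glyph of fixed physical height, today vs. a
  # hypothetical panel with 2x the density per axis (4x the pixels):
  for height_in in (1/8, 1/4):
      now = height_in * dell30
      hi  = height_in * dell30 * 2
      print(f"{height_in:.3f} in tall: {now:.0f} px now, {hi:.0f} px at 4x density")

At ~100 PPI a 1/8" glyph only gets about a dozen pixel rows, so extra density at the same physical text size goes entirely into smoother glyph edges rather than more readable text.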


Comparing with a super-fancy Apple display doesn't seem that fair. Even today "1080p" monitors are not uncommon - for example the popular Dell XPS 13 defaults to a 1080p monitor. Also worth noting that 2560x1600 (the resolution of that monitor) is identical to the original Retina display.

If people really care about 16:10, it's available with reasonably good specs and they can buy it. If they don't, it will probably die permanently, but we can't exactly blame the market.


I think a "modern" 16:10 should be 3840x2400 (like the IBM T220, released in 2001) or 5120x3200. Neither of these exists in any display currently made.


I believe all the MacBooks are still 16:10, but the iMac switched to 16:9. So even Apple doesn't give a damn about it anymore.

What would it take for 16:10 to come back? Or are desktop monitors as we know them today forever going to be stuck with 16:9?


Microsoft has decided the future is 3:2 if you're not averse to moving closer to a square ratio.


Considering how much stuff we do online and the gargantuan empty voids I get on a lot of websites with my 27" 1440p, I'd say they're right.


This. Right now I'm sitting in front of 13,312,000 non-UHD/4k pixels, all in beautiful 16:10 aspect ratio. I expect that I will keep this setup until I die.


I think I'd be ok with the 16:9 if we're north of 32" and 4k


4k got me to stop complaining about the loss of 16:10. A 4k screen at 16:9 is still not as wide as even two 4:3 monitors side by side -- and you still get the screen real estate of four HD screens in one. Is the aspect ratio my favorite? No, but with this resolution my attention is almost always limited to a subset of what's on screen. The aspect ratio of the whole thing becomes increasingly irrelevant.
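
For the record, the arithmetic behind both claims checks out (taking "as wide" to mean width at equal screen height):

  # UHD really is four 1080p screens' worth of pixels:
  print(3840 * 2160 == 4 * 1920 * 1080)     # True

  # Width at equal height: two 4:3 panels side by side span 2 * 4/3 = 2.67
  # height-units of width, vs. 16/9 = 1.78 for a single 16:9 panel.
  print(2 * 4 / 3, 16 / 9)                  # 2.67 vs 1.78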


I apologize if this is an incredibly stupid question, but why do these monitors cost so much? As I understand it, you can buy a good 4k TV for considerably less, so what features of this monitor make it a better deal?


I've been using a Samsung 40" 4k TV as my main monitor for development work for about 8 months and it's been great for me.

A couple of caveats: because I don't game or do graphics work, I don't know or care about color reproduction. I made sure to get a TV, graphics card, and cable that supported 4k@60hz with 4:4:4 chroma so the text is perfectly sharp and the refresh is not obtrusive. It's not hard to find a Samsung TV around $350 that supports that, but as far as I could tell, I had to go up to a GTX 1050 graphics card to get it.

I sit between 24 and 28" away from the monitor, so the text is readable to me at full resolution. At that distance, I do have to turn my head to comfortably read text in the corners of the TV, especially the upper corners. In practice, that means I keep monitoring-type applications such as CPU charts, Datadog graphs, etc., in one of those corners for quick reference.

While I still have two 1920x1080 monitors on either side of the TV, it's quite nice to be able to open up a huge IDE window on my main monitor and, when necessary, put 3-4 normal-sized windows next to each other for comparison work.
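
As a rough sanity check on why 4k@60Hz with 4:4:4 chroma is the sticking point, here's the back-of-the-envelope bandwidth math (standard CTA-861 4K60 timing assumed; blanking figures vary slightly by mode):

  # Active pixel data: 3840x2160 @ 60 Hz, 8 bits per channel, 4:4:4 (no subsampling)
  active_gbps = 3840 * 2160 * 60 * 3 * 8 / 1e9
  print(f"active video data: {active_gbps:.1f} Gbit/s")   # ~11.9 Gbit/s

  # On the wire, with 4400x2250 total timing (blanking) and 8b/10b TMDS overhead:
  link_gbps = 4400 * 2250 * 60 * 3 * 10 / 1e9
  print(f"on the wire:       {link_gbps:.1f} Gbit/s")      # ~17.8 Gbit/s

  # HDMI 1.4 tops out around 10.2 Gbit/s, HDMI 2.0 at 18 Gbit/s -- which is why
  # older TVs/cables fall back to 4:2:0 chroma (fuzzy text) or 30 Hz at 4K.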


Same here. I bought a Philips BDM4065UC monitor in 2015 (40 inch, 3840x2160 resolution) for 740 € (in Germany) and I'm loving it. The only caveat with that model is that the response time is a bit slow (I would guess 5 ms between full black and full white).

When I replace it, the replacement will be the same size and resolution, but I'll probably go for OLED and HDR-10 or better.


I happened to buy the same, and import it from Germany to the UK.

I was very excited for it, as it featured a DPI very similar to a 27" 1440p Korean import; coming from 3+ 1080p monitors originally, the lack of bezels was incredible.

It's not built for gaming at all though; extreme screen tearing and poor response times have me now looking to replace the 27" 1440p Korean import with something >144Hz and G-Sync, before replacing the Philips BDM40.


The tearing is the monitor's fault? I'd noticed some tearing, but attributed it to driver issues.


Yeah, I got that Philips too. For games, 40 inch at 4K resolution is really a lot more practical than 38 inch with an odd resolution, and it's cheaper too.


Any chance of a picture of your monitor setup? I'd love to see it.


Here's an image of it while I was working on 3 or 4 documents for a non-development task: https://ibb.co/bWUYh5


I was expecting something much bigger, that doesn't look close to 40".


I think the fact that the 27" monitors next to it are closer to the camera in that photo makes the TV look artificially smaller. It absolutely dwarfs the 27" monitors. If you're close to any store that carries 40" 4k TVs, though, just go there and stand 28" away from one. I think you'll see what I'm saying about having to turn your head to comfortably read text in the corners, something that has never been the case for me with smaller monitors.


I just bought (today) a 32" 1080p monitor and I'm lovin' it. I think it's more productive to use a single big screen than to set up a two-monitor setup.

My old setup consisted of two monitors, one 23" and one 21". One day I realized that my 23" monitor could not reproduce colors properly. I needed to photoshop a picture; I sent the retouched version to my phone, but as I couldn't see the color depth on the 23" one, the image looked ugly on the better screen. That's when I learned not to buy the cheapest monitor just by looking at its size and resolution. So I sold that monitor and bought a 32" 1080p monitor at a bargain price.

Of course it would be better if I could buy a 1440p monitor at this size, (or even 4K OMG :)). But considering my budget this was all I could purchase.

I had serious doubts about the pixel density, but there was nothing to be afraid of. In usual tasks (e.g. browsing, writing code in emacs, terminal) nothing disturbs my eye; it's beautiful. However, if I open a 1080p YouTube video fullscreen, it looks as if it were a high-quality SD video, because of the pixel density. I am standing close to the monitor, and although I cannot see individual pixels, I can notice the difference.

I bought my monitor for 900 Turkish lira, equivalent to 254 dollars, and I think it is a great investment.


Does 60 hz keep windows from lagging when moved?


You might be thinking of the issue early adopters had with TVs that only did 4K/30 over HDMI. At 60Hz, it's like any other computer monitor.


I detect no lag when moving windows around. It might not (or might -- I simply don't know) be good enough for gamers, but it's definitely smooth enough for development, document, or web browsing (aka productivity) tasks.


144Hz monitors do produce a very noticeable difference for gaming if you can match the refresh rate with in-game FPS; outside of gaming I've only found it noticeable when moving windows about.

For productivity I'd go for the 40" 2160p over a lower resolution higher refresh panel.


Model # of the TV?


Mine is a 40" Samsung UN40KU6300 though Samsung appears to rev its model lineup frequently so they may have already released a successor to this model. It was a little difficult to confirm that it supported 4k@60hz with 4:4:4 chroma, but I relied on the experiences of some folks on an audio/video forum who confirmed that they had been able to run with those settings. Unfortunately, I don't remember the name of the forum or have it in my browsing history. I do wish manufacturers would make that information easier to come by though it's certainly a tiny minority of consumers that know or care about those kinds of specs.


rtings.com is a TV review site that is superb for finding such details; they test for 4:4:4 chroma specifically, along with input lag in each mode, and one of their sub-scores is "use as a PC monitor". A valuable resource.


Thank you!


It's designed for graphic design and other display-critical tasks, so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display. Plus, it's the UltraSharp top-end model, so all the mechanical construction will be top notch.

TVs, on the other hand, are designed to show the most oversaturated, "vibrant" colors on the demo loop on the show-room wall. And mechanically, they're designed to hang on the wall and never be touched.

Plus it's curved.


> so it is calibrated to 99% sRGB color space. What it looks like on this monitor will be what it should look like in print and on the best of every other display.

That’s completely useless for graphic design. sRGB is defined as the lowest common denominator of CRTs on the market in the early 90s.

Actual monitors for graphic design purposes, every TV released in 2017, and most new gaming monitors instead use the AdobeRGB or DCI-P3 colorspaces (the latter is what's used in cinemas), or even Rec.2020.

For comparison, this is 100% sRGB (what you claimed is "for graphic design") vs. 100% Rec.2020 (what most new TVs and gaming monitors support): https://dotcolordotcom.files.wordpress.com/2012/12/rec2020-v...
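
To make the size difference concrete, here's a quick comparison of the gamut triangles in CIE 1931 xy space (primary coordinates from the respective specs; triangle area is only a crude proxy for gamut size):

  def tri_area(pts):
      """Shoelace area of a triangle given three (x, y) chromaticity points."""
      (x1, y1), (x2, y2), (x3, y3) = pts
      return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

  srgb    = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]
  dci_p3  = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
  rec2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

  base = tri_area(srgb)
  for name, prim in [("sRGB", srgb), ("DCI-P3", dci_p3), ("Rec.2020", rec2020)]:
      print(f"{name:8s} {tri_area(prim) / base:.2f}x the xy area of sRGB")
  # sRGB 1.00x, DCI-P3 ~1.36x, Rec.2020 ~1.89x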


I still use an old 30 inch model from 15 years ago and it's only just now starting to break down, so that's another thing to think about.


Anyway, it makes sense that displays should have a much longer life span than desktops - which is one reason I don't fancy something like the iMac.


Similar! I'm still using an Apple 30" Cinema display I bought in 2005.


I have an UltraSharp and it's fucking rubbish. Half of the screen has a yellow tint, and it has dark corners. Obviously support says it's "within spec".


Not sure why this is getting downvoted... Seems relevant information to me.


He doesn't mention the model number, plus it's a new account, so there's little trust that what he's saying is true. (I didn't downvote)


The pricing is so odd. For $300 more, you can get a monitor with _much_ better resolution (5120x2880), though in a smaller screen size.

It's called an iMac.

https://www.apple.com/shop/buy-mac/imac/27-inch


It's not odd. The size is what makes this expensive. It's 10 inches more screen, and curved to boot (which you may not care about, but it does affect cost).


> curved to boot (which you may not care about, but it does affect cost)

Not too long ago the curved models used to be cheaper with big discounts, at least for TV's, because nobody wanted to have a curved TV.


This screen is much bigger (37.5" vs 27"), has a better refresh rate (75Hz vs 60Hz), and is curved. All of those things correspond with higher price for both the panel itself and the resulting product that contains it.

* removed mention of FreeSync as this display lacks it


Why do people want curved displays? They get glare from every angle, and the image is distorted.


Some people want a single screen to always be the same distance. And most curved monitors geared towards business users are anti-glare as this one is.


I can't understand it either


This does not have FreeSync


Quite right, updated


I have a 5k iMac myself, but for my work, which is Linux based, I would rather have that wide-screen. The iMac for sure has the way better font rendering, but the real estate is that of a 27" screen. With Linux, HiDpi support isn't quite there yet - so for now the 38" screen gives you the larger desktop.


> but the real estate is that of a 27" screen

By default. A couple of clicks in Preferences will get you higher or very-high-indeed effective real estate.


10 years ago, I might have agreed with you :). But at a certain age, you are happy if you can focus on the default-size resolution.


5K 27" displays are much less than this.


5K 27” with wide color gamut and good calibration? Please post links here.


LG 5K 27" display. 100% coverage of P3 gamut. $1299.

https://www.apple.com/shop/product/HKN62/lg-ultrafine-5k-dis...


This is a good display, and price has gone down. But anyway to connect it to a Windows 10 / nVidia PC and get the full 5K resolution? No GPU that I’m familiar with has support for USB-C.



Interesting! Too bad I already bought a 1080 Ti. Definitely good news for future purchases, though.


Not to mention all 2017 Macs are also 10-bits-per-channel, as well as P3, so they have “wide” color and “deep” color.


For photo / graphics work, wide gamut and calibration are important. For text editing, less so.

High DPI is important for me, though.


Internet is full of opinions.

A recent positive: https://www.techspot.com/article/1439-using-4k-tv-as-desktop...

A negative from 2014:

http://www.itworld.com/article/2832687/consumerization/don-t...

and some discussion:

http://www.avsforum.com/forum/166-lcd-flat-panel-displays/28...

The main headache seems to be blurry fonts which probably mean a lack of 4:4:4 chroma support or the inability to enable it.


I cannot speak for 4K TVs, but as I was doing research for my own purchase I found the following differences between monitors and TVs.

TVs are said to be configured to display the best-looking colors, while monitors try to stay true to the real colors.

TVs have an immensely higher refresh rate; 50Hz/60Hz usually suffices for a TV, but on monitors we speak of milliseconds. The refresh rate makes a lot of difference if you intend to game on your computer/monitor. An e-sport professional Redditor claims he improved his playing performance several-fold after switching to a monitor instead of a TV. (https://www.reddit.com/r/FIFA/comments/5whb9v/did_the_change...)

Other than these differences and the occasional overscan/underscan problems on older televisions, I personally see no reason to prefer a monitor over a TV if there is a huge price difference. If the price difference is minor, I'd opt for a monitor.


Refresh rate and response time are different. Refresh rate is how often a new frame is displayed, i.e. 50/60/144Hz. Response time is how long it takes a pixel to change from one color to another, i.e. 1/2/4/16ms.
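
A couple of concrete numbers, since the two get mixed up a lot (a sketch; note that advertised response times are usually optimistic grey-to-grey figures):

  # Time budget per frame at common refresh rates:
  for hz in (50, 60, 144, 240):
      print(f"{hz:3d} Hz -> {1000 / hz:5.1f} ms per frame")

  # A panel with a ~16 ms response time can't fully settle within a 144 Hz
  # frame (6.9 ms), so fast motion smears; a 1-4 ms panel can.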


I stand corrected.

  >> Over a decade has passed since the LCD monitor unceremoniously ousted the boxy CRT monitor into obsolescence, but with that ousting came a small problem: CRT monitors redrew every frame from scratch, and this was baked into the fundamentals of how PCs sent information to the screen. Monitors redrew the screen with a refresh rate of 100 Hz (100 times a second), and they were silky smooth.

  >> LCD monitors don't have this problem, because the pixels don't need to be refreshed. Instead, they update individual pixels with new colors, and each of those updates takes a certain amount of time depending on what the change is. The response time refers to how long this change takes. LCDs started with a 60 Hz refresh rate but a response time of about 15 milliseconds (ms).

For more information on the subject: https://www.tomsguide.com/us/refresh-rates-vs-response-times...


There are also factors like input lag, separate from display refresh, that can be a problem for people playing action games. Even casual players can notice high input delay from time to time (once we're into the 100ms+ range), so it's a good idea to check the display's input lag if games are anywhere near a possibility. While I like my 34UM95P for professional purposes, it's really not that great for games that are much more than interactive movies. In contrast, nothing stops me from using an Acer X34 monitor for coding 95% of the time.


And it double-sucks, because here in Italy we have to pay "TV tax", even if you don't have the TV service hooked up. Modern LCD/OLED TVs are perfectly fine for most tasks, except maybe high-end FPS gaming. The colors are good, the refresh rates are decent.


That's an ownership tax on TVs, just as there's an ownership tax on cars. The fact that the proceeds of the TV ownership tax pay for the public (national) TV service is just a side factor. As very few people were actually paying it, the TV ownership tax has been embedded in the electricity bill. _Italians, good people_


Enough with this licence-fee story already!

It's a tax; pay it and stop complaining.


I was just saying that above: it's an ownership tax. A lot more people are paying it now that it's embedded in the electricity bill, so it's pointless to recommend paying it, as it's now basically enforced. I believe you should comment in English if you wish to be understood.


It's like how megapixels are a bad measure of camera quality. This is a pro monitor with accurate color reproduction, low fading on the edges, minimal dead pixels, etc.


I know someone who uses a 1080p TV as a monitor for his gaming PC. The picture quality is poor; it's too blurry to be tolerable for work. That's okay in this particular case, since he uses a different PC for work, but that one needs a proper monitor.

Granted that's 1080p, but it wouldn't surprise me if the same thing is true of 4k.


I think that the 24:10 aspect ratio alone will cause higher prices because of much lower economies of scale for the panel...



Related musing: why is it seemingly easier to make very high resolution small screens (phones) than large screens of the same resolution? Instinctively I would think smaller LEDs are harder to make, but it doesn't seem that's the case?


There is a greater failure rate the larger you make a panel; a great many are discarded from production lines. If the screen is smaller, it is less likely to incur an error.

Also, that cost is hidden in a phone; I'm not sure what the value of the screen component alone would be.


A large defective panel can be cut down into smaller panels most of which will have no defects. You can sell most of your faulty large panels as small panels, but there is no extra source of large panels.


Well, the point stands that statistically it's much easier to make a smaller panel than a larger panel, a larger panel effectively being n smaller ones.


The probability of manufacturing a bad pixel is some constant plus a function of the area of that pixel, and not the number of neighbours it has.
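
That intuition is basically how panel (and chip) yield is usually modelled. A minimal sketch, assuming a Poisson defect model; the defect density and screen areas below are made-up illustrative numbers:

  import math

  def zero_defect_yield(area_cm2, defects_per_cm2):
      """Probability that a panel of the given area has no fatal defect (Poisson)."""
      return math.exp(-defects_per_cm2 * area_cm2)

  d = 0.0005                    # hypothetical fatal defects per cm^2
  phone_area   = 80             # ~5.5" phone screen, cm^2
  monitor_area = 3200           # ~38" ultrawide, cm^2

  print(f"phone panel yield:   {zero_defect_yield(phone_area, d):.0%}")    # ~96%
  print(f"monitor panel yield: {zero_defect_yield(monitor_area, d):.0%}")  # ~20%

The exponential penalty on area is why a process that cheerfully turns out small screens throws away (or cuts down) a much larger fraction of big panels.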


I'm going to guess the square law? This monitor has about 75 times the screen area of a typical phone.


Right, and a phone could spend $50 to build that screen on a $700 device (not that they do). Just a raw 75 * $50 is near $4000, which would explain a high price.


The parent's point is that making a larger screen with _the same pixel count_ should be much cheaper, not one with 75x the resolution.


That is, if it is the pixel density that drives the cost, vs the size of the display itself.

Obviously it is harder to make a 1 inch x 1 inch display with 1 million pixels vs 1000. But maybe the difference from 600ppi to 200ppi is not enough to matter...


I could be wrong. ¯\_(ツ)_/¯


Also, incidentally, I'd happily pay a bit of premium for a TV / Monitor that has good image quality but no other features. Perhaps not 2x as much, but 20% might be doable.


Probably a lot less volume being sold on ultra wide monitor panels. Therefore higher price.


High-end monitors are niche products. 40" TVs aren't.

HDMI 2.0 is a pain compared to DP, other than that, there's almost no reason not to buy a TV if you just care about office use. I've been using a Samsung UHD TV for almost two years.


Monitors are designed to be viewed at close range, from multiple near angles. TVs are designed to be viewed several meters away, at roughly the same angle.

TVs play content at a fixed frame rate, so there's no reason for them to be any more precise. Monitors can go at very high frame rates, as the content can be very fast paced, especially on gaming monitors.

Freesync/gsync also integrates with the graphics card to reduce lag. This is the most expensive part of these monitors.


This is not a gaming monitor, it only does 60Hz. And it does not have freesync. The article mentions both facts.


Oh I skimmed it to the stats. It was showing a competitor monitor's stats. Still the same principle.


Honestly, it just looks like price gouging. You can get a 40 inch LG IPS 4K TV for half the price.

And this Dell isn't even 10-bit IPS, which is what monitors for professional graphics use; it's a standard 8-bit IPS panel, very likely made by LG.

It's curved though.


Several reasons:

- TV's are generally low framerate, as much as they'd like to claim 240FPS, it's mostly all 30FPS, with software interpolation to increase the frames.

- Bulk. TVs are sold in higher numbers, justifying the lower price.

- Distance from face. Your 60" TV can be two smaller panels "glued" together. Not noticeable from watching distance, but having a monitor so close to your face, you're more likely to notice the millisecond tearing.


"- TV's are generally low framerate, as much as they'd like to claim 240FPS, it's mostly all 30FPS, with software interpolation to increase the frames."

That makes absolutely no sense. If the panel is incapable of 240 refreshes per second, how does software interpolation "increase the frames"? You are confusing content and panel.


5min video: https://www.youtube.com/watch?v=Sxvu7qf6rDw

Mainly it is that for more than 60fps you need dual-link DVI, DisplayPort, or HDMI 2.0, and you won't find many TVs with any of those. So even though the screen can do more than 60fps, the input and processor can't take it.


That's not what was said in the comment. Obviously, the point of interpolation is to take a signal with few samples and increase them. What the comment said was that the hardware (panel) was not capable, yet the software somehow was able to do it. It makes no sense.


More like: the panel supports 240hz, but you can't get 240hz content into the TV (they don't exist and require high end interfaces), so you interpolate 30hz (which you do have) to 240 hz. Why go through all this trouble for fake 240hz? Probably because it sounds good as marketing.
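
For anyone curious what that interpolation actually does, here's a toy sketch: a plain linear blend between two consecutive frames (real TVs use motion-compensated interpolation, which is far more elaborate, but the idea of manufacturing in-between frames is the same):

  import numpy as np

  def interpolate(frame_a, frame_b, n_inserted):
      """Yield n_inserted synthetic frames blended between two real frames."""
      for i in range(1, n_inserted + 1):
          t = i / (n_inserted + 1)
          yield (1 - t) * frame_a + t * frame_b

  # Two fake 2x2 grayscale frames; inserting 3 frames per real pair turns a
  # 60 fps input into a 240 fps output cadence.
  a, b = np.zeros((2, 2)), np.ones((2, 2))
  for f in interpolate(a, b, 3):
      print(f[0, 0])            # 0.25, 0.5, 0.75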


> What the comment said was that the hardware (panel) was not capable

The word 'panel' is not in that part of the comment.

The hardware is incapable of getting high fps signals through most of the pipeline.


"Dell has managed to increase maximum brightness of its U3818DW to 350 nits (from 300 nits on competing monitors)"

Bah, why does everyone seem to love blinding themselves? Am I the only one who doesn't like a bright screen? (Ergonomically, we're supposed to have the monitor's backlight match the brightness of a white sheet of paper in the same lighting conditions.)


I imagine it's easier for you to dim the brightness to your preference than for others to somehow manage to tweak every bit of brightness out of their graphics card settings after the monitor's settings tap out

Use case: I'm in front of a window, if the sun's out and I've got the curtains open I might as well just turn the screen off


You would think monitor manufacturers would make the brightness control go all the way down to barely visible, but they don't. I have two monitors, an Asus and a Dell, both set to what they consider brightness level 0, and neither is quite as dim as I'd like.


On most display devices, "contrast" controls the white level and "brightness" controls the black level. (These controls of course interact.)

For your use case, you might want to be adjusting contrast down.


You can get them to go lower at that point by decreasing the contrast. But yes, you highlighted what my problem with brighter displays is.


I work in a pretty bright office -- usually I have the windows open and the overhead light on. My monitors (a Planar and a Philips) are both plenty bright enough for this use case, but neither of them will get as dim as I'd prefer once it's dark out and I have only the overhead light. If I would prefer to work by lamplight only (as I sometimes do), the contrast between the super-bright-on-the-dimmest-setting monitors and all the peripheral space around them makes my eyes hurt.

And, here's the thing: I like brightness, usually. As @ianai mentions, what's needed is a larger range, not the same range but brighter.


Consider using f.lux (or a copycat on Linux) to dim and match the color temperature of your lamplight in the evening.


I have used f.lux and redshift in the past. They can help reduce the eyestrain a little, and I guess they have a psychological effect around sleep, but they don't actually dim the light.


Once upon a time it was normal to adjust the location of the desk to the person working there. If lefthander, place table towards the left wall with light coming from window on the right. If righthander, place to the right of window. The typewriter was placed on a small table in front of the window.

When the computer showed up, it was first placed in the corner toward the window or in front of the window, but the latter location proved hard on the eyes, so we stopped doing that.

What has changed now that we have stopped with this practice? I thought we would have the office layout perfected by now.


Not being part of a hive mind, I unfortunately did not have this wisdom built in, and apparently neither did the people who designed my house, with the window in my home office facing the sun.

I could rotate the layout 90 degrees and only have sun glare half of the day but it's just easier to either close the curtain a bit or crank the brightness


If you sit next to a big window like I do, having the extra brightness is extremely helpful.


It's sometimes nice to have the option for high brightness, as the situation calls for it. But there's no need to run it at full blast at all times.


My problem with the brightness is that it pulls up the minimum brightness available. Most of my devices are too bright for me at their minimum settings at night time.


Can't you always lower the brightness in software, at the OS/driver level? Unless you're saying that this compresses the dynamic range of brightnesses too much, to the point where you can't distinguish shades that you need to be able to for your work.


Most cheaper monitors have always-on backlights that can't change brightness.


Get a gaming monitor with ULMB and you'll never have this complaint again. It takes a huge chunk out of the brightness.


I'm glad to know that. It doesn't matter if it's the nvidia or ati version?


Got some suggestions?


On a Mac, there is the Shades software, which I used for some time (I think it interfered with Flux, so I stopped). It's quite effective.

http://www.charcoaldesign.co.uk/shades


That's unusual.


Not at all.


Same here. I don't see the point in having a bright screen unless you're in sunlight or have a lot of glare.


I do the same thing - I've found that reducing the brightness as much as possible helps with eyestrain (assuming everything is still readable).


You can get it to go further by decreasing the contrast.


I'd rather stick with 2x 25" Dell UltraSharp U2515H 2560x1440 monitors[0].

2 of those are nearly 3x cheaper than the 1x 38" curved display and for tasks that require a lot of horizontal space (video editing, etc.) 5120 is WAY wider than 3840. Also a curved display is pretty questionable for professional image editing (it distorts pixels).

I run the 25" here for development and video editing. Probably the best development environment upgrade I've made in 5 years.

If anyone wants to read a deep dive on how to pick a solid monitor for development, I put together a detailed blog post[1] that covers why I picked that 25" Dell specifically.

[0]: http://amzn.to/2jF3WHp

[1]: https://nickjanetakis.com/blog/how-to-pick-a-good-monitor-fo...


> Also a curved display is pretty questionable for professional image editing (it distorts pixels).

Isn't the point of a curved display that, at the designed viewing distance, each point (at the same vertical position, at least) is equidistant from the viewer, eliminating the distortion that occurs when viewing a large flat screen?

It would seem to me that for professional image editing where you often need to zoom in to a higher magnification than an image would usually be viewed at by the target audience, a curved display would be very useful.

You probably also want to verify work on a display and conditions (viewing distance, etc.) that match the target environment.
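
A rough sketch of the geometry (the numbers are assumptions: a ~88 cm wide 38" ultrawide, a 2300 mm curve radius, and only the horizontal plane considered):

  import math

  def edge_dist_flat(view_cm, half_width_cm):
      """Viewer-to-edge distance for a flat screen, viewer centred."""
      return math.hypot(view_cm, half_width_cm)

  def edge_dist_curved(view_cm, half_arc_cm, radius_cm):
      """Viewer-to-edge distance for a screen curved toward the viewer."""
      theta   = half_arc_cm / radius_cm              # half the arc angle
      lateral = radius_cm * math.sin(theta)          # sideways offset of the edge
      lean_in = radius_cm * (1 - math.cos(theta))    # how far the edge leans toward you
      return math.hypot(lateral, view_cm - lean_in)

  half_w, r = 44, 230    # cm: half-width of a 37.5" 24:10 panel, assumed 2300R curvature
  for d in (60, 230):    # typical desk distance vs. the curve's design distance
      print(d, round(edge_dist_flat(d, half_w), 1), round(edge_dist_curved(d, half_w, r), 1))
  # Edges are truly equidistant with the centre only when you sit at the curve
  # radius; at 60 cm the curved edge is still closer than the flat one (~71 vs ~74 cm).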


I'd have to try this myself but I use straight guidelines all the time and I can't imagine having them look skewed if I turned my head from the precise targeted distance.


What are your thoughts on curved monitors for development/text based work?


>> 2 of those are nearly 3x cheaper than the 1x 38" curved display

For about the same price as the 38", you can get a pair of 27" Dell P2715Q 4K monitors, which will give you more than double the resolution of that 38" monitor.

I have a pair of P2715Q's mounted side by side in a "v" shape (not as good as curved, but does the trick) and it works fairly well. Of course, having no bezels in the middle where the monitors touch would be nice. The main drawback of the P2715Q's is that they don't have HDMI 2.0 inputs, which means your system needs two DP outs that can support 4k/60.


To me, it's the ability to put my current "focus window" exactly in the center that makes the curved screens so attractive, with the 32-38" size range always feeling just about right to me. My neck starts aching if I have two screens, making me end up using one as the focus screen and the other as support. Three 27"s in a row seems like a slightly excessive "screen wall" to me, but some like that...


I guess it really depends on your workflow. My own preference is to use my left monitor for source code and shell windows and my right monitor for browser testing. That does mean I tend to spend a significant amount of time looking to my left.


I have two of those on arms on my desk. Another great advantage of multiple monitors: I can just switch between devices (Thinkpad, MBP and my gaming rig).

The only thing I hate about the U2515Hs: they could be more robust. The plastic housing is not so great.


Yes, but if $ isn't as much of a concern, then what can beat Eizo? :)


And here I am with a Vizio 40" 4k[1] that I got for less than $500. It's not perfect, and the refresh rate is only 60Hz for 4k (which is the same as this Dell offering), but it's served me well for the last six months at a fraction of the cost and with more pixels.

1: http://www.rtings.com/tv/reviews/vizio/d-series-4k-2016


I've done the TV as a monitor thing, and the value is good, but the picture will be nowhere near as nice as a real monitor. Color reproduction is usually really inaccurate and even when you turn all the sharpening and post-processing off that they will let you, you still have an inferior picture.

That said, for gaming they are a solid choice and $500 is obviously a whole lot cheaper.


I would not recommend a TV for gaming unless one happens to find latency benchmarks for the display. TVs tend to add enormous amounts of delay in their buffering and pre-processing layers, latencies that really don't matter if you're displaying a movie.


rtings.com has latency benchmarks for most that I saw, including separate ratings per display mode (1080p, 4k, 1080p game mode, etc). It was invaluable when I was researching what to buy.


If you're just using it for something like coding and browsing the web is it good enough?


I found the site I linked, rtings.com, to be invaluable in getting the specifics of television lines and their capabilities and qualities. Different TVs will have varying levels of picture quality, and making sure you turn off (and can turn off) various settings that help a TV experience but hurt a computer monitor experience (motion blur, etc.) is important. The one I chose had a fairly low display lag for 4k, and supported a "game mode" (which many do). It takes a bit of research to figure out what you should be looking for, but there are some good resources you can find by googling (articles, subreddits) that let you know what are likely the most important attributes in general, and you have to supplement that with your own specific needs.


I don't have one (a coworker does), but for coding it shouldn't be too bad. Browsing usually means more scrolling, so you actually will get tearing, but if you don't care then it should be fine.

Your other reply talked entirely about games so I responded to your actual question.


I wasn't talking about games at all. I was talking about input lag, which for TVs as displays manifests in some negative effects, most notably mouse lag.

Many TVs have a mode to reduce this lag and turn off other processing effects that are undesirable when playing games, and that tends to map well to the settings that make for a good computer display experience.


Aah good to know. It wasn't clear from my read of your other comment


Agreed. I bought a Samsung (UN40JU6700) 4k60p curved monitor for ~$800 and it works reasonably well. There are some quirks using a TV as a monitor and I was a little nervous about the screen quality, but it's worked well so far. Also, it's a TV.

edit: I especially like that the DPI is the same as a 27" 2560x1440 monitor.


I think I have the same thing. I really like mine, but without the PC mode the picture would be terrible.

It's really an awesome hidden feature that turns this TV into a really good monitor for the money. Especially for coding and productivity apps.


Have you been extremely annoyed by software that detects a 4k resolution and scales everything to HUGE? I had trouble with this when I got my 40"/4k, and in the end did unspeakable things to the registry to force Windows to always scale 1:1.


No, actually. I don't think I've experienced that at all. The biggest problem I have is that it's actually a docking station setup, and when I pull the laptop out I have to scale Chrome's zoom back down from 130% and change PuTTY's font. I may have googled and turned that off in the beginning, but I definitely didn't need a registry hack, at least for Windows 10.


In my experience (routinely using both a 4K iMac and high resolution dell xps with win 10) the mac is still better at doing the "right thing" on its own, resolution-correction wise.


I'd actually go a bit bigger with a 4k to get the pixel density that I want but I like this.

Only downside is gaming... One lower density monitor performs better no matter the graphics card.


Only 60 Hz. Sigh - I hope the mainstream moves to 100+Hz monitors soon. I love the fluidity of my far-too-expensive 'gaming' IPS 144Hz monitor even though I don't play that much. Just the fact that web pages and mouse pointers don't lag is a noticeable improvement.


I'd rather we invest in PPI than refresh rate. Not that the two are mutually exclusive, but as I look at my use cases, most of the time the screen is static, so a higher refresh rate would be wasted, whereas higher-PPI screens improve readability and legibility (when they work on Windows).


You mean 4k on a desktop is not dense enough for you?


4k / 28" is 157 PPI, which is only indistinguishable at around 2 feet view distance on average. Better eyesight puts that distance further back.

8k at 28" is "retina" at just under a foot.

So depending on how far you sit from your screen PPI can matter more or less to you, but when we finally get 8k panels we should also finally be getting people in retina range most of the time.

8k / 120+hz is the holy grail for me.
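
Those figures follow from the usual one-arcminute-per-pixel rule of thumb for 20/20 vision; a quick sketch (28" 16:9 panels assumed):

  import math

  def ppi(w_px, h_px, diag_in):
      return math.hypot(w_px, h_px) / diag_in

  def retina_distance_in(ppi_value, arcmin=1.0):
      """Viewing distance at which one pixel subtends the given visual angle."""
      return (1 / ppi_value) / math.tan(math.radians(arcmin / 60))

  for name, w, h in [("4K", 3840, 2160), ("5K", 5120, 2880), ("8K", 7680, 4320)]:
      p = ppi(w, h, 28)
      print(f'{name}: {p:.0f} PPI, "retina" beyond ~{retina_distance_in(p) / 12:.1f} ft')
  # 4K/28": ~157 PPI, ~1.8 ft;  5K: ~210 PPI, ~1.4 ft;  8K: ~315 PPI, ~0.9 ft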


I can see Apple bringing 120hz to the iMac, so 5k at 120hz would be getting close.

I wish they sold screens still.


Apple is still using panels by Dell, as far as I know. So that 5K Dell display is the same panel that is used in their iMac range.


They are LG panels ultimately. There were some Samsung panels previously but since 2015 they have all been LG.


Luckily they're bringing back a display with the new Mac Pro next year.


Not for a 27” panel and above. 5120x2880/3200 for a 27” is good, but for larger panels (29”, 30”, 32”), an even higher logical resolution is needed.

That's for 2x scaling, of course, but I don't think fractional scaling creates a good end result, unless everything is vector, which, decidedly, very little is.


This isn't 4K or even 4x HD. There's not enough vertical resolution. Consumer 4K is 2160 vertical pixels. This is only 1600. If you write software for a living, vertical space is precious!


No, it's not. Because a 4K "retina" (aka HiDPI) display is only 1080P. 1080P on anything larger than a 24" screen is a non-starter for me (thanks but no thanks to huge icons on screen). I'd ideally like a 27-30" 8k "retina" monitor (aka 4K useable) with DCI-P3 color gamut. Refresh rate I'm ok with 60hz, but wouldn't object to higher obviously.


I agree. 60hz is fine but having tried +100hz CRT monitors back in 199X I am still waiting for them to be mainstream. Perhaps Apple will take the iPad's "ProMotion" 120hz to the mainstream crowd in a not-too-expensive display package.


Yeah, 60Hz looks like a slideshow after using 144Hz for a while.

It does get rapidly better after 60, though. I think I stop being able to tell the difference at around 80. So there is some hope that we can have 4k@better-than-60 in the near future.


For text, I am fine with 5fps as long as it is not blinking like a CRT did. My Kindle, I think, does 0fps when not changing pages and it is 100x better than a monitor. I have found the 300fps thing a bit silly for computers meant for text (which is my use case; not claiming it is the ONLY use case).


I bought the U3415W when it first came out. At first it was for games (coming from 3x U2412Ms), but I quickly realised how incredibly good the 21:9 3440x1440 resolution is for programming. No DPI scaling needs to be involved, so I'm looking at a Visual Studio experience where, even with NCrunch unit test runners and the Solution overview, I have plenty of room for two main code editing windows. Brilliant.

I ended up buying an extra Acer X34 for home (surrounded by 2x U2412Ms on an Ergotech stand) and brought the U3415W to work as a personal device.

The 38" could potentially be even better, however I'm rather happy with the 34" as is. It's a bit of a shame they didn't add Freesync to it.


I've had the Dell U3415W (3440x1440) since it was released two years ago, and it is the best monitor I've ever used. The resolution is great, and avoiding the DPI scaling needed for practical use of most 4K screens is key. As a bonus, it is absolutely perfect for movies in 21:9.

If you haven't seen one, a more familiar comparison might be that the U3415W is essentially the same as a 27" IPS panel, only wider. It has 1440 pixels of height at roughly the same physical height as the 27" IPS panels.

Many people argue that curved screens are unnecessary or a gimmick, which I would agree is true for TV screens with multiple off-center viewers. From my own comparison of several 32"+ monitors, I do think the curve is very useful in reducing neck/eye fatigue.

The new 38" does sound appealing, but at 2-3x the cost of the U3415W on sale I would probably still recommend the 34" to most people.


>> The resolution is great, and avoiding the DPI scaling needed for practical use of most 4K screens is key.

While ymmv, I have been using a pair of 27" 4K Dell P2715Q's as my working monitors running with no scaling (1:1). I have a standing setup (although lately I've been using a drafting stool because of plantar fasciitis, but that's another story) so I do have the monitors closer to my face than most people, but I find it's more than tolerable.


Out of curiosity what font sizes do you use for your editor? I have a 27" 4K monitor around 20" from my face and find it too small without scaling, 1.5-1.75x is around perfect for me (although support for that under Windows and Linux is hit or miss).


In Sublime for Windows, I am using the default font with a size of 10.5. I have always tended to use higher dpi screens on laptops (i.e. 1680 and 1080 @ 15" running at 1:1) so I have years of conditioning to small text on screens.

An added piece of information - I got presbyopia (in addition to my existing myopia and astigmatism) about 5 or so years ago, so I now use reading glasses, which also helps a little.

I had a special pair of single vision glasses prescribed to me from my optometrist where its optimal range of focus is between 21" and 27". I got those distances from measuring the closest and furthest points between my eyes and monitors. I see everything fine. Before I got the single vision glasses, I was using a flip up reading glass attachment that I wore over my normal progressives to see the screen, which a - looked goofy, b - was unwieldy, and c - not as good as the single vision glasses.


I had a colleague who did the same. Size 10 on a 27" 1440p monitor. Whenever I had to pair with him I always had to ask him to increase the font size.

I also have myopia and astigmatism, but if I use a smaller font eventually my eyes get tired and I get headaches. My optician suggested I wear slightly weaker than my prescription glasses for computer work which helped a bit, but still I find it more comfortable to stick to my size 14 at 1.75x scaling :-)


I ended up getting 2 Dell 43" monitors and am very happy with them for coding. I actually have 3, but the 3rd is too much (I never thought I'd see the day).

They aren't perfect, and the "old" version is buggy, so if you want these, get them straight from Dell. Resellers still have the old version that drops connection.

Also, the HDMI is only 30Hz while the DP and mDP are both 60.

Finally, they stretch a few pixels past the border, so you will need to shrink the picture slightly with the video card driver.

If you can deal with all that, they are incredible. Dell has a window manager that allows you to assign a grid and windows will snap to their respective cells once dropped into it.

http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c...


Thank you for posting this. I've recently been considering buying one (Dell 43 Ultra HD 4k Multi Client Monitor P4317Q, right?) However, this negative review is holding me back. Is there any merit to its claim?

This monitor has glorious resolution made totally useless because the backlight uses PWM Pulse Width Modulation to adjust the backlight brightness. This leads to eye fatigue and, in some cases, severe headaches. One would be hard pressed to find other monitors in Dell's lineup with this outdated technology since it has almost entirely been removed from all modern monitors. As a business monitor, it is expected to have "Comfort View" aka "low blue light" mode to reduce eye stress during a long day of use. Fix these problems and you will sell more of these than you could make.

http://www.dell.com/en-us/shop/dell-43-ultra-hd-4k-multi-cli...

I'd be hooking one or two up to my late 2013 MacBook Pro Retina 15, btw, which has a miniDP, so I believe I'll be able to get full resolution at 60Hz.


Yes, that is the model. I'm sure different people have different sensitivities to different things. I work in front of them all day. The only complaint I have is they are so tall (I'm using arms) that I tend to look up more than usual. Also, you can't daisy chain them. Each one needs a dedicated port on your card (I'm running a single GForce 1080)

The real estate is amazing. I have my main one set up with VS taking up about 2/3rds of the screen, then two large windows stacked next to it. The secondary monitor is divided into 6 portions, so I have Pandora, Email, Browser, Postman, Excel (for billing), Windows Exploder, SSMS, whatever. I'm typing this in one of the 6 portions with VS open on the main monitor. I rarely fill everything up. Also, I can make VS full screen and do split windows within it to compare code. I think the Dell Display Manager only works on Windows, though.

Having said that, it's 4K, but it's so large, you don't get a retina PPI. You get a regular PPI but many more pixels, which, as a coder, is much more useful to me.

Like I said, I'm very happy with them. I've finally realized maximum useful resolution, and I work on them all day and many nights.


I got my P4317Q in Aug 2016, it was that early faulty A00 revision with PWM (flicker) issue. I took a flickering video on the phone camera with lowered exposure and filed a warranty ticket. Local (EU) Dell service center picked up the display, confirmed an issue and one week later I got a new screen of A01 rev. from Germany with no issues. Superb monitor, just make sure to get A01 rev.


A happy recent HP Z34c convert here. I realized it had roughly the same number of pixels as my previous two monitor setup (older HP LP series in 24" and 22" portrait mode) and made the upgrade. This screen looks better, runs cooler, has a cleaner physical footprint (except for the useless speakers on the side bezel--I wish I could cut them off), and requires half the cables. From HP's refurbished outlet at half price with PC purchase the price is tough to beat.

I did have to scale this one up by a few percent to save my eyes, but overall I found as you did that the orientation works really well for code window plus emacs/shell or browser. It's pretty impressive to flip an HD video to full screen once in a while.


Another U3415W owner here; it's the best monitor purchase I've made in years. I previously had a 27" and two 24"s and replaced the 24s with the U3415W. I do programming and gaming on it and it's excellent for both.


But isn't this a software, not a panel resolution problem? Fonts and icons could be scaled down in software just fine and could keep excellent readability.


In theory yes, but in practice I find that Windows 10 just renders fonts nicer when not in a scaling mode. Your tastes may vary, of course.

The other thing is that for games, 3440 x 1440 is much easier on the GPU than 4k+ resolutions.


Microsoft made a mess of the high dpi support in Windows, even with WPF which is supposed to be resolution independent I still have problems.


I suspect that in a few years a hololens / google glass style device will become cheaper and work as well as a big bank of monitors, and at that point it is going to rapidly replace physical monitors - and then economy of scale and iteration of the product will do the usual to the price/performance ratio of head-mounted devices, then screens go the way of CRT monitors when flat screens came along.

I'm not saying that a head-mounted device and virtual screens will necessarily be better than a bank of monitors - in fact it's time to assess the drawbacks - but once it's cheaper and seems "just as good", business will want to switch over, for better or worse.


I wonder if an AR device that only renders stationary floating text could be produced with much lower GPU (and thus power) requirements than the current crop of headsets.

What I want is a computer + headset that can be fully powered by moderate exercise. From what I've read, manual labor averages about 75 watts of mechanical power throughout the work day, so with conversion losses, maybe 20 watts produced by pedaling or rowing action. Then I could "sit at the desk" all day without wasting away, and do it from the middle of a forest or the top of a mountain if I wanted to.


Perhaps it can also solve myopia that is common among nerds. At least if the focal point of the virtual monitor is chosen at infinity.


Will I still need to be sitting at a desk with a keyboard and mouse?


"Need" is a strong word, but you'll probably want to if you're trying to be productive. VR/AR won't replace the tactile feedback of a good keyboard, etc.

Then again, as someone who can't even use current VR/AR offerings (high myopia, don't use contact lenses) and is only following all of this as a fairly disinterested spectator, it looks to me like the initial craze is already blowing over and people are taking a better look at the disadvantages and limitations of the current products. I don't see VR/AR taking over from monitors or much of anything else, though I'm sure AR will find productivity applications in specific fields soon enough.


Perhaps with gestures and voice input integrated into good software applications, the keyboard will be eliminated altogether.


There isn't a voice input in existence that I'd be interested in using while writing code or doing photo/video editing.

Even dictating code to another human who is as fluent as you are in the language of choice is a non-starter, so I don't see any machine based way of doing it catching up soon.


I can't speak as fast as I type, which makes any kind of dictation solution a non-starter.


The only solutions I've seen that even "come close" in the speed / accuracy comparison are unfair comparisons.

An example was posted by someone in another comment of a guy doing Python coding in Emacs using Dragon Dictation. In that example, the guy used over 2,000 custom "macros" to get to something close to his pre-RSI speeds. This sounds fantastic, but the speed of a typist who has learned to leverage 2,000 custom macros would be an order of magnitude faster (imho) than someone dictating using similar macros.


This is the principle behind https://en.wikipedia.org/wiki/Stenotype machines.


Exactly!

There are also specialized keyboards for computers that have existed over the years that have had typists reach 300-400wpm, but they never caught on.


Neither gestures nor voice input will ever eliminate keyboards for the simple reason that keyboards are vastly more efficient. No technological innovation can ever change that because the limitation is on the human side of things.


I for one much prefer a physical keyboard with a good tactile sense, preferably cherry brown switches. So I would keep that.

Picture the same setup, just with the monitors removed. Maybe a white plastic sheet as backdrop for virtual screens.


You haven’t tried any futuristic solution, so how would you know that you’re not going to like it?


Futuristic solution to keyboards? I've tried lots of keyboards. Probably going to try lots more, there are frequent times when you have to use the only keyboard that's there.


yeah, I’m using the screen keyboard on my iPad right now. Actually, I just dictated most of this particular response.

Once again, you haven’t used the futuristic solution so saying that you must use a keyboard seems a little premature.

if you’re walking around with a pair of Google glasses, you don’t have a keyboard there.


Saying and implying there's going to be a popular futuristic solution is premature as well.

The keyboard has been around a LONG time. We're past the halfway point to the age of the Jetsons and we still have them.


Horses were around for thousands of years. Henry Ford said people would have just wanted a faster horse.

Most people lack imagination.


Axes have been around a lot longer and they're still being used even now with chainsaws.

Some people seem to think everything always needs to change.


And people still ride horses. So, what you’re saying is that we don’t need chainsaws because axes are just as effective?

The pesky unreasonable people trying to always change things:

http://www.goodreads.com/quotes/536961-the-reasonable-man-ad...


It isn't unreasonable at all to think that keyboards will continue to be one of, if not the most, efficient ways to use a computer.


> so saying that you must use a keyboard seems a little premature

What part of "I for one much prefer a physical keyboard" implies "you must use a keyboard"? I expressed a personal preference, that's all. I don't expect my preference to change, but sure, anything is technically possible, though unlikely.


The part where you say you prefer a keyboard to some futuristic solution that doesn’t yet exist. You can’t have an opinion about something that doesn’t yet exist.

Before someone dreamed up the automobile, you would be telling us how you prefer the horse, perhaps just make it faster.


> You can’t have an opinion about something that doesn’t yet exist.

Well, exactly. And if it seems like I expressed that kind of opinion, I apologize, I did not intend to. I'll certainly try the new things.

> you would be telling us how you prefer the horse, perhaps just make it faster.

It seems like you know me well, certainly better than I know myself.


No, but you'll want to be at a desk or a chair with a keyboard mount.

Typing is always going to be more efficient for precision data input until we get to the third or fourth generation of direct brain interface... and nobody wants to be a beta tester for that.


Voice, eye tracking, and gesturing don’t offer possibilities?

https://m.youtube.com/watch?v=0QNiZfSsPc0

http://ergoemacs.org/emacs/using_voice_to_code.html

And yes, if you’re clicking away in a room of programmers, it won’t work for you, but with a HoloLens I’ll find a quiet location.


Oh, we've got just the thing for that: http://i.imgur.com/LZr6ks4.jpg


PSA - don't get tempted by UltraSharp reviews and recommendations, like I did. Just pick an Eizo instead.

UltraSharps are widely recommended for coding with glorious reviews and endorsements. Got one and no matter how I adjusted it, it was still too... eye-piercing, if you will, for longer coding sessions. Got mild headaches, tired eyes and a general feeling of discomfort when working on it even for shorter periods. Then switched to a FlexScan and it's a completely different ballgame - softer, more gentle feel, incomparably more comfortable. The best monitor I've had the pleasure of staring at in my 20 years of programming.

However, the interesting part here is that both monitors use the same panel (!), so the panel itself is only part of the recipe, which is something that many reviews tend to either downplay or not mention at all.


Definitely. I made the mistake of buying an UltraSharp U2713HM which actually buzzed with lots of text on the screen (apparently it's a feature, just google the model). I tried swapping my new display and they sent me a refurbished one with dead pixels that buzzed also. I'm never buying Dell again.

This time around I did the right thing and went the more expensive route and got an Eizo and it's just absolutely perfect.


I had the same experience with the same model. Never again, Dell.


I suspect it is the AG (anti-glare) coating. Dell keeps using a cheap version (3H)...


Does Eizo have anything in the 38"-40" range? I don't see it on their site.


I'd much rather see one with a higher resolution. The DPI is quite low by today's standards.


Sadly, outside of macOS, support for high-PPI displays is dodgy at best. On Linux, it is becoming better, but still a lot more to improve. On Windows, it's a real shitshow. Windows 10, for the most part, is OK (still a lot of inconsistencies and lack of polish when scale > 100%), but application support is just terrible, be it blur, tiny scaling, inconsistent elements (some small, some large) or plain crashing. Really, any vendor that makes such a display is taking a risk that people will complain, and a sizeable portion of the customers who purchase such a display will not understand the technical difficulties.


> On Linux, it is becoming better, but still a lot more to improve.

Even on Fedora 26, I have to manually patch and recompile Mutter to get HiDPI support on Wayland. Mutter is hardcoded [1] to use 2x scaling on displays that are 192 PPI or more. My Dell p2415q is 188.2 PPI, so you get 1x 'scaling' by default and everything is tiny. This problem has been known for a while [2].
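
To make the cutoff concrete, here is a minimal sketch of the integer-only heuristic described above (my own illustration of the logic, not Mutter's actual code):

    # Integer-only scaling with a hard 192 PPI threshold, as described above.
    HIDPI_THRESHOLD_PPI = 192

    def pick_scale(ppi):
        """Return 2x at or above the threshold, otherwise 1x."""
        return 2 if ppi >= HIDPI_THRESHOLD_PPI else 1

    print(pick_scale(188.2))  # a 24" 4K panel just misses the cutoff -> 1, everything is tiny
    print(pick_scale(220.0))  # e.g. a 15" 4K laptop panel -> 2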

And even though GNOME applications then generally work well, you have to start Chromium with a special flag to let it run with scaling. There are a lot of glitches throughout the system, e.g. the mouse cursor has the right size, but when you go to a window corner to resize a window, it becomes tiny. A lot of icons are blurry, because they have a low resolution and are scaled up.

macOS is basically plug & play. All applications are in full HiDPI glory. The only exceptions are websites that use low-resolution images. There is one catch: the (terribly expensive) Apple USB-C Digital AV Multiport Adapter only supports 4K at 30Hz. This refresh rate is very tiring and annoying. However, a USB-C -> DisplayPort Alt Mode connector works great @ 60Hz.

[1] IIRC there is work in the master branch to solve this.

[2] https://bugzilla.redhat.com/show_bug.cgi?id=1258155
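
My guess (an assumption on my part, not something from the adapter's spec sheet) is that the 30Hz limit is just bandwidth: if the adapter's HDMI side is restricted to HDMI 1.4-class throughput, rough pixel-rate arithmetic already rules out 4K@60, while DisplayPort 1.2 Alt Mode has headroom:

    # Blanking intervals ignored; ~8.16 Gbit/s is HDMI 1.4's effective video
    # data rate, ~17.28 Gbit/s is 4-lane DisplayPort 1.2 (HBR2).
    def video_gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    print(video_gbps(3840, 2160, 30))  # ~6.0 Gbit/s  -> fits under HDMI 1.4's ~8.16
    print(video_gbps(3840, 2160, 60))  # ~11.9 Gbit/s -> too much for HDMI 1.4, fine over DP 1.2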


Chrome/Chromium is notoriously bad at adopting any technology. On Windows, it took them years to support scaling other than 1x, color Emoji was added very recently, etc. I think on Linux it is even worse. Have they finally fixed ligature support in Chromium?

With Firefox rapidly improving in performance and memory usage, there are fewer and fewer reasons to use Chrome.


I set Xft.dpi: 192 in ~/.Xresources (gnome-settings-daemon and others will perform the equivalent setting).

Chromium respects this (for many major versions by now) and shouldn’t need a separate flag. Are you certain the flag is necessary in your setup? Can you check what Xft.dpi is on your system?

    xrdb -q | grep '^Xft.dpi'


On Linux, it just depends on your desktop environment.

If you use KDE or any other Qt-based one, HiDPI works perfectly at any scale, even in heterogeneous environments – a landscape 4K 144ppi and a portrait 70ppi monitor right next to each other will work fine, and when moving applications they automatically rescale. This works on both X11 and Wayland. The scale is set via the environment variable

    QT_SCREEN_SCALE_FACTORS="DisplayPort-2=1.61;HDMI-A-0=1.08;"

It’s GNOME and GTK that have almost no HiDPI support at all: they only support 96 and 192 dpi (integer scaling only), and just added a feature that scales windows down with blur for every other resolution. GTK2 still supported proper HiDPI, but with GTK3 they decided to abandon that and instead build apps that rely on pixels always being the same size. Considering how young GTK3 is, this decision may sound a bit short-sighted, but the GNOME devs always point to macOS as an example where this worked, while ignoring Android and Windows 10, which use fractional scaling, and ignoring that with GNOME's solution a game such as Minecraft on a 4K 144 dpi screen is rendered at 6K and then half the resolution gets thrown away during downscaling, because there is no option to avoid this.

The result is obviously completely broken.


I run MATE on two 4K monitors and it looks great; you just raise the font size in the control panel until it's readable.


That's not exactly true. On Linux, it's a real shit show because only KDE (Qt) supports proper fractional scaling; GTK apps don't. GTK supports 100%, 200%, and so on. You can play with xrandr, but that'll cause other serious issues (not even speaking about multi-monitor setups).

On Win10, however, all of the programs I'm using have flawless hidpi support - browsers, IntelliJ, console windows (cmder), Spotify, etc. W10 supports per-display DPI as well.

I don't know which applications are "crashing", that sounds like FUD. I know about two notable exceptions that don't have good hidpi support: Adobe tools and Hyper-V.


I'm not sure what "not exactly true" means; it's my personal experience of trying to run Windows 10 on my MBP. Many of the video tools I use regularly (for encoding, subtitle creation and editing, authoring, muxing, etc.) are either broken or look bad. I understand that may not be mainstream use, but it certainly isn't esoteric either. You say browsers, but last I remember, Chrome didn't really support it. Now I am on Firefox, so it's proper, but that's not what most people use. Adobe software, as you said, is notoriously bad at supporting scaling. That's not esoteric either.


Chrome's hidpi support has been acceptable on Windows since about 2015. It's still not absolutely perfect (there are issues with 1px black bars when resizing windows), but it's completely usable.


I've been using Windows 10 with a QHD screen at 200% scaling, and I can't complain.

Aside from a few apps that fail to scale again when I disconnect a screen from a laptop, all the apps work as expected.


I think when two monitors are connected, Windows still has issues with correctly transitioning software from one scale to another. But that's at the system level and can be solved. The main issue is legacy software that would require a complete rewrite to support scaling. A big issue in those cases is that the community is completely unwilling to support such displays, with silly claims such as "you can't see the difference".


Rescaling between two monitors that extend each other, even with different scales, works well; it's when one gets disconnected and the window is forced to move to another screen that issues arise.

As for legacy software, most of it works fine. Programs that never cared about scaling are handled by the system easily enough. The problems are the rare cases where legacy software claims to support scaling (to work around issues from not using the Windows APIs) but doesn't actually do anything to support it, such as GIMP.

Those are rare though, and are usually programs with their own rendering stack that don't use the Windows APIs.


I meant when moving an application window from one screen to another. I saw some issues that didn’t exist when I worked only on the MBP screen; once I connected my 27” external display, they suddenly became noticeable when dragging windows.


This is simply wrong. We switched from Macs to Windows 10 because of the better monitor options available for professional color grading work. And true "Cinema" and consumer 4K resolutions are supported and supported well ONLY on Windows 10.


I respectfully but completely disagree, as do most professional photo/video folks I know. Color grading support on Macs is superb. There are also no commonly used resolutions supported on Windows that I know of that aren't available on Mac, so I'm not sure what you mean by the last part.


This is a resolution you can use 1:1 without DPI scaling. The key here is massive real estate.


That's how I use a 4K @ 32" (16:9). It's just this side of being comfortably usable. If the density were any greater, I would have to use some HiDPI mode and sacrifice real estate.


That's awesome. Would love to try one of these one day; alas, I haven't seen them in the flesh yet. I sit pretty close to my screens, so I've been wondering how a large 4K at 100% would work for me.


In some ways, I imagine the Ultrawides (especially the curved ones) are probably more usable, because quite often you could get by nicely with a purely horizontal array of windows, and not have to be thinking about how to most effectively make use of vertical partitioning too.

When spending the money, I didn't feel like hitching myself to a "weird" aspect ratio, but maybe Ultrawides are here to stay.
