Historians a millennium from now will probably look back at us as heathens for deviating from the perfect rectangle when constructing our screens.
So perhaps sit it out and wait for 5k? ;)
You're right about the length of the longer axis, though... I've rarely found it all that useful to have one view fill the entire screen. So I generally either have two 1440x1280 windows/panels/etc. on a screen, or one 1440x~1700 and one 1440x~860. That works pretty well. Many window layout tools will let you quickly set up this sort of arrangement.
I've been turning down monitor "upgrades" at work for years because the "upgraded" monitors are 16:9 and would be replacing working 16:10 monitors.
IMO bitching about aspect ratios seems petty when new monitors offer so much more space than older ones.
Unfortunately, the only upgrades I've been offered are 2x 24" 1920x1200 -> 2x 24" 1920x1080.
The thing I've learned is that what I really want is a specific real-world pixels per inch, or a specific real-world font size in inches. I sit comfortably back, and the Dell 30" lets me easily set my text at about 1/8" or 1/4" tall, which is super easy to read relaxed in my chair. If I went for an increased-resolution display, the font would just get smaller until I got to 4x density... and I'm not really gaining anything until then either, which is when AA smoothing becomes a win.
But also, since I jack my font up even more than usual for my distance, I'm already getting roughly 4x AA; a 4x resolution bump would put me at 16x AA, which I don't think is needed. I'd rather blanket my field of view with lots of large text than have one very nice high-density display in the center that makes me lean forward and strain my eyes.
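For anyone who wants to run the same numbers: here's a rough sketch of the arithmetic (assuming the Dell 30" is a 2560x1600 panel, and treating a glyph as a simple pixel count):

  import math

  def ppi(width_px, height_px, diagonal_in):
      """Pixels per inch from native resolution and diagonal size."""
      return math.hypot(width_px, height_px) / diagonal_in

  # Assumed: a 30" 2560x1600 panel -- roughly 100 PPI.
  dell30 = ppi(2560, 1600, 30)
  print(f"30in 2560x1600: {dell30:.1f} PPI")

  # A glyph ~13 px tall is about 1/8" on this panel; at double the
  # pixel density the same pixel count shrinks to about 1/16".
  print(f"13 px glyph: {13 / dell30:.3f} in")
  print(f"13 px glyph at 2x density: {13 / (2 * dell30):.3f} in")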
If people really care about 16:10, it's available with reasonably good specs and they can buy it. If they don't, it will probably die permanently, but we can't exactly blame the market.
What would it take for 16:10 to come back? Or are desktop monitors as we know them today forever going to be stuck with 16:9?
A couple caveats -- Because I don't game or do graphics work, I don't know or care about color reproduction. I made sure to get a TV, graphics card, and cable that supported 4k@60hz with 4:4:4 chroma so the text is perfectly sharp and the refresh is not obtrusive. It's not hard to find a Samsung TV around $350 that supports that, but as far as I could tell, I had to go up to a GTX1050 graphics card to get that. I sit between 24 and 28" away from the monitor so the text is readable to me at full resolution. At that distance, I do have to turn my head to comfortably read text in the corners of the TV, especially the upper corners. In practice, that means I keep monitoring-type applications such as CPU charts, Datadog graphs, etc., in one of those corners for quick reference. While I still have two 1920x1080 monitors on either side of the TV, it's quite nice to be able to open up on my main monitor a huge IDE window and, when necessary, put 3-4 normal-sized windows next to each other for comparison work.
When I replace it, the replacement will be the same size and resolution, but I'll probably go for OLED and HDR-10 or better.
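For reference, a back-of-the-envelope sketch of why the jump to a newer card was needed for 4K60 with full 4:4:4 chroma. These are raw pixel rates only, ignoring blanking intervals and encoding overhead, so real link requirements are somewhat higher:

  def raw_gbps(width, height, fps, bits_per_pixel):
      """Uncompressed pixel data rate, ignoring blanking/encoding."""
      return width * height * fps * bits_per_pixel / 1e9

  # 4K @ 60 Hz, 8 bits per channel:
  full = raw_gbps(3840, 2160, 60, 24)  # 4:4:4 -> ~11.9 Gbps raw
  sub = raw_gbps(3840, 2160, 60, 12)   # 4:2:0 -> ~6.0 Gbps raw

  # HDMI 1.4 tops out around 10.2 Gbps, HDMI 2.0 around 18 Gbps --
  # which is why older ports can only do 4K60 with subsampled
  # (blurry-text) chroma, and a newer card/port is needed for 4:4:4.
  print(f"4:4:4: {full:.1f} Gbps, 4:2:0: {sub:.1f} Gbps")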
Was very excited for it, as it featured a very similar DPI to a 27" 1440p Korean import. Coming from 3+ 1080p monitors originally, the lack of bezels was incredible.
It's not built for gaming at all, though; extreme screen tearing and poor response times have me now looking to replace the 27" 1440p Korean import with something >144Hz with G-Sync, before replacing the Philips BDM40.
My old setup consisted of two monitors, one 23" and one 21". One day I realized that my 23" monitor could not reproduce colors properly. I needed to retouch a picture in Photoshop; I sent the retouched version to my phone, and because I couldn't see the full color depth on the 23", the image looked ugly on the better screen. That's when I learned not to buy the cheapest monitor based only on its size and resolution. So I sold that monitor and bought a 32" 1080p monitor at a bargain price.
Of course it would be better if I could buy a 1440p monitor at this size (or even 4K, OMG :)). But considering my budget, this was all I could purchase.
I had serious doubts about the pixel density, but there was nothing to be afraid of. In usual tasks (e.g. browsing, writing code in emacs, the terminal) nothing disturbs my eye; it's beautiful. However, if I open a 1080p YouTube video fullscreen, it looks as if it were a high-quality SD video, because of the pixel density. I am standing close to the monitor, and although I cannot see individual pixels, I can notice the difference.
I bought my monitor for 900 Turkish lira, equivalent to 254 dollars, and I think it was a great investment.
For productivity I'd go for the 40" 2160p over a lower resolution higher refresh panel.
TVs, on the other hand, are designed to show the most oversaturated, "vibrant" colors on the demo loop on the show-room wall. And mechanically, they're designed to hang on the wall and never be touched.
Plus it's curved.
That’s completely useless for graphic design. sRGB is defined as the lowest common denominator of CRTs on the market in the early 90s.
Actual monitors for graphic design purposes, every TV released in 2017, and most new gaming monitors instead use the AdobeRGB or DCI-P3 color spaces, or even Rec.2020, which is also what is used in cinemas.
For comparison, here is 100% sRGB (what you claimed is "for graphic design") vs. 100% Rec.2020 (what most new TVs and gaming monitors support): https://dotcolordotcom.files.wordpress.com/2012/12/rec2020-v...
It's called an iMac.
Not too long ago, the curved models used to be cheaper, with big discounts, at least for TVs, because nobody wanted a curved TV.
* removed mention of FreeSync as this display lacks it
By default. A couple of clicks in Preferences will get you higher or very-high-indeed effective real estate.
High DPI is important for me, though.
A recent positive:
A negative from 2014:
and some discussion:
The main headache seems to be blurry fonts which probably mean a lack of 4:4:4 chroma support or the inability to enable it.
TVs are said to be configured to display the best-looking colors, while monitors try to stay true to the real color.
TVs also have immensely higher response times. A 50 Hz/60 Hz refresh usually suffices for a TV, but on monitors we speak of response times in milliseconds, and that makes a lot of difference if you intend to game on your computer/monitor. An e-sports professional Redditor claims he improved his playing performance multiple-fold after switching from a TV to a monitor. (https://www.reddit.com/r/FIFA/comments/5whb9v/did_the_change...)
Other than these differences and the occasional overscan/underscan problems on older televisions, I personally see no reason to prefer a monitor over a TV if there is a huge price difference. If the price difference is minor, I'd opt for a monitor.
>>Over a decade has passed since the LCD monitor unceremoniously ousted the boxy CRT monitor into obsolescence, but with that ousting came a small problem: CRT monitors redrew every frame from scratch, and this was baked into the fundamentals of how PCs sent information to the screen. Monitors redrew the screen with a refresh rate of 100 Hz (100 times a second), and they were silky smooth.
>>LCD monitors don’t have this problem, because the pixels don’t need to be refreshed. Instead, they update individual pixels with new colors, and each of those updates takes a certain amount of time depending on what the change is. The response time refers to how long this change takes. LCDs started with a 60 Hz refresh rate but a response time of about 15 milliseconds (ms).
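To put the quoted numbers side by side, a quick sketch of the arithmetic:

  def frame_interval_ms(refresh_hz):
      """Time between frames at a given refresh rate."""
      return 1000.0 / refresh_hz

  for hz in (60, 100, 144):
      print(f"{hz} Hz -> a new frame every {frame_interval_ms(hz):.1f} ms")

  # With a ~15 ms response time, a 60 Hz LCD's pixels spend nearly the
  # whole 16.7 ms frame interval still transitioning -- the smearing
  # early LCDs were known for next to a 100 Hz CRT's clean redraws.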
It's a tax; pay it and stop complaining.
Granted that's 1080p, but it wouldn't surprise me if the same thing is true of 4k.
Also, that cost is hidden in a phone; I'm not sure what the value of the screen components alone would be.
Obviously it is harder to make a 1 inch x 1 inch display with 1 million pixels than with 1,000. But maybe the difference from 600 PPI to 200 PPI is not enough to matter...
HDMI 2.0 is a pain compared to DP, other than that, there's almost no reason not to buy a TV if you just care about office use. I've been using a Samsung UHD TV for almost two years.
TVs play content that's all at the same frame rate, so there's no reason for them to be any more precise. Monitors can run at very high frame rates, as the content can be very fast-paced - especially on gaming monitors.
FreeSync/G-Sync also integrate with the graphics card to reduce lag. This is the most expensive part of these monitors.
And this Dell isn't even 10-bit IPS, which is what monitors for professional graphics use; it's standard 8-bit IPS, very likely made by LG.
It's curved though.
- TVs are generally low framerate; as much as they'd like to claim 240 FPS, it's mostly all 30 FPS, with software interpolation to increase the frames.
- Bulk. TVs are sold in higher numbers, justifying the lower price.
- Distance from face. Your 60" TV can be two smaller panels "glued" together. That's not noticeable from watching distance, but with a monitor so close to your face, you're more likely to notice millisecond tearing.
That makes absolutely no sense. If the panel is incapable of 240 refreshes per second, how does software interpolation "increase the frames"? You are confusing content and panel.
Mainly it's that for more than 60 fps you need dual-link DVI, DisplayPort, or HDMI 2.0, and you won't find many TVs with any of those. So even though the screen can do more than 60 fps, the input and processor can't take it.
The word 'panel' is not in that part of the comment.
The hardware is incapable of getting high fps signals through most of the pipeline.
Bah, why does everyone seem to love blinding themselves? Am I the only one who doesn't like a bright screen? (Ergonomically, we're supposed to have the monitor's backlight match the brightness of a white sheet of paper in the same lighting conditions.)
Use case: I'm in front of a window; if the sun's out and I've got the curtains open, I might as well just turn the screen off.
For your use case, you might want to be adjusting contrast down.
And, here's the thing: I like brightness, usually. As @ianai mentions, what's needed is a larger range, not the same range but brighter.
When the computer showed up, it was first placed in the corner toward the window, or in front of the window, but the latter location proved hard on the eyes, so we stopped doing that.
What has changed now that we've stopped following this practice? I thought we would have the office layout perfected by now.
I could rotate the layout 90 degrees and only have sun glare half the day, but it's just easier to either close the curtain a bit or crank the brightness.
Two of those are nearly 3x cheaper than one 38" curved display, and for tasks that require a lot of horizontal space (video editing, etc.), 5120 pixels is WAY wider than 3840. Also, a curved display is pretty questionable for professional image editing (it distorts pixels).
I run the 25" here for development and video editing. Probably the best development environment upgrade I've made in 5 years.
If anyone wants to read a deep dive on how to pick a solid monitor for development, I put together a detailed blog post that covers why I picked that 25" Dell specifically.
Isn't the point of a curved display that, at the designed viewing distance, each point (at the same vertical position, at least) is equidistant from the viewer, eliminating the distortion that occurs when viewing a large flat screen?
It would seem to me that for professional image editing where you often need to zoom in to a higher magnification than an image would usually be viewed at by the target audience, a curved display would be very useful.
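The geometry is easy to check; a sketch with assumed numbers (a flat ultrawide about 35" wide, eyes 24" from the center):

  import math

  def edge_distance(view_dist_in, screen_width_in):
      """Eye-to-edge distance for a flat screen viewed center-on."""
      return math.hypot(view_dist_in, screen_width_in / 2)

  d, w = 24.0, 35.0  # assumed viewing distance and screen width, inches
  edge = edge_distance(d, w)
  print(f"center: {d:.1f} in, edge: {edge:.1f} in "
        f"({100 * (edge / d - 1):.0f}% farther)")
  # A curve whose radius matches the viewing distance keeps every
  # column of pixels at ~24", which is the equidistance point above.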
You probably also want to verify work on a display and conditions (viewing distance, etc.) that match the target environment.
For about the same price as the 38", you can get a pair of 27" Dell P2715Q 4K monitors, which will give you more than double the resolution of that 38" monitor.
I have a pair of P2715Q's mounted side by side in a "v" shape (not as good as curved, but does the trick) and it works fairly well. Of course, having no bezels in the middle where the monitors touch would be nice. The main drawback of the P2715Q's is that they don't have HDMI 2.0 inputs, which means your system needs two DP outs that can support 4k/60.
The only thing I hate about the U2515Hs: they could be more robust. The plastic hull is not so great.
That said, for gaming they are a solid choice and $500 is obviously a whole lot cheaper.
Your other reply talked entirely about games so I responded to your actual question.
Many TVs have a mode to reduce this lag and turn off other processing effects that are undesirable when playing games, and that tends to map well to the settings that make for a good computer display experience.
edit: I especially like that the DPI is the same as a 27" 2560x1440 monitor.
It's really an awesome hidden feature that turns this TV into a really good monitor for the money. Especially for coding and productivity apps.
Only downside is gaming... One lower density monitor performs better no matter the graphics card.
8k at 28" is "retina" at just under a foot.
So depending on how far you sit from your screen, PPI can matter more or less to you, but when we finally get 8K panels, we should also finally be getting people into retina range most of the time.
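The arithmetic behind the "just under a foot" figure, as a sketch using the common definition of "retina" as about one pixel per arcminute of vision:

  import math

  def retina_distance_in(width_px, height_px, diagonal_in):
      """Distance at which one pixel subtends one arcminute."""
      ppi = math.hypot(width_px, height_px) / diagonal_in
      pixel_pitch = 1.0 / ppi                  # inches per pixel
      one_arcmin = math.radians(1.0 / 60.0)
      return pixel_pitch / math.tan(one_arcmin)

  print(f'8K at 28": retina beyond {retina_distance_in(7680, 4320, 28):.1f} in')
  print(f'4K at 28": retina beyond {retina_distance_in(3840, 2160, 28):.1f} in')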
8k / 120+hz is the holy grail for me.
I wish they still sold screens.
That's for 2x scaling, of course, but I don't think fractional scaling creates a good end result, unless everything is vector, which, decidedly, very little is.
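A quick sketch of the difference being described, on an assumed 4K panel:

  def logical_resolution(width_px, height_px, scale):
      """UI-space resolution after HiDPI scaling."""
      return width_px / scale, height_px / scale

  for scale in (1.0, 1.5, 2.0):
      w, h = logical_resolution(3840, 2160, scale)
      print(f"{scale}x -> {w:.0f} x {h:.0f} logical")

  # At 2x, every logical pixel maps to a clean 2x2 block of physical
  # pixels; at 1.5x, raster assets land on fractional pixel boundaries,
  # which is the blurriness above unless everything is drawn as vectors.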
It does get rapidly better after 60, though. I think I can stop telling a difference at around 80. So there is some hope that we can have 4k@better-than-60 in the near future.
I ended up buying an extra Acer X34 for home (surrounded by 2x U2412Ms on an Ergotech stand) and brought the U3415W to work as a personal device.
The 38" could potentially be even better, however I'm rather happy with the 34" as is. It's a bit of a shame they didn't add Freesync to it.
If you haven't seen one, a more familiar comparison might be that the U3415W is essentially the same as a 27" IPS panel, only wider. It has 1440 pixels of height at roughly the same physical height as the 27" IPS panels.
Many people argue that curved screens are unnecessary or a gimmick, which I would agree is true for TV screens with multiple off-center viewers. From my own comparison of several 32"+ monitors, I do think the curve is very useful in reducing neck/eye fatigue.
The new 38" does sound appealing, but at 2-3x the cost of the U3415W on sale I would probably still recommend the 34" to most people.
While ymmv, I have been using a pair of 27" 4K Dell P2715Q's as my working monitors running with no scaling (1:1). I have a standing setup (although lately I've been using a drafting stool because of plantar fasciitis, but that's another story) so I do have the monitors closer to my face than most people, but I find it's more than tolerable.
An added piece of information - I got presbyopia (in addition to my existing myopia and astigmatism) about 5 or so years ago, so I now use reading glasses, which also helps a little.
I had a special pair of single vision glasses prescribed by my optometrist, with an optimal range of focus between 21" and 27". I got those distances by measuring the closest and furthest points between my eyes and the monitors. I see everything fine. Before I got the single vision glasses, I was using a flip-up reading glass attachment that I wore over my normal progressives to see the screen, which (a) looked goofy, (b) was unwieldy, and (c) wasn't as good as the single vision glasses.
I also have myopia and astigmatism, but if I use a smaller font my eyes eventually get tired and I get headaches. My optician suggested I wear glasses slightly weaker than my prescription for computer work, which helped a bit, but I still find it more comfortable to stick to my size 14 at 1.75x scaling :-)
They aren't perfect, and the "old" version is buggy, so if you want these, get them straight from Dell. Resellers still have the old version that drops connection.
Also, the HDMI is only 30Hz while the DP and mDP are both 60.
Finally, they stretch a few pixels past the border, so you will need to shrink the picture slightly with the video card driver.
If you can deal with all that, they are incredible. Dell has a window manager that allows you to assign a grid and windows will snap to their respective cells once dropped into it.
This monitor has glorious resolution made totally useless because the backlight uses PWM (pulse-width modulation) to adjust brightness, which leads to eye fatigue and, in some cases, severe headaches. One would be hard-pressed to find other monitors in Dell's lineup with this outdated technology, since it has almost entirely been removed from modern monitors. As a business monitor, it would be expected to have "Comfort View", aka a "low blue light" mode, to reduce eye strain during a long day of use. Fix these problems and you will sell more of these than you can make.
I'd be hooking one or two up to my late 2013 MacBook Pro Retina 15, btw, which has a miniDP, so I believe I'll be able to get full resolution at 60Hz.
The real estate is amazing. I have my main one set up with VS taking up about two-thirds of the screen, and two large windows stacked next to it. The secondary monitor is divided into 6 portions, so I have Pandora, email, browser, Postman, Excel (for billing), Windows Exploder, SSMS, whatever. I'm typing this in one of the 6 portions with VS open on the main monitor. I rarely fill everything up. Also, I can make VS full screen and split windows within it to compare code. I think the Dell Display Manager only works on Windows, though.
Having said that, it's 4K, but it's so large, you don't get a retina PPI. You get a regular PPI but many more pixels, which, as a coder, is much more useful to me.
Like I said, I'm very happy with them. I've finally realized maximum useful resolution, and I work on them all day and many nights.
I did have to scale this one up by a few percent to save my eyes, but overall I found as you did that the orientation works really well for code window plus emacs/shell or browser. It's pretty impressive to flip an HD video to full screen once in a while.
The other thing is that for games, 3440 x 1440 is much easier on the GPU than 4k+ resolutions.
I'm not saying that a head-mounted device and virtual screens will necessarily be better than a bank of monitors - in fact, it's time to assess the drawbacks - but once it's cheaper and seems "just as good", business will want to switch over, for better or worse.
What I want is a computer + headset that can be fully powered by moderate exercise. From what I've read, manual labor averages about 75 watts of mechanical power throughout the work day, so with conversion losses, maybe 20 watts produced by pedaling or rowing action. Then I could "sit at the desk" all day without wasting away, and do it from the middle of a forest or the top of a mountain if I wanted to.
Then again, as someone who can't even use current VR/AR offerings (high myopia, don't use contact lenses) and is only following all of this as a fairly disinterested spectator, it looks to me like the initial craze is already blowing over and people are taking a better look at the disadvantages and limitations of the current products. I don't see VR/AR taking over monitors or much of anything else, though I'm sure AR will find productivity applications in specific fields soon enough.
Even dictating code to another human who is as fluent as you are in the language of choice is a non-starter, so I don't see any machine based way of doing it catching up soon.
An example was posted by someone in another comment of a guy coding Python in Emacs using Dragon Dictation. In that example, the guy used over 2,000 custom "macros" to get close to his pre-RSI speeds. This sounds fantastic, but the speed of a typist who has learned to leverage 2,000 custom macros would be an order of magnitude faster (imho) than someone dictating with similar macros.
There are also specialized keyboards that have existed over the years on which typists have reached 300-400 wpm, but they never caught on.
Picture the same setup, just with the monitors removed. Maybe a white plastic sheet as backdrop for virtual screens.
Once again, you haven’t used the futuristic solution so saying that you must use a keyboard seems a little premature.
If you're walking around with a pair of Google Glasses, you don't have a keyboard there.
The keyboard has been around a LONG time. We're past the halfway point to the age of the Jetsons and we still have them.
Most people lack imagination.
Some people seem to think everything always needs to change.
The pesky unreasonable people trying to always change things:
What part of "I for one much prefer a physical keyboard" implies "you must use a keyboard" ? I expressed a personal preference, that's all. I don't expect my preference to change, but sure, anything is technically possible though unlikely.
Before someone dreamed up the automobile, you would be telling us how you prefer the horse, and perhaps to just make it faster.
Well, exactly. And if it seems like I expressed that kind of opinion, I apologize, I did not intend to. I'll certainly try the new things.
> you would be telling us how you prefer the horse, perhaps just make it faster.
It seems like you know me well, certainly better than I know myself.
Typing is always going to be more efficient for precision data input until we get to the third or fourth generation of direct brain interface... and nobody wants to be a beta tester for that.
And yes, if you’re clicking away in a room of programmers, it won’t work for you, but with a Hololens I’ll find a quiet location.
UltraSharps are widely recommended for coding, with glorious reviews and endorsements. Got one, and no matter how I adjusted it, it was still too... eye-piercing, if you will, for longer coding sessions. I got mild headaches, tired eyes, and a general feeling of discomfort when working on them even for short periods. Then I switched to a FlexScan and it's a completely different ballgame - softer, more gentle feel, incomparably more comfortable. The best monitor I've had the pleasure of staring at in my 20 years of programming.
However, the interesting part here is that both monitors use the same panel (!), so the panel itself is only part of the recipe - something that many reviews tend to either downplay or not mention at all.
This time around I did the right thing and went the more expensive route and got an Eizo and it's just absolutely perfect.
Even on Fedora 26, I have to manually patch and recompile Mutter to get HiDPI support on Wayland. Mutter is hardcoded to use 2x scaling on displays that are 192 PPI or more. My Dell P2415Q is 188.2 PPI, so you get 1x 'scaling' by default and everything is tiny. This problem has been known for a while.
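The heuristic being complained about boils down to roughly this (a sketch of the behaviour as described, not Mutter's actual source):

  import math

  def mutter_scale(width_px, height_px, diagonal_in, cutoff_ppi=192):
      """Sketch of the described rule: 2x at >= 192 PPI, else 1x."""
      ppi = math.hypot(width_px, height_px) / diagonal_in
      return 2 if ppi >= cutoff_ppi else 1

  # A P2415Q-class 24" 4K panel lands just under the cutoff, so it
  # gets 1x and everything renders tiny, as described above.
  print(mutter_scale(3840, 2160, 23.8))  # -> 1 (just below 192 PPI)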
And even though GNOME applications then generally work well, you have to start Chromium with a special flag to let it run with scaling. There are a lot of glitches throughout the system, e.g. the mouse cursor has the right size, but when you go to a window corner to resize a window, it becomes tiny. A lot of icons are blurry, because they have a low resolution and are scaled up.
macOS is basically plug & play. All applications are in full HiDPI glory. The only exception are websites that use low-resolution images. There is one catch: the (terribly expensive) Apple USB-C Digital AV Multiport Adapter only supports 4k at 30Hz. This refresh rate is very tiring and annoying. However, a USB-C -> DisplayPort ALT mode connector works great @ 60Hz.
IIRC there is work in the master branch to solve this.
With Firefox rapidly improving in performance and memory usage, there are fewer and fewer reasons to use Chrome.
Chromium respects this (for many major versions by now) and shouldn't need a separate flag. Are you certain the flag is necessary in your setup? Can you check what Xft.dpi is on your system?

  xrdb -q | grep '^Xft.dpi'
If you use KDE or any other Qt-based environment, HiDPI works perfectly at any scale, even in heterogeneous environments: a landscape 4K 144 PPI display and a portrait 70 PPI monitor right next to each other will work fine, and when moving applications they automatically rescale. On X11 and Wayland. The scale is set via the QT_SCREEN_SCALE_FACTORS environment variable.
The result is obviously completely broken.
On Win10, however, all of the programs I'm using have flawless hidpi support - browsers, IntelliJ, console windows (cmder), Spotify, etc. W10 supports per-display DPI as well.
I don't know which applications are "crashing", that sounds like FUD. I know about two notable exceptions that don't have good hidpi support: Adobe tools and Hyper-V.
Aside from a few apps that fail to scale again when I disconnect a screen from a laptop, all the apps work as expected.
As to legacy software, most of it works fine. Software that never cared about scaling is handled by the system easily enough. The problem is those rare cases where legacy software claims to support scaling (to work around issues with not using the Windows APIs) but doesn't actually do anything to support it, such as GIMP.
Those are rare, though, and are usually programs with their own rendering stack that don't use the Windows APIs.
When spending the money, I didn't feel like hitching myself to a "weird" aspect ratio, but maybe Ultrawides are here to stay.