"Industry's most affordable Ultra HD monitor" is semantics. The Seiki 39" is still substantially cheaper, now at less than $500 on Amazon. Technically, that is a television, so I suppose "monitor" is a clever use of words by Dell.
If Seiki drops an HDMI 2 version of their monitor—excuse me, television—for roughly the same price as the current model, things start to get really interesting. Actually, scratch that: the desktop monitor industry is finally interesting again right now, for the first time since about 2005.
I prefer the higher DPI that Dell is providing, but I prefer the size that Seiki is going after. My ideal is 200+ DPI at about 50" for my desktop. All steps toward that make me happy, so good going Dell!
What? Most people I know can't even handle the current 30" monitors. It feels nicer having two 27" monitors from a not-grossly-exceeding-peripheral-vision viewpoint.
I don't like the bezels between monitors in a multi-monitor configuration. I don't like the low-DPI of my 30" monitors. And I certainly don't like the price of the Dell U3011.
When you say you know people who can't handle a 30" monitor, do you mean that literally? That they would refuse to use a 30" even if they were given one? I've only met a few people of that particular disposition, and I personally consider them odd. It's a weird self-flagellation or quasi-Luddism.
In my experience, most people don't deal with 30" monitors because they have been effectively fixed at over $1,100 for the past 8 years, only recently being disrupted by the Koreans and Chinese. From that perspective, where a small 1080p monitor can be had for a couple hundred bucks, it required a special type of enthusiast to pay the steep premium for 30" displays. But it looks like we're finally getting some changes in desktop displays, and I consider this positive progress toward what I would consider an ideal. To reiterate what I've said elsewhere, I want desktop computing to use nearly-immersive display technology, and I think this is more realistic in the mid-term than VR goggles.
But to be absolutely clear, I consider 30" too small for desktop computing today.
Not at all. It's practicality. If I'm close enough to read normal-sized text clearly, the edges fade into peripheral vision; if I'm far enough back to see the whole thing at a glance, I have to start zooming to read. I'd use one if it was the only monitor available, but I'd prefer a 24".
If it works for you, that's nice. I'm not calling you a freak of nature, so please don't call me a self-flagellating Luddite.
I am emphatically against the notion of "good enough," and those whom I personally have heard express confusion about the appeal of large monitors have routinely described various technology levels as "good enough."
I don't mean to suggest that everyone who would refuse a large display is definitively a Luddite. Furthermore, I am abusing the definition of Luddite here to mean someone who is satisfied with the status quo.
That said, I cannot relate to the appeal of small monitors whatsoever. It would frustrate me greatly to use a 24" LCD for a whole day of work. All the alt-tabbing, control-tabbing, the loss of context, and window shuffling would get old, quick.
Incidentally, for me, I'd have to sit very close to a 24" LCD to have its edges fade into peripheral vision. I like having my monitors about 2.5 to 3 feet from my eyes. A monitor would need to be 50" or larger for its edges to fade into peripheral vision at ~3 feet.
Something else I really enjoy about large monitors is the ability to zoom in and then sit back further (4 or 5 feet) and relax. I like the flexibility of varying the viewing distance nicely between 1.5 to 5 feet by leaning in or leaning back depending on use-case and my feeling at the moment.
There's no fundamental reason you would have to -- not if you have a window manager that extends the "drag to the side to half-maximize" behavior of Windows 7 to, say, dragging to a corner to "quarter-maximize" (ideally with four corresponding keyboard shortcuts, one per corner, so you could do it without dragging as well).
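For illustration, a minimal sketch of the geometry such a feature involves (plain Python; the function name and rectangle layout are mine, not any real window manager's API):

    # Hypothetical "quarter-maximize": given the screen size and a corner,
    # compute the rectangle the window manager would snap a window to.
    def quarter_rect(screen_w, screen_h, corner):
        x = 0 if "left" in corner else screen_w // 2
        y = 0 if "top" in corner else screen_h // 2
        return (x, y, screen_w // 2, screen_h // 2)  # x, y, width, height

    print(quarter_rect(3840, 2160, "bottom-right"))  # (1920, 1080, 1920, 1080)

Bind each corner to a keyboard shortcut and you get the no-drag version.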
I primarily use it to map three-finger click to middle mouse button, but when I discovered it also had the window-sizing-by-dragging-to-the-edge feature, I turned it on and have been very happy with it.
Seems to me that if you have several open apps then to type into one of them you have to either move your hand to the mouse and click it or you have to alt-tab. Even if you have "focus follows mouse" then you still have to move the mouse over to the window before you can type…
At least in Windows and KWin, drag to the destination window's icon in the task bar and it should come to the front... But yes, still not as convenient.
Yes. There are a few issues at play:
1. I want more screen space so I want a bigger monitor
2. My desk has a finite depth. For better posture the monitor should be far enough away that I don't need to move my head to see from a corner to any other corner
Multi-monitor setups are probably worse; my physio sees a number of people who use them. But any big screen will have the same effect, encouraging neck and spinal issues.
I'm curious - if you need to put a big monitor further away and increase the font size to keep it legible, don't you lose most of the benefits of "bigger screen area" and gain only crisp rendering and clarity?
For my multiple 30" monitors, I keep them each about 3 feet from my eyes, more than an arm's length away. I suppose I have been blessed with fairly good eyesight because I don't zoom the view, instead keeping everything at 100% for most day-to-day usage. If I'm feeling like I want to relax, I'll zoom things in and recline a bit.
I guess from my layperson perspective, I'd much rather move my eyes and head than deal with the contextual and organizational problems I would face by having to deal with a window stack. I'd rather not have to throw my left hand over to the tab key or have to reach for my mouse to shuffle some windows. I've used a multi-monitor configuration with the largest monitors I could afford for over a decade now and don't yet feel this has affected my physiology. But maybe I have a silent killer sneaking up on me?
If you did need to zoom in to keep text legible for eyesight reasons, I'm not sure if the value would be as pronounced. But yes, I do think the clarity and crispness would be compelling enough for me. I've routinely said that if I rest my 300+ DPI cell phone on the surface of one of my Dell 30" U3011 monitors and look at both that and my monitor's display at the same distance (roughly 3 feet), the cell phone looks shockingly more legible and superior in just about every way. I so want to just tug at the edges of my cell phone's screen and make its screen big enough to fill my entire desk surface.
The last computer monitor I bought was the original 24" Apple Cinema Display. I haven't used it for day-to-day computing in 7 years or so, because I bought a laptop. For the first year or so I plugged the laptop into the display, but as I bought newer and newer laptops, the difference in resolution got smaller. My current laptop's high-DPI screen has almost quadruple the pixels of that display.
So, there are 2 reasons that I've not bought a large display.
1) I don't want to need it. If I get hooked on a large display, then I've killed the fundamental utility of the laptop—to be productive anywhere. If I suddenly need a 30" screen to feel as productive as "normal" then my laptop is useless to me. A tethered laptop is not portable any more. I really enjoy the freedom to work on my couch, in an office, coffee shop, or on a whim, out in the backyard on a nice day, without ever feeling like I'm compromised.
2) The current batch of 30" displays are not high DPI. Compared to my laptop screen they look pixelated and downright primitive. That seems to be changing with these announced screens, and I'm happy about that.
Also, there's technically a #3, which is that anything larger than my 24" monitor will not fit on the desk that I have (it has a shelf that isn't adjustable or removable). But that's kind of a weak reason and I could probably fix it if I really cared.
However, I do feel that your description hints at the essence of what I was saying earlier: that avoiding a large monitor is somewhat like self-deprivation. You don't want to get spoiled by becoming accustomed to the increased productivity yielded by a large monitor with a large virtual desktop. You're a mobile worker, so that makes a lot of sense. I do most of my work at either my office or at home. So I have two modestly-powerful workstations with lots of monitors at each location (in total, I've spent about twice as much as I would have to buy a high-end laptop).
Because I am not a mobile worker, I have a particularly small laptop (a Surface Pro). I feel crippled when I try to do serious work on my laptop because the screen is so tiny and there's so much context hidden by windows stacked on one another.
In a further-out future, I hope that laptops (or whatever portable device we carry) can one day project a very large, very high-DPI display of their own, giving you and me the best of both worlds.
If I read you correctly, you'd love to have the increased productivity of a larger virtual workspace, but today's technology and your need to be mobile keep you thinking it's best to focus on mobility first. That makes a lot of sense.
I totally agree that the current batch of 30" displays are disappointingly low-resolution. So much agreement on that! Primitive, stale technology, stagnant for 8 years. But I think I've already said enough about that in this thread. :)
Yes. Also, as my laptops' screens have gotten larger, my desire for a bigger screen has gotten lower. My first laptop was 13" and 1280x800 and I tethered to a big screen constantly. It was just too small for me. My current 15" laptop screen (which is 16:10 aspect ratio and high DPI) is good enough for most things. The wide aspect ratio is great for programming, since it allows 2 columns of even very wide code.
I do wish I had one screen dedicated to the app/web page/whatever and one to the debugger (code/vars/console/etc.), but that's really the only time I feel even slightly constrained. And like I said, right now it's good enough that I'm willing to trade that last desire for more portability.
That's a big part of the preference dynamic too. The price for a given resolution got stuck and didn't budge above 27" for years.
Hopefully one day I'll be happy with four 50" 8K screens.
Curved, high-DPI 50", please!
Exactly! If I could have all my wish-list items in desktop computing, it would include a large, concave, high-DPI, OLED display with mouse and gesture input.
The "curved" people want is concave screens so that, at the optimal viewing point, all points on the screen are equidistant for the viewer (people with multimonitor setups often approximate this by angling some of the monitors, but concave monitors would avoid the need for this.)
The idea is convex < flat < concave.
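To put a number on it, a quick back-of-envelope (plain Python; the 50" 16:9 panel at 3 feet is just my example):

    import math

    # How much farther a flat screen's corner is than its center.
    # A concave screen curved around the viewer would make them equal.
    diag = 50.0                                  # 50" 16:9 panel
    w = diag * 16 / math.hypot(16, 9)            # ~43.6" wide
    h = diag * 9 / math.hypot(16, 9)             # ~24.5" tall
    d = 36.0                                     # viewing distance, inches
    corner = math.sqrt(d**2 + (w / 2)**2 + (h / 2)**2)
    print(f"corner is {(corner / d - 1) * 100:.0f}% farther")  # ~22%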
Given that I'm not terribly happy about how existing software handles vast screen resolutions, I'd much prefer smaller, higher-PQ, super-high-density screens while we as an industry figure out how to make immersive displays usable.
Just MHO, of course.
Plus, if you gave me a 65" 400 DPI screen today, I would drop everything and immediately put it to use. I would set it up on a drafting-style standing desk and would enjoy extremely high-clarity text throughout my desktop user interface. I would hand-craft the desk myself and I'd shower you with thanks and praise.
The point being, I want to be an early adopter, and hopefully there would be enough of us for the UI designers to eventually target our demographic. Right now, so much UI R&D is focused on small form factors that it's a little heartbreaking to someone like me who spends 8 hours a day working at a desktop. Science-fiction movies routinely taunt us with immersive desktop computing and we're just barely moving the needle in that direction.
Actually, a simple tiling window manager (ion3, ratpoison, OS X with SizeUp) does just fine.
I am running triple U3011 monitors with OS X Leopard, running SizeUp, and it's basically like having 10 small, no-bezel displays in front of me (the left and right screens are each split into a quad, and the middle screen is split down the middle).
Lose the gratuitous candy-UI elements and it all works just fine :)
Even with no change whatsoever to our current operating systems, my enjoyment of my desktop computer would increase tremendously if I could scrap my multi-monitor setup, replacing it with one high-DPI very large monitor.
But I think the point still stands that interfaces could be improved (and probably quite a bit) to take advantage of this kind of hardware. As dragonwriter said elsewhere in this thread, it may start as just simple things such as providing more flexibility on top of existing quality of life improvements that have been added to window managers based on the recognition that these monitors are larger and you can view more information at once. I would like to see such a monitor coupled with gesture input (along with mouse). I'd love to be able to do gestures to pick up a stack of windows and arrange them in a quick hand motion, for example.
Regarding raw pixel count: a typical 30" has 4.1M pixels, so two have 8.2M, and the Seiki has 8.3M, which is hardly a difference. Your point still stands, though.
The lag due to 30Hz is a big drawback; my mousing accuracy is noticeably lower, but I can live with that.
What I cannot live with is the fact that the monitor goes blank and then re-detects the HDMI signal all the time -- perhaps 25 times in the first few minutes and then once every 5-15 minutes afterwards. I'm using the HDMI port on my Intel Haswell board and have updated drivers and the Seiki firmware, and still get this issue. Seiki support is bad, and lots of people have this issue. They specifically released a firmware update that is supposed to resolve this but it doesn't help me. Perhaps another brand of graphics card would help but currently it's broken.
Plenty of free desk space!
(Also, in the pro-sub-30" argument: it's difficult finding affordable movable arms and mounts for 30"+ monitors)
With my main MBP display as a good 2nd screen (great for telecons where you share one screen - you can keep your side-channel chats and email private on the other), one large main screen is perfect.
I've been using 3 very cheap Korean IPS monitors for over a year now. Play rough with them and they break. Annoying, but acceptable knowing that I was saving a lot of money, getting the best image I'd ever had, and I'd be replacing them in 24 months anyway.
It could be a component that is used by other 4k monitors, but it really seems specific to this Seiki (in varying sizes), as they have released a firmware update that is supposed to help.
It's interesting to me that the blanking events are grouped together and mostly in the beginning. My best guess is that the monitor adjusts some timings and eventually gets it right, and whenever it's turned off it has to figure out the timings all over again. But I don't really know what I'm talking about...
More to the point, I want an immersive, uninterrupted display surface. The bezels in a multi-monitor configuration just serve to annoy me by wasting space on my desk and in my field of view.
"Inadequate" for gaming, yes. Inadequate for computing? Hardly.
But this is why I said that if Seiki dropped an HDMI 2 version of the same display, things get interesting.
How about for watching movies? High framerates would obviously seem very important for gaming, but it's not clear how important they are to movies/shows. Your thoughts?
This isn't perfect, since really each frame should be on the screen for exactly the same amount of time, which is why 120hz TVs came about -- 120 is evenly divisible by each of 60, 30, and 24. But it's also pretty much good enough, or else there'd be a lot more complaining about watching movies on computers or other 60hz devices.
A 30hz screen makes that harder, since you're going to have to repeat every fourth frame once, so the unevenness could be more noticeable since those frames are now twice as long instead of 50% longer. But I haven't tried one to tell you just how noticeable it would be.
Native 60fps content (sports is the most common one I know of) would require throwing away half the frames, but at least the remaining frames would all be shown for the same amount of time.
NTSC "video" content (stuff shot at 30 interlaced frames per second, so 60 fields per second, stuff like old TV shows) would work alright. You wouldn't be able to take advantage of deinterlacing methods that try to reconstruct 60 full frames, but it wouldn't be a huge loss.
Ah, the good ol' days, when blacks were black and "non-native resolutions" weren't a thing. :)
The electron beam can be adjusted to draw any number of "lines" vertically as it sweeps over the phosphor matrix, but the matrix has a fixed dot pitch. Because an electron beam is being shot at the matrix from a distance, there's a certain degree of bleed-over to neighboring phosphors. Firing electrons at the phosphors at coordinate 100,200 means a little bit of energy is also received by 99,199 and other neighboring "virtual pixels."
In other words, while the phosphor matrix was just as rigid as the pixel matrix of an LCD, the power source that caused light to be emitted was not perfectly matched 1:1 with the matrix. There isn't an electron gun for each pixel; there is a single electron gun for the whole display.
By comparison, each pixel in an LCD monitor (putting aside the three internal subpixels for color) is an individual mask over a uniformly white backlight. The panel adjusts the opacity of the mask to allow more or less of the white backlight through. The mask-over-a-backlight design is what gives LCDs a challenge with viewing angles (because there's a non-zero gap between the light source and the mask, and the mask itself has a non-zero depth).
Finally, as I understand it, a plasma television is a bit like both. It's a matrix of phosphors like a CRT, but with individual power sources for each pixel like an LCD. Because light is emitted at the surface, plasmas tend to have very good viewing angles.
OLED, then, is, in my opinion at least, a spiritual successor to plasma. OLED emits light at its surface like a CRT or plasma, and uses no mask layer over a backlight, unlike an LCD.
(Take all of the above with a grain of salt because I'm just a layperson when it comes to hardware.)
I believe there were monitors that went both ways with this, not just letting you run lower than native but also higher than native (I had a 17" CRT that claimed to support 1600x1200, but not without a lot of blurriness, for instance).
edit: man, it's really hard to find specific info about how exactly that worked online these days
I suspect for video from a higher-quality source, my opinion would not be so forgiving.
But put briefly, for consuming crummy web content, I have no complaints.
Don't get me wrong. I haven't yet bought one for myself principally because I am waiting for Seiki (or someone) to offer me a similarly priced 4K monitor with HDMI 2 (60Hz). If I got the current model today and a v2 of the Seiki arrived next month with HDMI 2, I would be sad. The 30Hz limitation is the most significant disadvantage of this display. So I wait.
But if you don't want to wait, the 30Hz limitation is tolerable. At <$500 now, it's very hard to restrain myself from just buying another and being done with the temptation.
More of my crazy thoughts: http://tiamat.tsotech.com/seiki-4k
But a few were. Outdoor pans, landscapes, busy scenes. There's a market shot in the beginning with about 8 mini-scenes in the frame, and you can look at any part of the frame and see it far more clearly, with less blurring, than if it were filmed with any lesser tech.
The shots that didn't work: indoor shots where you could see caked-on makeup... it's because no one, literally no one, seems to understand how much less lighting and makeup to use with an HFR camera.
This is a craft problem. We probably won't get widespread HFR until it gets cheap and widespread enough that some people are growing up experimenting with it.
Even with film projection, as long as the higher rate is an integer multiple of the lower, it should be trivial to have the same effect as long as you have camera equipment that lets you shoot at the desired rates.
I'm especially intrigued by those concerning media discs. The next FIFA World Cup is set to broadcast in 8K, so I think one can reasonably predict there'll be movies recorded in 8K by 2018. Now, imagine, something the length of the LoTR trilogy, in 60fps, 3D, and 8K. You're looking at a lot of data right there... that's asking for discs to contain half a petabyte worth of data by around that time. It'll be interesting to see how that plays out.
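For what it's worth, the uncompressed numbers roughly support that figure (back-of-envelope Python; the ~11.4-hour runtime for the extended trilogy is my assumption, and real discs would of course use heavy compression):

    frame = 7680 * 4320 * 3            # bytes per 8K 24-bit frame: ~99.5 MB
    per_second = frame * 60 * 2        # 60fps, x2 for stereo 3D: ~11.9 GB/s
    total = per_second * 11.4 * 3600   # ~11.4 hours of runtime
    print(f"{total / 1e15:.2f} PB")    # ~0.49 PB uncompressed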
32" and definitely 24" would be too small for me to read the text. I'd have to hunch forward or blow the font up to read things. I already have that problem on my retina MBP.
My point is: don't fall into the false idea that a company like Dell is doing something unique coming out of their R&D labs. Consumer displays are commoditized to hell and back. Just look at this weekend's flyers from companies like Walmart: 70-inch full-HD TVs for less than $1,000.
1080p displays are commoditized, and have been for some time now.
2560x1600 displays were not commoditized for 8 years. The process has very recently started.
The incumbent brands did not want Ultra HD commoditized so quickly, as evidenced by the $4,000+ prices. Seiki dropped a small bomb on them with the $700 (now $500) 4K televisions. With luck, Ultra HD will be commoditized soon enough. Thank you, Seiki!
Glad the industry is heeding your call to action! :)
I also want deep color to finally come back from the 1990s. I want 300+ DPI in all my displays, everywhere. I want lossy compression to be a thing of the past. I want 10 gigabit symmetric Internet connectivity to my home so I can run a data center in my garage. I want VPNs to evolve into always-on strongly-encrypted private networks that laypeople can use, and do so routinely. I want self-driving flying cars. I want to never have to synchronize anything ever again. I want portable computing devices that are subservient to my central application server and not the plain cloud, and certainly not first-class computers in their own right.
More ranting at .
All told, I think a 50" desktop display is among the least wishful-thinking of the things I want technology to give me in my lifetime.
sudo defaults write /Library/Preferences/com.apple.windowserver.plist \
  DisplayResolutionEnabled -bool true
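(If I recall correctly, that enables the extra HiDPI scaled-resolution options in the Displays preference pane; you may need to log out or reboot for it to take effect.)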
Iris Pro (5200+) is just now the starting point for running a 2.5K resolution (roughly half the pixels of 4K) well enough. Anything less still struggles with 2.5K resolutions on a "full OS" like OS X/Windows/Linux. So if you want an Intel GPU running it, or anything at that level, you'll need to wait for a version with 2x the performance of the 5200, and then get a 4K monitor.
I've been feeling that way for a long while though, so who knows
Non-Retina MacBooks start at 110 PPI. Cinema Displays have hovered around 100 PPI for about a decade, with the 27" Thunderbolt Display topping out at 109 PPI (same for its sister 27" iMac).
At 28", a 3840x2160 panel has a PPI of 157, which, dividing by two to get the equivalent non-Retina PPI, comes to only 78.5, a reduction of over 20% from the lowest-PPI Cinema Displays. This means that on a Mac in Retina mode, everything in the UI will be 20% bigger than normal. Maybe that's something you could get used to, but it's far from the more optimal 24" panel, with a PPI of 184 (92 PPI non-Retina, very close to a 24" Cinema Display).
It's going to be pretty heavy on the GPU, mind. The rMBPs do it by rendering a 2x-density frame and scaling it down for the display. This means the GPU will have to render a 5120x2880 frame (14.7 million pixels) and scale it down to 3840x2160 (8.3 million).
If you've got a Linux workstation handy, try manually setting your X11 DPI to match your monitor's specs. Everything will look giant because most apps are written assuming DPI is an arbitrary constant (75 or 96), but most people use monitors that are denser than that.
When you set your DPI to be its actual value, you'll realize that people have become accustomed to screens being more dense over time, but there was certainly a time when DPIs in the 70s were considered par for the course. In fact, many older people prefer less dense pixels because it's easier for them to see bigger things.
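For instance (illustrative Python; xrandr --dpi and the Xft.dpi resource are the standard X11 knobs, but the 30" 2560x1600 panel is just my example):

    import math

    # True DPI of a 30" 2560x1600 panel, and the X11 settings to apply it.
    dpi = math.hypot(2560, 1600) / 30
    print(f"true DPI: {dpi:.0f}")       # ~101, vs. the assumed 96
    print(f"xrandr --dpi {dpi:.0f}")    # applies per session
    print(f"Xft.dpi: {dpi:.0f}")        # persistent, via ~/.Xresources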
It will look exactly like 1080p in "Retina mode" (more accurately, HiDPI). A 28" 1080p monitor isn't that far-fetched.
You're talking about DPI but not how close one sits to a monitor. People sit a lot closer to 13" MacBooks than to 28" 1080p monitors.
The "24" that you mentioned actually seems to also be a 28" screen?
"With the same remarkable, pin-point clarity, the Dell UltraSharp 24 Ultra HD Monitor, users can enjoy color consistency and precision from virtually any angle thanks to an ultra-wide viewing angle on a 28.3-inch screen. "
I think someone just had a minor dyslexic moment
It doesn't have to be this way.
UltraSharp: The high-end moniker
Ultra HD: The name for that resolution
Monitor: What it is
Maybe it's because I'm an engineer and not a normal person, but the naming scheme seems quite reasonable. Way better than the utterly indecipherable "Dell 324y5943X" crap you get with some other manufacturers. The resolution name, "Ultra HD", is stupid, but that wasn't Dell's decision to make, and the industry has been naming resolutions like that for twenty years. (VGA, SVGA, XGA, SXGA, etc...)
Also, how about just a simplified product line like, I don't know:
Dell UltraHD 24"
Dell UltraHD 32"
Along the lines of Cinema Display.
Anyway, as for unifying the product line: Dell being Dell, that would mean simply cutting the UltraSharp product line, which I would consider a loss.
It is unfortunate for Dell that the 2160p resolution got labeled something so similar to their existing brand.
These names have been with us for, what, nearly a decade?
EDIT: Also, not all cheap "HD" TVs actually have real 1080p panels; some accept an HD signal but internally resample it to a lower resolution, leading to an image that's obviously blurry in desktop use. With computer monitors, you know you get pixel-for-pixel the image the video card pushes out.
I find TVs are bulkier, run hotter, have poor refresh rates, worse viewing angles, and look pretty poor up close. None of this matters when you use it as a TV, of course.
They also usually have massive bezels, harsher lighting, and come on fixed stands (no rotation or vertical adjustment, often not even any pivoting up/down).
You totally could use a TV as a monitor if you wanted. I personally stick with monitors for my desk, TVs for my wall.
It'll drive it at 30Hz via HDMI as-is today. The hardware also supports 60Hz via DisplayPort 1.2 through the Thunderbolt ports, and I've heard it works in Windows, but I don't believe Mavericks has driver support yet.
Can't wait until 8k monitors come about! These can be both truly ultra sharp, with invisible pixels, and sizeable.
But still... yay for finally reaching Retina-like resolutions on the desktop! It was about time.
edit: the tech specs page convinces me the name is not ridiculously misleading. Seriously interested in this one.
Diagonally Viewable Size:
23.8" (23.8-inch wide viewable image size)
Is it that you're holding out for an 8K desktop monitor?
Comment sections like this are why HN is called an echo chamber.
A clearer chart is at , and Carlton Bale's original article even has a handy calculator that lets you verify this .
I'm not at 2 feet even with a smaller dual-display setup, and I'd imagine I'd sit further away with gigantic 30"+ monitor(s). I know I do when I use a 27". Others at $work mostly sit at least as far away.
In this area of improved vision, what does it gain you? Did all the old hackers fail to amount to what they could have been because they saw fewer pixels than they might have?
I don't think a serious argument can be made on this front until standards reach at least 16k.
Incidentally, given that displays seem to be continually trending larger, I expect that we'll eventually have an entire wall in our homes that's a display. 4k is just another step toward that.
but hey, if you sit < 2' from a 28" and have no problems with that, then go out and grab yourself one