Dell introduces 28" 4K Ultra HD Monitor for under $1K, shipping early 2014 (dell.com)
241 points by hemancuso on Dec 2, 2013 | 206 comments



Awesome. I say this is a sign that the incumbent manufacturers are finally adapting to the heat brought on by Korean and Chinese newcomers.

"Industry's most affordable Ultra HD monitor" is semantics. The Seiki 39" is still substantially cheaper, now at less than $500 on Amazon. Technically, that is a television, so I suppose "monitor" is a clever use of words by Dell.

If Seiki drops an HDMI 2 version of their monitor—excuse me, television—for roughly the same price as the current model, things start to get really interesting. Actually, scratch that, the desktop monitor industry is finally interesting again right now. This is the first time it has been interesting since about 2005.

I prefer the higher DPI that Dell is providing, but I prefer the size that Seiki is going after. My ideal is 200+ DPI at about 50" for my desktop. All steps toward that make me happy, so good going Dell!


50" for my desktop.

What? Most people I know can't even handle the current 30" monitors. It feels nicer having two 27" monitors from a not-grossly-exceeding-peripheral-vision viewpoint.


Like jfb says, different strokes. I use multiple 30" monitors on my two workstations and I would prefer a single 50" display. My wife's Seiki has more raw pixels than two of my 30" monitors. I admit that an ideal 50" display would be OLED and concave, but I'll keep those off my wish-list for the time being. I don't want the manufacturers to think I'm being unreasonable. High-DPI and large form-factor seem reasonable for 2014.

I don't like the bezels between monitors in a multi-monitor configuration. I don't like the low-DPI of my 30" monitors. And I certainly don't like the price of the Dell U3011.

When you say you know people who can't handle a 30" monitor, do you mean that? They would refuse to use a 30" even if they were given one? I've only met a few people who are of that particular disposition, and I personally consider them odd. It's a weird self-flagellation or quasi-Luddism.

In my experience, most people don't deal with 30" monitors because they have been effectively fixed at over $1,100 for the past 8 years, only recently being disrupted by the Koreans and Chinese. From that perspective, where a small 1080 monitor can be had for a couple hundred bucks, it required a special type of enthusiast to pay the steep premium for 30" displays. But it looks like we're finally getting some changes in desktop displays, and I consider this positive progress toward what I would consider an ideal. To reiterate what I've said elsewhere, I want desktop computing to use nearly-immersive display technology, and I think this is more realistic in the mid-term than VR goggles.

But to be absolutely clear, I consider 30" too small for desktop computing today.


> When you say you know people who can't handle a 30" monitor, do you mean that? They would refuse to use a 30" even if they were given one? I've only met a few people who are of that particular disposition, and I personally consider them odd. It's a weird self-flagellation or quasi-Luddism.

Not at all. It's practicality. If I'm close enough to read normal-sized text clearly, the edges fade into peripheral vision; if I'm far enough back to see the whole thing at a glance, I start having to zoom to read. I'd use one if it was the only monitor available, but I'd prefer a 24".

If it works for you, that's nice. I'm not calling you a freak of nature, so please don't call me a self-flagellating Luddite.


Very well. I'm only describing the people I know first-hand who have said they are perfectly happy with small monitors. They are the same people who previously told me that TN LCDs were perfectly fine, and were content with convex CRTs before that. They didn't understand why I would complain about the low density iPad 2 LCD and didn't appreciate the appeal of high-DPI until they had an iPad 3 (or similar) in their hand.

I am emphatically against the notion of "good enough," and those whom I personally have heard express confusion about the appeal of large monitors have routinely described various technology levels as "good enough."

I don't mean to suggest that everyone who would refuse a large display is definitively a Luddite. Furthermore, I am abusing the definition of Luddite here to mean someone who is satisfied with the status quo.

That said, I cannot relate to the appeal of small monitors whatsoever. It would frustrate me greatly to use a 24" LCD for a whole day of work. All the alt-tabbing, control-tabbing, the loss of context, and window shuffling would get old, quick.

Incidentally, for me, I'd have to sit very close to a 24" LCD to have its edges fade into peripheral vision. I like having my monitors about 2.5 to 3 feet from my eyes. A monitor would need to be 50" or larger for its edges to fade into peripheral vision at ~3 feet.

Something else I really enjoy about large monitors is the ability to zoom in and then sit back further (4 or 5 feet) and relax. I like the flexibility of varying the viewing distance nicely between 1.5 and 5 feet by leaning in or leaning back depending on use-case and my feeling at the moment.


I'm not a fan of alt-tabbing either. I use 4 20" displays currently because I like the convenience of maximizing something on one display. Don't you miss that with one big desktop?


> I'm not a fan of alt-tabbing either. I use 4 20" displays currently because I like the convenience of maximizing something on one display. Don't you miss that with one big desktop?

There's no fundamental reason you would have to -- if you have a window manager that extends the "drag to the side to half-maximize" behavior that Windows 7 has to, say, include dragging to the corner to "quarter-maximize" (ideally, with four appropriate keyboard shortcuts, one per corner, so you could do it without dragging, as well.)
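
As a toy illustration of the geometry (a hypothetical Python helper; the names are made up, not any real window manager's API):

  from collections import namedtuple

  # Toy sketch of quarter-maximize geometry. "Rect" and the corner names are
  # made up for illustration; a real WM would also account for panels/docks.
  Rect = namedtuple("Rect", "x y w h")

  def quarter_rect(screen, corner):
      """Return the quarter of `screen` named by `corner`, e.g. 'top-left'."""
      half_w, half_h = screen.w // 2, screen.h // 2
      x = screen.x + (half_w if "right" in corner else 0)
      y = screen.y + (half_h if "bottom" in corner else 0)
      return Rect(x, y, half_w, half_h)

  # Bind four shortcuts, one per corner, to moves like this:
  screen = Rect(0, 0, 3840, 2160)
  print(quarter_rect(screen, "bottom-right"))  # Rect(x=1920, y=1080, w=1920, h=1080)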


Tangentially related, for anyone looking for this functionality on the Mac, do yourself a favour and get Cinch.

http://www.irradiatedsoftware.com/cinch/


Better Touch Tool also has that functionality: http://boastr.de/

I primarily use it to map three finger click to middle mouse button, but when I discovered it also had the window-sizing-by-dragging-to-the-edge feature I turned it on and have been very happy with it.


Divvy is also widely used.

http://mizage.com/divvy/


Or the extremely configurable Slate: https://github.com/jigish/slate


I use Spectacle because it's free and I hate having to pay for each feature on OS X that windows has by default. I miss Aero Snap so much, but Spectacle does a decent-enough job.

http://spectacleapp.com/


Forgive me for not understanding, but how does having multiple large monitors help avoid alt-tabbing?

Seems to me that if you have several open apps then to type into one of them you have to either move your hand to the mouse and click it or you have to alt-tab. Even if you have "focus follows mouse" then you still have to move the mouse over to the window before you can type…


topbanana already provided the key point in his response—visibility of a great deal of context. It's especially useful for programming, allowing for, for example, related code side-by-side. A large desktop also allows some user interface paradigms such as drag-and-drop to work more elegantly. Drag-and-drop on a small monitor is often frustrated by the need to first arrange the source and destination so that both are visible at the same time. Generally speaking, a large amount of desktop space allows you to spend less time fussing with windows to get at what you need and more time just working.


> Drag-and-drop on a small monitor is often frustrated by the need to first arrange the source and destination so that both are visible at the same time.

At least in Windows and KWin, drag to the destination window's icon in the task bar and it should come to the front... But yes, still not as convenient.


It's more for display than data entry. For example, when I'm debugging I like to be able to see the local variables, call stack, threads, modules, log entries and of course the code, without switching tabs or bringing anything to the front. But I only really need to manipulate the code.


> When you say you know people who can't handle a 30" monitor, do you mean that? They would refuse to use a 30" even if they were given one?

Yes. There are a few issues at play:

1. I want more screen space so I want a bigger monitor

2. My desk has a finite depth. For better posture the monitor should be far enough away that I don't need to move my head to see from a corner to any other corner

Multi-monitor setups are probably worse; my physio sees a number of people who use them. But any big screen will have the same effect, encouraging neck and spinal issues.

I'm curious - if you need to put a big monitor further away and increase the font size to keep it legible, don't you lose most of the benefits of "bigger screen area" and gain only crisp rendering and clarity?


Your physiological perspective makes sense. I don't know a whole lot about posture and physiology.

For my multiple 30" monitors, I keep them each about 3 feet from my eyes, more than an arm's length away. I suppose I have been blessed with fairly good eyesight because I don't zoom the view, instead keeping everything at 100% for most day-to-day usage. If I'm feeling like I want to relax, I'll zoom things in and recline a bit.

I guess from my layperson perspective, I'd much rather move my eyes and head than deal with the contextual and organizational problems I would face by having to deal with a window stack. I'd rather not have to throw my left hand over to the tab key or have to reach for my mouse to shuffle some windows. I've used a multi-monitor configuration with the largest monitors I could afford for over a decade now and don't yet feel this has affected my physiology. But maybe I have a silent killer sneaking up on me?

If you did need to zoom in to keep text legible for eyesight reasons, I'm not sure if the value would be as pronounced. But yes, I do think the clarity and crispness would be compelling enough for me. I've routinely said that if I rest my 300+ DPI cell phone on the surface of one of my Dell 30" U3011 monitors and look at both that and my monitor's display at the same distance (roughly 3 feet), the cell phone looks shockingly more legible and superior in just about every way. I so want to just tug at the edges of my cell phone's screen and make its screen big enough to fill my entire desk surface.


> When you say you know people who can't handle a 30" monitor, do you mean that? They would refuse to use a 30" even if they were given one? I've only met a few people who are of that particular disposition, and I personally consider them odd. It's a weird self-flagellation or quasi-Luddism.

The last computer monitor I bought was the original 24" Apple Cinema Display. I haven't used it for day to day computing in 7 years or so. The reason is that I bought a laptop. For the first year or so I plugged the laptop into the display, but as I bought newer and newer laptops the difference in resolution got smaller. My current laptop's high DPI screen has almost quadruple the pixels of that display.

So, there are 2 reasons that I've not bought a large display.

1) I don't want to need it. If I get hooked on a large display, then I've killed the fundamental utility of the laptop—to be productive anywhere. If I suddenly need a 30" screen to feel as productive as "normal" then my laptop is useless to me. A tethered laptop is not portable any more. I really enjoy the freedom to work on my couch, in an office, coffee shop, or on a whim, out in the backyard on a nice day, without ever feeling like I'm compromised.

2) The current batch of 30" displays are not high DPI. Compared to my laptop screen they look pixelated and downright primitive. That seems to be changing with these announced screens and I'm happy about that.

Also, there's technically a #3, which is that anything larger than my 24" monitor will not fit on the desk that I have (it has a shelf that isn't adjustable or removable). But that's kind of a weak reason and I could probably fix it if I really cared.


I don't mean to criticize anything about your line of thinking because it sounds like you're doing what makes sense to you and you're happy. So who am I to tell you you're wrong?

However, I do feel that your description hints at the essence of what I was saying earlier: that avoiding a large monitor is somewhat like self-deprivation. You don't want to get spoiled by becoming accustomed to the increased productivity yielded by a large monitor with a large virtual desktop. You're a mobile worker, so that makes a lot of sense. I do most of my work at either my office or at home. So I have two modestly-powerful workstations with lots of monitors at each location (in total, I've spent about twice as much as I would have to buy a high-end laptop).

Because I am not a mobile worker, I have a particularly small laptop (a Surface Pro). I feel crippled when I try to do serious work on my laptop because the screen is so tiny and there's so much context hidden by windows stacked on one another.

In a further-out future, I hope that laptops (or whatever portable device we carry) can one day project a very large, very high-DPI display of their own, giving you and me the best of both worlds.

If I read you correctly, you'd love to have the increased productivity of a larger virtual workspace, but today's technology and your need to be mobile keep you thinking it's best to focus on mobility first. That makes a lot of sense.

I totally agree that the current batch of 30" displays are disappointingly low-resolution. So much agreement on that! Primitive, stale technology, stagnant for 8 years. But I think I've already said enough about that in this thread. :)


> If I read you correctly, you'd love to have the increased productivity of a larger virtual workspace, but today's technology and your need to be mobile keep you thinking it's best to focus on mobility first. That makes a lot of sense.

Yes. Also, as my laptops' screens have gotten larger, my desire for a bigger screen has gotten lower. My first laptop was 13" and 1280x800 and I tethered to a big screen constantly. It was just too small for me. My current 15" laptop screen (which is 16:10 aspect ratio and high DPI) is good enough for most things. The wide aspect ratio is great for programming, since it allows 2 columns of even very wide code.

I do wish I had one screen dedicated to the app/web page/whatever and one to the debugger (code/vars/console/etc.), but that's really the only time I feel even slightly constrained. And like I said, right now it's good enough that I'm willing to trade that last desire for more portability.


they have been effectively fixed

That's a big part of the preference dynamic too. The price per resolution got stuck and didn't budge above 27" for years.

Hopefully one day I'll be happy with four 50" 8K screens.


I hear you on the bezels; however, I like the wrap-around effect multiple portrait monitors give over one big flat monitor.

Curved high dpi 50" please!


> Curved high dpi 50" please!

Exactly! If I could have all my wish-list items in desktop computing, it would include a large, concave, high-DPI, OLED display with mouse and gesture input.


I don't get the whole "curved" thing. I quite enjoyed getting rid of "curved" displays with the switch to LCDs.


The "curved" that we got rid of was convex CRTs.

The "curved" people want is concave screens so that, at the optimal viewing point, all points on the screen are equidistant for the viewer (people with multimonitor setups often approximate this by angling some of the monitors, but concave monitors would avoid the need for this.)

The idea is convex < flat < concave.


I think current interfaces would fall over in a truly immersive environment; I find it too damn hard to keep track of all my windows and modals and popups and shit with 2x27"; if I was offered tomorrow a 65", 400ppi screen, I'd have a hard time setting up my work environment to match.

Given that I'm not terribly happy about vast screen resolution with existing software, I'd much prefer smaller, higher-PQ, super-high density screens while we as an industry figure out how to make immersive displays usable.

Just MHO, of course.


You are right, of course. Current interfaces are not well suited to high-DPI large form-factor displays. But if we wait for interfaces to happen first, I worry the wait will be prolonged.

Plus, if you gave me a 65" 400 DPI screen today, I would drop everything and immediately put it to use. I would set it up on a drafting-style standing desk and would enjoy extremely high-clarity text throughout my desktop user interface. I would hand-craft the desk myself and I'd shower you with thanks and praise.

The point being, I want to be an early adopter and hopefully there would be enough of us for the UI designers to eventually target our demographic. Right now, so much UI R&D is on small form-factor that it's a little heart-breaking to someone like myself who spends 8 hours a day working at a desktop. Science-fiction movies routinely taunt us with immersive desktop computing and we're just barely moving the needle in that direction.


"Current interfaces are not well suited to high-DPI large form-factor displays."

Actually, a simple, tiling window manager (ion3, ratpoison, OSX with SizeUP) does just fine.

I am running triple u3011 monitors with OSX Leopard, running Sizeup, and it's basically like having 10 small, no-bezel displays in front of me (left and right screens are split into a quad, and middle screen is split down the middle).

Lose the gratuitous candy-UI elements and it all works just fine :)


I agree with you as well, which is why elsewhere I said I would definitely take a large high-DPI display and make it work.

Even with no change whatsoever to our current operating systems, my enjoyment of my desktop computer would increase tremendously if I could scrap my multi-monitor setup, replacing it with one high-DPI very large monitor.

But I think the point still stands that interfaces could be improved (and probably quite a bit) to take advantage of this kind of hardware. As dragonwriter said elsewhere in this thread, it may start with simple things, such as extending the quality-of-life features window managers have already added, in recognition that these monitors are larger and you can view more information at once. I would like to see such a monitor coupled with gesture input (along with mouse). I'd love to be able to do gestures to pick up a stack of windows and arrange them in a quick hand motion, for example.


I'd drop everything for a 65" 400ppi display, of course, but I'd take a 24" 400ppi over a 65" 200ppi without even thinking about it.


Why? Just position the 65" one twice as far away and you have effectively the same pixel density with a bigger effective viewing area and less eye strain.


My office is small? Good point, though.


I'll take both, several of each!


Single High DPI 50" screen all the way! I hate the bezels between my monitors and the low DPI. With a monitor like that you could move less important stuff (like logging output) toward the top and have the development tools/browser in the lower area. Can't wait :)

Regarding the raw pixels: a typical 30" has 4.1M pixels, so two have 8.2M, and the Seiki has 8.3M, which is hardly a difference. Your point still stands, though.


I own the 50" Seiki and I use it as a desktop monitor. I need to modify my desk so I can move it further back but I'm getting used to the size.

The lag due to 30Hz is a big drawback; my mousing accuracy is noticeably lower, but I can live with that.

What I cannot live with is the fact that the monitor goes blank and then re-detects the HDMI signal all the time -- perhaps 25 times in the first few minutes and then once every 5-15 minutes afterwards. I'm using the HDMI port on my Intel Haswell board and have updated drivers and the Seiki firmware, and still get this issue. Seiki support is bad, and lots of people have this issue. They specifically released a firmware update that is supposed to resolve this but it doesn't help me. Perhaps another brand of graphics card would help but currently it's broken.


If you need to move it further back, you are essentially shrinking your monitor size; you could have instead purchased a smaller monitor and kept the same distance.


It doesn't feel the same. Unless one is nearsighted and/or crosseyed, it's easier to focus and converge on a more distant object. It also provides for a lot more desk space in front of the monitor.


I'm holding up 27" monitors with this cheap little thing: http://www.monoprice.com/Product?c_id=109&cp_id=10828&cs_id=...

Plenty of free desk space!

(Also, in the pro-sub-30" argument: it's difficult finding affordable movable arms and mounts for 30"+ monitors)


I have 2 24" monitors at work in portrait mode and bezels are my bane. I would much prefer a 30" (or hell, 50") with no bezel.

With my main MBP display as a good 2nd screen (great for telecons where you share one screen - you can keep your side-channel chats and email private on the other), one large main screen is perfect.


Interesting. I am crosseyed and I never paid attention to how it plays out with my monitor preferences.


For mousing accuracy, check out http://smoothmouse.com/ It changes the way the mouse is buffered and was recommended to me with my 39" Seiki.


The HDMI re-detection flaw happens on my wife's 39" Seiki on HDMI 1 and 3 as well. But using the HDMI 2 input, that does not happen. If you haven't already done so, try using the other two HDMI inputs to see if the problem goes away.


Is this a driver problem or a QA issue?

I've been using 3 very cheap Korean IPS's for over a year now. Play rough with them and they break. Annoying, but acceptable knowing that I was saving a lot of money, getting the best image I'd ever had, and I'd be replacing them in 24 months anyways.


I've read that people are having this issue with this monitor with all kinds of different playback hardware, including a Blu-Ray player that upconverts to 4k.

It could be a component that is used by other 4k monitors, but it really seems specific to this Seiki (in varying sizes), as they have released a firmware update that is supposed to help.


Maybe replace the cable? I had a monitor that did the same thing over DVI, blanking out every few minutes. Replaced the cheap pack-in DVI cable with a better one and then it was fine.


That was my first thought as well. I tried a $100 Blue Jeans HDMI cable we had here at the office but there was no change.

It's interesting to me that the blanking events are grouped together and mostly in the beginning. My best guess is that the monitor adjusts some timings and eventually gets it right, and whenever it's turned off it has to figure out the timings all over again. But I don't really know what I'm talking about...


I like my 30", and wish for more (4 side-by-side windows!). I suspect people who don't like big monitors tend to make their windows fullscreen instead of keeping them content-sized. If you think in windows instead of maximized screens, 50" is superior to 54" split in two. It comes down to the artificial split down the middle which limits where you can stick your windows.


The artificial split limit is a benefit for me. Helps self organize things. Watch a movie on the left screen, web browse on the right screen. Keynote in the left screen, reference materials in the right screen, etc.


I find that a single big monitor with a tiling window manager gives the same benefit but is more flexible.


Actually, I would not want to fully-maximize much on a 50+" desktop display, except when I lapse into a pure-consumption mental state.

More to the point, I want an immersive, uninterrupted display surface. The bezels in a multi-monitor configuration just serve to annoy me by wasting space on my desk and in my field of view.


Different strokes.


But the Seiki 39" TV has its framerate capped at 30Hz while on 4k resolution, which makes it inadequate as a monitor.


Yes, I know. My wife is using the 39" and it's fine for her computer usage. She doesn't play video games on her computer.

"Inadequate" for gaming, yes. Inadequate for computing? Hardly.

But this is why I said that if Seiki dropped an HDMI 2 version of the same display, things get interesting.


> "Inadequate" for gaming, yes. Inadequate for computing? Hardly.

How about for watching movies? High framerates would obviously seem very important for gaming, but it's not clear how important they are to movies/shows. Your thoughts?


Displaying 24fps content on a 60fps screen repeats frames. If you repeat half the frames of the 24fps stream twice, and the alternating half three times, then you get 60 frames per second, with a stream looking like:

AABBBCCDDDEEFFF...

This isn't perfect, since really each frame should be on the screen for exactly the same amount of time, which is why 120hz TVs came about -- 120 is evenly divisible by each of 60, 30, and 24. But it's also pretty much good enough, or else there'd be a lot more complaining about watching movies on computers or other 60hz devices.

A 30hz screen makes that harder, since you're going to have to repeat every fourth frame once, so the unevenness could be more noticeable since those frames are now twice as long instead of 50% longer. But I haven't tried one to tell you just how noticeable it would be.
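
For illustration, a toy sketch of the repeat patterns (my own little accumulator in Python, not how any particular TV actually schedules frames):

  # Toy sketch: how many display refreshes each source frame gets when showing
  # 24fps content on a 60Hz vs. a 30Hz screen.
  def repeat_pattern(source_fps, display_hz, frames=6):
      counts, owed = [], 0.0
      for _ in range(frames):
          owed += display_hz / source_fps   # refreshes owed to this frame
          shown = int(round(owed))
          counts.append(shown)
          owed -= shown
      return counts

  print(repeat_pattern(24, 60))  # [2, 3, 2, 3, 2, 3] -> the AABBB... cadence above
  print(repeat_pattern(24, 30))  # [1, 2, 1, 1, 1, 2] -> one frame in four shown twice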

Native 60fps content (sports is the most common one I know of) would require throwing away half the frames, but at least the remaining frames would all be shown for the same amount of time.

NTSC "video" content (stuff shot at 30 interlaced frames per second, so 60 fields per second, stuff like old TV shows) would work alright. You wouldn't be able to take advantage of deinterlacing methods that try to reconstruct 60 full frames, but it wouldn't be a huge loss.


Drop the screen refresh rate to 24hz to match the video then...


That was an option on CRTs since you could control the speed of the beam, but I'm not aware of any LCDs that have an adjustable native refresh rate.

Ah, the good ol' days, when blacks were black and "non-native resolutions" weren't a thing. :)


Actually in this case it would work fine. The Seiki's panel is actually 120hz, the 30hz limit is on the input side. Send it a 24hz HDMI signal and it can easily match that up to the panel's actual 120hz refresh rate by simply showing each input frame for 5 panel refreshes instead of the 4 refreshes it displays the 30hz input at. The Seiki can also take 1080p input at 60hz, which would be a reasonable fallback for gaming. Run 4k 30hz for desktop usage, and 1080p 60hz for gaming.


That's one thing I never understood. With CRT's, you still had discrete pixels (either clusters of 3 RGB dots, or 3 RGB bars). But no matter the resolution (within the CRT's limits), nothing looked blotchy. Whereas on an LCD, you can always tell when it isn't running in a native resolution (or even divisor). So what is the difference? Other than a CRT would naturally do analog interpolation, couldn't an LCD or graphics adapter be programmed to simulate the same effect?


I am not a hardware guy, but from my understanding a CRT uses an electron gun a distance from the screen to cause a matrix of phosphors to emit light. Hence the depth of CRTs. The electron gun repeatedly scans from top to bottom, left to right (or whatever direction).

It can be adjusted to make any number of vertical "lines" as it sweeps over the phosphor matrix, but the matrix has a fixed dot-pitch. Because an electron beam is being shot at the matrix from a distance, there's a certain degree of bleed-over to neighboring phosphors. Firing electrons at the phosphors at coordinate 100,200 means a little bit of energy is also received by 99,199 and other neighboring "virtual pixels."

In other words, while the phosphor matrix was just as rigid as the pixel matrix of an LCD, the power source that caused light to be emitted was not perfectly matched 1:1 with the matrix. There isn't an electron gun for each pixel, there is a single electron gun for the whole display.

By comparison, each pixel in an LCD monitor (putting aside the three internal pixels for colors) is an individual mask over a uniformly white back-light. The panel adjusts the opacity of the mask to allow more or less of the white backlight through. The mask over a backlight is what gives LCDs a challenge with viewing angle (because there's a non-zero gap between the light source and the mask and the mask itself has a non-zero depth).

Finally, as I understand it, a plasma television is a bit like both. It's a matrix of phosphors like a CRT, but with individual power sources for each pixel like an LCD. Because light is emitted at the surface, plasmas tend to have very good viewing angles.

OLED then is, in my opinion at least, a spiritual successor to plasma. OLED emits light at its surface like an LCD or plasma, but uses no mask layer over a backlight, like a CRT or plasma.

(Take all of the above with a grain of salt because I'm just a layperson when it comes to hardware.)


The hardware side is where things start getting out of my area of knowledge, but my understanding is that yes, CRTs had a native upper bound on their resolution which had to do with the dot pitch of the display. I would guess that the way this worked in practice is that for lower resolutions they were effectively partially lighting up multiple phosphors per pixel on each pass? I imagine the difference between this and an LCD is that the amount of time a single pixel was lit by the beam on a CRT is far, far lower than the refresh rate of an LCD.

I believe there were monitors that went both ways with this, not just letting you run lower than native but also higher than native (I had a 17" CRT that claimed to support 1600x1200, but not without a lot of blurriness, for instance).

edit: man, it's really hard to find specific info about how exactly that worked online these days


So how does it work with games where the FPS is variable? Shouldn't the mismatch in FPS and HZ create a lot of problems?


Is this also the source of the ultra-real "soap opera" effect that ruins movies on new televisions?


No. That's the TV interpolating frames. The effect described would introduce a stutter.


Ugh, I hate that effect. The frame rate dynamically changes based on when the algorithm can find frames to interpolate. It's so disorienting.


That effect can often be turned off. I know it can on my 2008-era samsung tv.


Admittedly, we've not yet watched a film on the Seiki. We have fired up a few Youtube videos at full-screen and aside from looking comical (due to the large lossy compression artifacts present even in Youtube's highest quality videos), the video seems as smooth as Youtube ever does, at least to my eye.

I suspect for video from a higher-quality source, my opinion would not be so forgiving.

But put briefly, for consuming crummy web content, I have no complaints.

Don't get me wrong. I haven't yet bought one for myself principally because I am waiting for Seiki (or someone) to offer me a similarly priced 4K monitor with HDMI 2 (60Hz). If I got the current model today and a v2 of the Seiki arrived next month with HDMI 2, I would be sad. The 30Hz limitation is the most significant disadvantage of this display. So I wait.

But if you don't want to wait, the 30Hz limitation is tolerable. At <$500 now, it's very hard to restrain myself from just buying another and being done with the temptation.

More of my crazy thoughts: http://tiamat.tsotech.com/seiki-4k


I don't understand why you would want HDMI 2. DisplayPort is pretty much better in every single way.


DisplayPort is indeed superior and I'd gladly take a cheap UHD monitor with both HDMI 2 and DisplayPort.


YouTube is 30fps, so it should match up with vsync nicely.


I wouldn't want to watch a movie at 30hz, or at 4k.


Aren't all movies at 30Hz or 24Hz? I don't think there are any mainstream releases at a higher FPS...


The Hobbit [1] is the only movie I can think of that was shot at a higher frame rate, at 48 fps.

[1] http://www.fastcocreate.com/1682085/will-the-hobbit-start-a-...


There are some noises about higher framerate movie production; The Hobbit was the first, and it was not well received. Motion at 30fps and higher looks very different than motion at 24fps, and part of the resistance to HFR is simply unfamiliarity; however, a lot of shots in The Hobbit were not improved by the 48fps presentation; I think that we won't get good results until film projection is dead, and directors and DPs can present at arbitrary frame rates -- there's no reason that, given a digital projection, a movie couldn't use 48 (or 60!) for certain sequences, while retaining 24 (or 30) for others. It would add another technique for directors to use, just as video has.


> a lot of shots in The Hobbit were not improved by the 48fps presentation

But a few were. Outdoor pans, landscapes, busy scenes. There's a market shot in the beginning with about 8 mini-scenes in the frame, and you can look at any part of the film and see it far more clearly, with less blurring, than if it were filmed with any lesser tech.

The shots that didn't work: indoor shots where you could see caked on makeup... it's because no one, literally no one, seems to understand how much less lighting and makeup to use with an HFR camera.

This is a craft problem. We probably won't get widespread HFR until it gets cheap and widespread enough that some people are growing up experimenting with it.


Yeah - the outdoor scenes were magnificent; in the indoor ones you could suddenly tell that it was being lit with big electric lamps rather than natural light.


Agreed on all counts.


> I think that we won't get good results until film projection is dead, and directors and DPs can present at arbitrary frame rates -- there's no reason that, given a digital projection, a movie couldn't use 48 (or 60!) for certain sequences, while retaining 24 (or 30) for others.

Even with film projection, as long as the higher rate is an integer multiple of the lower, it should be trivial to have the same effect as long as you have camera equipment that lets you shoot at the desired rates.


Yeah, I guess, although it makes for the possibility of broken cadences, which is everybody's least favorite thing ever. I imagine that this isn't going to happen until film is dead as a presentation medium for a variety of reasons (tooling, as you note).


It might just be the unfamiliarity, but I have a feeling that a benefit of 24fps is that while it loses data, your mind fills in the details to make it more believable. At higher framerates I have a subtle feeling that the acting seems more fake, maybe because the higher detail of the framerate highlights the mistakes of the actors more, or with CGI it highlights the unnatural movements of the computer animation.


Very cool. I don't think it'll be distributable at 48fps though, at least not via blu-ray, which seems capped at 30fps by spec.


Current direction of cinema trends raises lots of interesting technological questions.

I'm especially intrigued by those concerning media discs. The next FIFA World Cup is set to broadcast in 8K, so I think one can reasonably predict there'll be movies recorded in 8K by 2018. Now, imagine, something the length of the LoTR trilogy, in 60fps, 3D, and 8K. You're looking at a lot of data right there... that's asking for discs to contain half a petabyte worth of data by around that time. It'll be interesting to see how that plays out.
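
As a very rough back-of-the-envelope (assuming uncompressed 24-bit frames and a guessed trilogy-length runtime; a real release would of course be heavily compressed):

  # Back-of-the-envelope only: uncompressed 8K stereoscopic 60fps video for a
  # trilogy-length runtime. All the inputs are rough assumptions, not specs.
  width, height = 7680, 4320        # 8K UHD
  bytes_per_pixel = 3               # 24-bit colour, no chroma subsampling
  fps, eyes = 60, 2                 # 60fps, 3D
  runtime_s = 11.4 * 3600           # roughly the LoTR extended trilogy

  rate = width * height * bytes_per_pixel * fps * eyes
  total = rate * runtime_s
  print(f"{rate / 1e9:.1f} GB/s, {total / 1e15:.2f} PB")  # ~11.9 GB/s, ~0.49 PB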


I've had to use a 30hz framerate on my 27" Dell in order to be able to drive it over HDMI. Since I am mainly editing and reading text, which is mostly static, the low frame rate is not so bad. I wouldn't want it for gaming, but that 39" TV sounds extremely nice for coding.


I just got the Seiki 39" as well. It works great for coding and browsing the web. Even watching movies. However the color fidelity is absolutely crap. Everything looks oversaturated and blown out. I have an apple cinema I switch to whenever I want to do any of my photography stuff. The 30fps is noticeable but it doesn't feel encumbering. But I'm not doing any arcade style gaming.

32" and definitely 24" would be too small for me to read the text. I'd have to hunch forward or blow the font up to read things. I already have that problem on my retina MBP.


How does it fair for software development or email? Is it inadequate even for those tasks?


fare


It's likely easier for Seiki to move up than it is for Dell/Apple/&c. to move down.


It's also too big.


You do realize that the guts in all of these monitors are either Korean or Chinese, right? All display manufacturers feed off of just a handful --quite literally a handful-- of OEM panel manufacturers. There is no real special sauce except in cases where an OEM, like Apple, pays for a run on a certain spec panel in exchange for exclusivity for a period of time and a guaranteed purchase of an agreed-upon volume of panels. After that the panels become available to anyone who wants to use them.

My point is: Don't fall into the false idea that a company like Dell is doing something unique coming out of their R&D labs. Consumer displays are commoditized to hell and back. Just look at this weekend's flyers from companies like Walmart: 70 inch full HD TV's for less than $1,000.


Fine, replace "manufacturer" in my original message with "brand."

1080 displays are commoditized, and have been for some time now.

2560x1600 displays were not commoditized for 8 years. The process has very recently started.

The incumbent brands did not want Ultra HD commoditized so quickly, as evidenced by the $4,000+ prices. Seiki dropped a small bomb on them with the $700 (now $500) 4K televisions. With luck, Ultra HD will be commoditized soon enough. Thank you, Seiki!


Of course, without those $4,000 TVs it's likely that Seiki would have no cheap reject panels to sell.


Yes, perhaps so! Thanks to the early adopters who have made these things affordable to cheapskates such as myself.


When I saw this title, I knew you'd be at the top of the comments.

Glad the industry is heeding your call to action! :)


Ha! :) I wish I had that kind of influence.


I really dislike how scrolling and other motions like window movement look at 30Hz. I can't be the only one.


It probably has more to do with being fined for price fixing LCD panels and the knock on effects from that.


you want a 50 inch screen as your desktop monitor? Good god.


Indeed!

I also want deep color to finally come back from the 1990s. I want 300+ DPI in all my displays, everywhere. I want lossy compression to be a thing of the past. I want 10 gigabit symmetric Internet connectivity to my home so I can run a data center in my garage. I want VPNs to evolve into always-on strongly-encrypted private networks that laypeople can use, and do so routinely. I want self-driving flying cars. I want to never have to synchronize anything ever again. I want portable computing devices that are subservient to my central application server and not the plain cloud, and certainly not first-class computers in their own right.

More ranting at [1].

All told, I think a 50" desktop display is among the least wishful-thinking of the things I want technology to give me in my lifetime.

[1] http://tiamat.tsotech.com/technology-sucks


So a quad 24 Ultra HD setup would be just right..


That would be quite agreeable, yes! The bezels would then be the target for elimination by the next generation.


Does anyone know, are there any Macs that can drive this in Retina mode? Does OSX on Mac desktops support retina mode for arbitrary monitors, or does OSX tie Retina mode exclusively to Macbook Pro Retina screens?


My 27" iMac can run in 1280x720 "HiDPI Mode", which is just the native resolution of 2560x1440 cut in half (quarters?). It's a native feature in OS X, although it has to be enabled manually before it shows up.


  sudo defaults write \
    /Library/Preferences/com.apple.windowserver.plist \
    DisplayResolutionEnabled -bool true
... and then reboot. I run my main screen (a 27" iMac) at native and my secondary (a 27" Thunderbolt display) at ¼ res HiDPI. It's a good compromise.


This doesn't work in Mavericks (on external monitors), and as far as I know, there is no other method that works.


It's working for me right now. Hmm. I wonder if there's some other default I've poked?


Sorry, edited. The issue is with external monitors.


Again, I have it working. Did you reboot? Maybe it's a video card thing?


Running it well? No, with the one exception of the Mac Pro, which can support 3 such monitors at once, but it has a GPU vastly more powerful than any Intel GPU inside the Macbooks.

Iris Pro (5200+) is just now the starting point for running a 2.5k resolution (half of 4k) well enough. Anything less still struggles with 2.5k resolutions on a "full OS" like Mac OS/Windows/Linux. So if you want an Intel GPU running it, or anything at that level, you'll need to wait for a version that has 2x the performance of 5200, and then get a 4k monitor.


This macbook pro spec page says hdmi port can output 4k at 24hz. http://www.apple.com/uk/macbook-pro/specs-retina/


I own the Seiki 4K 50". The launch version of the retina MBP is artificially limited to, I believe, 16Hz. There are firmware hacks floating around to get this number to 24Hz (and above, IIRC).


And yet my MBP refuses to output HDMI above 1920 for me :/


Check your cable's version if you're using the latest MBP. Also, check if your output device can handle it. My dell u3011 can't handle even 2560x1600 through HDMI (which it has 2 for some reason).


I believe it supports it for arbitrary monitors. I was able to do it on my rMBP, at least. You may have to use a third party utility like SwitchResX.


My 2012 15" MBPr can output 4k @ 31hz over HDMI to my 39" Seiki monitor without any hacks on Mavericks.


I'm a bit curious why both the 24" and the 32" are priced over 1k, yet they are able to get the 28" down below that mark. That's not to say that I'm not incredibly excited to see how this ends up! I've been looking for a sub-1.5k 28" monitor with high resolution for a while, and though I love the Apple Thunderbolt Display I use at work, I am hesitant because if I buy one for myself, I'll be buying right before a refresh.

I've been feeling that way for a long while though, so who knows


Unfortunately, the 28” is going to have the wrong pixel density to use with OS X in Retina mode.

Non-Retina MacBooks start at 110 PPI. Cinema Displays have hovered around 100 PPI for about a decade, with the 27” Thunderbolt Display topping out at 109 PPI (same for its sister 27” iMac).

At 28", a 3840 x 2160 panel has a PPI of 157, which, dividing by two to get the equivalent non-Retina PPI, comes to only 78.5, a reduction of over 20% from the lowest-PPI Cinema Displays. This means, on a Mac in Retina mode, everything in the UI will be 20% bigger than normal. Maybe that’s something you could get used to, but it’s far from the much more optimal 24” panel, with a PPI of 184 (92 PPI non-Retina, very close to a 24” Cinema Display).


Ugh. This is why true DPI independence in the OS is so important. There should be no such thing as a "wrong" pixel density.


There used to be a flag for that. Starting in 2006, there was a special "DPI Slider" in Mac OS X that allowed you to set the scale factor from 1.0 to 3.0. Each WWDC they claimed that by next year they'd activate it as a user feature, but that never happened. Instead we got retina mode. The reason probably is that they never really got all apps and all use cases to look terrific in all supported scale variants. I remember playing with it a lot from 2007 to 2010 and there were always apps (oftentimes even Apple apps, like TextEdit) that simply looked awful at 1.3 scale or other variations.


Indeed. Acorn's RiscOS had resolution independence in the 80s with scalable fonts and icons being shipped as vector graphics; it's a pity everyone else is 30 years behind.


I'm afraid you're mistaken. RiscOS certainly had scalable vector fonts (anti-aliased too!), but the icons and interface elements were all bitmaps and the OS was not resolution independent.


I expect there will be scaled modes like there are with the Retina MacBook Pro's. So you'll be able to use a desktop that's effectively 2560x1440 pixels like the 27" Cinema display.

It's going to be pretty heavy on GPU's mind. The rMBP's do it by rendering a 2x density frame and scaling it down for the display. This means the GPU will have to render a 5120x2880 pixel frame and scale it down to 3840 x 2160.


There was a time when DPI and PPI were effectively the same thing (72 per inch). As monitors became more dense (but before retina) this approached 96 and then 130.

If you've got a Linux workstation handy, try manually setting your X11 DPI to match your monitor's specs. Everything will look giant because most apps are written assuming DPI is an arbitrary constant (75 or 96), but most people use monitors that are more dense than that.

When you set your DPI to be its actual value, you'll realize that people have become accustomed to screens being more dense over time, but there was certainly a time when DPIs in the 70s were considered par for the course. In fact, many older people prefer less dense pixels because it's easier for them to see bigger things.
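
For example, a quick sketch of that calculation (the millimetre figures below are just example numbers for a 27" 1440p panel, not from any particular spec sheet):

  # Compute the monitor's true DPI from its physical size (spec sheet or EDID,
  # in mm) and resolution; the example numbers here are rough assumptions.
  width_px, height_px = 2560, 1440
  width_mm, height_mm = 597, 336

  dpi_x = width_px / (width_mm / 25.4)
  dpi_y = height_px / (height_mm / 25.4)
  print(f"xrandr --dpi {round((dpi_x + dpi_y) / 2)}")  # prints: xrandr --dpi 109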


> At 28", a 3840 x 2160

Will look exactly like 1080p in "Retina mode" (more accurately, HiDPI). A 28" 1080p monitor isn't that far-fetched.

You're talking about DPI but not how close one sits to a monitor. People sit a lot closer to 13" MacBooks than to 28" 1080p monitors.


You're right about the importance that distance from the screen plays, but in my experience, a 27-28" 1080p monitor (of which there are quite a few on the market) would have to sit too far away to use for anything but movie viewing and gaming.


I'd bet it's panel type. The 32/24 may be IPS while the 28 may be IGZO.


WHY WHY WHY is this so CONFUSING? DELL! C'Mon man!

The "24" that you mentioned actually seems to also be a 28" screen?

"With the same remarkable, pin-point clarity, the Dell UltraSharp 24 Ultra HD Monitor, users can enjoy color consistency and precision from virtually any angle thanks to an ultra-wide viewing angle on a 28.3-inch screen. "


I think the press release may be incorrect. The "Tech Specs" tab on the product page[1] says the diagonal for the 24 is 23.8 inches.

[1] http://dcse.dell.com/us/en/gen/peripherals/dell-up2414q/pd.a...


28.3

23.8

I think someone just had a minor dyslexic moment


That is almost certainly a typo.


Even so, the difference between the high-end and entry-level models is that the former are "Dell UltraSharp XX Ultra HD Monitor" and the latter "Dell XX Ultra HD Monitor".

It doesn't have to be this way.


What way should it be?

Dell: Manufacturer

(UltraSharp): The high-end moniker

XX: Size

Ultra HD: The name for that resolution

Monitor: What it is

Maybe it's because I'm an engineer and not a normal person, but the naming scheme seems quite reasonable. Way better than the utterly indecipherable "Dell 324y5943X" crap you get with some other manufacturers. The resolution name, "Ultra HD", is stupid, but that wasn't Dell's decision to make, and the industry has been naming resolutions like that for twenty years.[0] (VGA, SVGA, XGA, SXGA, etc...)

[0]: http://en.wikipedia.org/wiki/File:Vector_Video_Standards2.sv...


UltraSharp UltraHD is awful. You're right, they have to use UltraHD. But "UltraSharp"? That's what I was getting at.

Also, how about just a simplified product line like, I don't know:

Dell UltraHD 24" Dell UltraHD 32"

Along the lines of Cinema Display.


Because not everybody needs a wide-gamut monitor with plenty of room for calibration at what most users would consider low brightness levels? (Where "low brightness levels" means that monitor-white is not significantly brighter than "bright white" paper under ISO-spec lighting at 95 cd/m^2.) $1K-ish is cheap for a graphic arts monitor (compare with LaCie and Eizo), but it's a heck of a lot more than the average consumer needs to spend for their purposes.


UltraSharp is an established Dell moniker though, been around much longer than UltraHD.

Anyway, as for unifying the product line - Dell being Dell, that would mean simply cutting the UltraSharp product line, which I would consider a loss.


Why do the new UltraHD monitors need to have the same name as previous Dell monitors? It's just a name. In fact, UltraSharp vs UltraHD seems like a great way to distinguish between 4K+ and lower res monitors, while still making both seem "good" to consumers.


Ultrasharp is to Dell monitors what "MacBook Pro" is to Apple laptops. Dell clearly intends, with their 24/32 vs cheaper 28, to have pro quality and mainstream quality displays both at UltraHD/4k resolution the same way they have both pro quality 1080p (ultrasharp) and cheap crappy 1080p (not ultrasharp).

It is unfortunate for dell that the 2160p resolution got labeled something so similar to their existing brand.


Sure. But as someone who has owned several Ultrasharp monitors, I don't feel that it would hurt their brand to come up with a different differentiator for 4K monitors. The brand is Dell, not Ultrasharp. I can almost guarantee that anyone who even knows what Ultrasharp means would be the kind of people to research the specs of a monitor before buying it anyway. So maintaining that 'brand' across all their high-end monitors just doesn't seem necessary.


You don't just toss out branding. When was the last time Apple, king of branding, casually discarded a branding?

Macbook

Macbook Pro

Mac Pro

These names have been with us for, what, nearly a decade?


The press release is saying that the 24"/32" will support 99% of AdobeRGB. So, better gamut on the 24"/32" means more money than the 28".


It would be a heck of a coup if it was the same gamut. One can dream...


I suspect it is because there is much higher expected volume. Not hard to imagine Apple buying a gazillion of these panels.


I'd just be happy to see a 2560x1600 monitor at 120hz or greater. I think we're going to have to max out resolution (to 'retina' or whatever levels where most people can't distinguish additional pixels) before we start seeing monitors commonly supporting greater than 60hz.


The cheap korean monitors can almost do it, but you run into ATI's DVI cap.


Can anyone explain why monitor and TV prices are so vastly different for the same size? Could you not substitute a small flat screen in place of a monitor and save a few hundred?


Are they? AFAICS ceteris paribus, monitor prices are not that different from TV prices. It's only when you want a monitor with specs better than can be found in almost any TV (>HD resolution, high-end IPS or PVA panel) that the prices get higher, for obvious reasons.

EDIT: Also, not nearly all cheap "HD" TVs actually have real 1080p panels. They accept a HD signal but internally resample it to a lower resolution, leading to an image that's obviously blurry in desktop use. With computer monitors, you know you get pixel-for-pixel the image the video card pushes out.


Monitors and TVs are similar but designed with different use models in mind.

I find TV's are bulkier, run hotter, have poor refresh rates, worse viewing angles, and look pretty poor up close. None of this matters when you use it as a TV, of course.

They also usually have massive bezels, harsher lighting, and come on fixed stands (no rotation or vertical adjustment, often not even any pivoting up/down).

You totally could use a TV as a monitor if you wanted. I personally stick with monitors for my desk, TVs for my wall.



For a 2013 update, that is the resolution of his cell phone.


Do you mean in general or for these particular monitors?


In general


much different pixel density and production volumes.


I'm confused. Under $1000 for the 28"? It goes on to say that the 24" is available for $1399...


Look at the model names again: the UltraSharp models are available today for $$$, but the announced Ultra series is to be released early 2014 for $$.


What's confusing about it? Apparently the 28" is cheaper to manufacture


I foresee a 4K Thunderbolt Display coming our way within the next couple months. I have been wanting to get a new display for a while, so I am holding out for that. If not, I might consider a Dell. It's not unibody and it's not as thin as the newer Thunderbolt displays are going to be though.


Now I'll be shocked if Apple don't release something similar with the Mac Pro this month.


The Thunderbolt display is desperately in need of a refresh. How do you think a new Thunderbolt will match up with this Dell UltraHD monitor?


I think it's likely that the new Thunderbolt display will be the exact same 23.8" panel, but with updated Thunderbolt expansion hardware (USB 3.0, &c.)


A theory is that Apple is a large consumer of this display panel. This leads to more of these display panels being produced, which leads to a lower price. It seems odd that the other sizes (24 and 32) are much more expensive than this 28.


Will a Macbook Pro Retina 13" be able to drive this?


According to Apple's website, the latest version of the rMBP will be able to drive a 3840 x 2160 display at 30Hz via HDMI.


30Hz sounds like a 'No' to me.


Why? It's not like you're going to play graphically intensive games with it. 30Hz sounds just fine for many types of content, like text (and even videos).


Sort of...

It'll drive it @ 30Hz via HDMI as is today. The hardware also supports 60Hz via DisplayPort 1.2 through the thunderbolt ports and I've heard it works in windows but I don't believe Mavericks has driver support yet.


Yes, easily.


This looks awesome. But I'm wondering, how long until we get an iMac using this panel? I guess the thunderbolt display will get it first, but anyway, exciting news.


I’ve been holding out. Seems silly to buy a Cinema Display right now for my rMBP. USB2 ports, TB1, no Retina, oh, and a Magsafe 1->2 adapter...


So if I understand the tech specs correctly, this 24" monitor is 185 DPI. (23.8" diagonal per the spec page, sqrt(3840^2 + 2160^2) ≈ 4406 pixels, 4406px/23.8" ≈ 185 DPI) By comparison, the MacBook Pro 15" retina has 220 DPI, so it's a respectable monitor. Although for "Ultra HD" I was kind of hoping for something "ultra impressive"...


Being spoiled by Apple retina, 32'' 4k does not sound 'ultra sharp' to me. 24'' maybe, so I'll buy one.

Can't wait until 8k monitors come about! These can be both truly ultra sharp, with invisible pixels, and sizeable.


Unfortunately they aren't 3840 x 2400 (which would be 16:10), which is more convenient for coding or anything that isn't full-screen TV viewing.

But still... yay for finally reaching retina-like resolutions on the desktop! It was about time.


That's what I thought a 4k monitor would cost, and I was actually very surprised Asus priced theirs at like $3,000 earlier this year. Hopefully Dell didn't cut a lot of corners, though.


Even better: the 24" is $1399.


For $1k, you can get /three/ X-Star DP2710s, which are 27" 1440p PLS monitors. A much better deal in my opinion. (http://www.ebay.com/itm/Matte-FREE-EXPRESS-X-STAR-DP2710LED-...)



Any hope of getting full resolution at 30hz on a late 2011 MBPro? I'm using a 30" Cinema Display right now, but it's pretty banged up as I got it from an as-is liquidation auction. Would love to replace it with something newer, lighter and that doesn't raise the ambient temperature of my office by several degrees.


I'm more interested in the 24" model, although the press release goes a long way to convince me it's actually a 28.3" display despite the name.

edit: the tech specs page convinces me the name is not ridiculously misleading. Seriously interested in this one.


It's a typo.


What part is a typo, though? The 24 or the 28.3?


  Display
  Diagonally Viewable Size:
  60.47 cm
  23.8" (23.8-inch wide viewable image size)
From

http://dcse.dell.com/us/en/gen/peripherals/dell-up2414q/pd.a...


The display is actually 23.8", not 28.3" - so 24" is more or less correct.


Wonder how well various OSes will handle resolution jumps. The embarrassing failure to scale GUI objects was acceptable in the technically-limited '90s, but in 2013 it seems ridiculous to have everything pixel-sized.


The problem isn't with the OSes -- it's the legacy software. There's just too much of it written against fixed DPI targets. Apple's pixel doubling (HiDPI) is a hack to get some of the benefit of high density screens without going to full resolution independence. It works really well, in my experience.


All the big ones (Mac, ios, win8, winphone, Ubuntu, android) support pixel doubling. And they're all a long way from dynamic scaling.


Ubuntu does???? I've been keeping an eye on HiDPI support for it for ages, did I miss the release?


I'm underwhelmed; the Lenovo Yoga 2 has almost as many pixels and costs less.


It's a 13 inch screen. How are these even comparable?

Is it that you're holding out for an 8K desktop monitor?


the resolution is almost 1:1 comparable. Obviously I sit much closer to a laptop than I do my 57" HDPC


It's much cheaper to produce smaller displays than larger ones. There's a higher chance for defects as display size goes up.


It should be offset to an extent by the larger cell (or pixel) size. The only reason it's "not" is that manufacturers have invested huge sums of money in production lines incapable of producing larger products, while at the same time complaining that their ancient subpar desktop offerings don't sell well.


Are you really knowledgeable enough about the internals of LCD manufacturing processes to make such sweeping claims, or are you just asserting your general "corporations are evil" narrative?


I'd be interested in hearing about ways to drive this from an older (MacPro3,1) Mac Pro.


Why are they announcing this now? Is it to tell people to hold off for Xmas?


My guess is either that, or they know Apple is about to release something exceptionally similar with the Mac Pro this month.


Your eyes can't tell the difference b/t 1080p and 4k on a 28" screen


When you say "your", you mean "my", don't you? Also, you seem confused by these TV-related terms and only think of videos and games here; you'd have to be half-way blind not to see the difference between 75 and 150 PPI density with typical font sizes (computer fonts are usually a tad bigger than what's necessary just because of the lousy displays we have).


http://reviews.cnet.com/8301-33199_7-57566079-221/why-ultra-...

Comment sections like this are why HN is called an echo chamber.


The original comment missed the point of the chart in the article you link: the increased pixel density is useless __at TV viewing distances__. When you go down to about 2 feet from the screen, 4k is pretty apparent on a 30 inch screen.

A clearer chart is at [1], and the Carlton Bale's original article even has a handy calculator that lets you verify this [2].

1: http://s3.carltonbale.com/resolution_chart.html 2: http://carltonbale.com/does-4k-resolution-matter/
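
For the curious, here is the rough math behind those charts (a sketch assuming the common rule of thumb that 20/20 vision resolves about one arcminute; the exact cutoff varies per person):

  # Distance at which one pixel subtends ~1 arcminute; sit closer than this
  # and extra resolution is still visible to a 20/20 eye (rule of thumb only).
  from math import hypot, radians, tan

  def acuity_distance_ft(diag_in, w_px, h_px):
      pixel_pitch_in = diag_in / hypot(w_px, h_px)
      return pixel_pitch_in / tan(radians(1 / 60)) / 12

  print(f"{acuity_distance_ft(28, 1920, 1080):.1f} ft")  # ~3.6 ft for a 28" 1080p panel
  print(f"{acuity_distance_ft(28, 3840, 2160):.1f} ft")  # ~1.8 ft for a 28" 4K panel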


Thanks for posting this. It wasn't meant to be a snarky comment, but I did leave out some basic assumptions which I thought were obvious. "Usually", the larger the screen, the further away most people will sit from their screen. Right now, I'm on a 21" monitor sitting about 3'2" from the screen (already beyond the 2'6" at which a 4K screen adds benefit, if you take this chart as gospel). If I went up to a 28" monitor, I would most likely move it back at least 6" so that I don't have to move my head and eyes around too much. That would put the screen about 3'8" from me, which is within the range of full benefit of 1080p but outside the range of 4K for this size and viewing distance.


How close are you to your monitor(s) at the exact moment you read this?

I'm not at 2 feet with a smaller dual display, and I'd imagine I'd sit further away with gigantic 30+ monitor(s). I know I do when I use a 27. Others at $work mostly sit at least as far away.

In this area of improved vision, what does it gain you? Did all the old hackers not amount to what they could have been because they perceived somewhat fewer pixels than there could have been?

I don't think a serious argument can be made on this front until standards reach at least 16k.


Comments like yours are why people say HN's comments are degrading in quality. You can't even grasp the difference in viewing distance of a television versus a computer monitor, and the obvious impact that would have on how noticeable a particular pixel density is, yet you wrote a condescending comment. Unbelievable.

Incidentally, given that displays seem to be continually trending larger, I expect that we'll eventually have an entire wall in our homes that's a display. 4k is just another step toward that.


comment wasn't meant to be snarky, i just didn't think that i would have to give a thesis about why this wasn't newsworthy

but hey if you sit < 2' from a 28" and have no problems with that, then go out and grab you one


see my reply to was_hellbanned's comment


Only if you sit more than 2 meters away from the monitor. Do you?


see my reply to gknoy's comment


At what distance?



