They put the "Retina" display in the iMac. This means people will buy it. Higher volume means whoever (LG, I think?) is manufacturing the screens will have to produce more, driving the cost down. That means they will sell variants. Then their competition will also sell competitive options because nobody will want 1080p on a computer screen anymore.
Monitor technology has been stalled for years. This is going to be a gigantic kick in the pants to the industry!
Dell's 4K screens came up here yesterday. Here's what I had to say about those: https://news.ycombinator.com/item?id=8459298
I like DP better than HDMI politically, but a screen with HDMI 2.0 support is a much safer bet than one relying on DisplayPort 1.2.
Starting with the Samsung U28D590D, many 4k displays use single stream transport on DP 1.2.
Also, for 5k, you actually have to wait for DisplayPort 1.3 (if you want 60hz), because there's not enough bandwidth, multi stream transport or not. There's a neat bandwidth calculator at .
4k is 11.94 Gbit/s without overhead; 5k is 21.23 Gbit/s. DisplayPort 1.2 and 1.3 are 17.28 Gbit/s and 25.92 Gbit/s respectively.
This unfortunately means no 120Hz 5k displays even on DP 1.3 without compression (which is supported).
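If you want to sanity-check those numbers, here's a minimal sketch of the arithmetic, assuming uncompressed 24-bit RGB and ignoring blanking overhead (so real requirements come out a bit higher):

    # Uncompressed video bandwidth vs. effective DisplayPort capacity.
    # Assumes 24 bits per pixel and ignores blanking intervals, so real
    # requirements come out somewhat higher than these figures.

    def gbps(width, height, hz, bits_per_pixel=24):
        return width * height * hz * bits_per_pixel / 1e9

    DP_CAPACITY = {"DP 1.2": 17.28, "DP 1.3": 25.92}  # effective Gbit/s

    for name, w, h, hz in [("4k@60", 3840, 2160, 60),
                           ("5k@60", 5120, 2880, 60),
                           ("5k@120", 5120, 2880, 120)]:
        need = gbps(w, h, hz)
        fits = [dp for dp, cap in DP_CAPACITY.items() if cap >= need]
        print(f"{name}: {need:.2f} Gbit/s, fits on: {fits or 'neither'}")

    # 4k@60: 11.94 Gbit/s, fits on: ['DP 1.2', 'DP 1.3']
    # 5k@60: 21.23 Gbit/s, fits on: ['DP 1.3']
    # 5k@120: 42.47 Gbit/s, fits on: neither (hence compression)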
Which makes this announcement quite strange. If this iMac doesn't support Thunderbolt 3 (or whatever standard Apple is planning), it'll never be usable as an external monitor.
IMO, it would have been more Apple-y to launch this and refreshed Mac Pros simultaneously with TB 3, allowing Mac Pros to drive two (or more) 5k displays while the rest of the PC world is still tripping over 4k. But I guess this segment isn't a priority for them any more.
With DisplayPort 1.3 now released, I'd be pretty surprised if manufacturers messed around with dual DP 1.2 inputs for displays like Dell's upcoming 5k. Such things are tolerated when there is no alternative, but when the alternative is just requiring a new video card costing a tenth the price of the high-end display, a single DP 1.3 input is the only sensible option. This is extreme early adopter tech after all. 5k is yesterday's 4k.
And needless to say, if it's too messy for Dell (as I'm speculating), Apple won't touch it with a ten foot pole :)
Still, stranger things have happened. If the demand turns out to be there, the products will follow - it's certainly technically possible.
But if DP 1.3 is around the corner, and dual TB would be an ugly hack that is mutually exclusive with the new standard, I could see them steering clear of that.
Not true. It has one DVI connector, albeit a dual-link one (dual-link DVI carries two TMDS links over a single connector).
Source: Me looking at the machine my 30" cinema display is connected to.
In principle, Apple could do something similar again, with an adaptor for two Thunderbolt 2 connectors (from separate buses!). In practice, they'll probably just wait for DisplayPort 1.3, though.
I imagine any HDMI 2.0 4k screens won't use it, and even the ones still on DP 1.2 should start presenting themselves as a single display.
Now I just need to wait for the consumer version of the Rift.
1. http://www.amazon.com/dp/B00KG5LWM4 - eq278c, 27"@2880x1620 = 122ppi, $765
2. http://www.amazon.com/AURIA-EQ308C-30-2880-1620/dp/B00KG5LSB... - eq308c, 30"@2880x1620 = 110ppi, $1075
3. http://www.amazon.com/gp/product/B00JR6GCZA - LG 34UM95, 34"@3440x1440 = 110ppi, $999
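If anyone wants to check those ppi figures (or compare against the 5k iMac), the arithmetic is just the diagonal pixel count over the diagonal size; a quick sketch:

    # ppi = diagonal resolution in pixels / diagonal size in inches
    from math import hypot

    def ppi(w, h, diagonal_inches):
        return hypot(w, h) / diagonal_inches

    print(round(ppi(2880, 1620, 27)))  # 122 (EQ278C)
    print(round(ppi(2880, 1620, 30)))  # 110 (EQ308C)
    print(round(ppi(3440, 1440, 34)))  # 110 (LG 34UM95)
    print(round(ppi(5120, 2880, 27)))  # 218 (retina iMac, for comparison)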
I'm in the market for a retina iMac but expect to turn down the virtual resolution a few notches to make the UI a bit more readable. I find the system text on the non retina 27" machines a bit small, but scaling it up on those panels makes it look crap due to scaling artefacts that just shouldn't happen on a retina panel.
Just to clarify: the scaling artefacts are still there; they're just half the size (and one quarter the area).
Apple is very unlikely to go through the trouble of implementing "pixel tripling" just to accommodate people like you and me, but wouldn't it be sweet? :)
A rectangle 1706 pixels by 960 pixels contains 0.93 times as many pixels as one 1680 by 1050 (1706 × 960 = 1,637,760 vs 1680 × 1050 = 1,764,000).
EDIT: 4k on ~30" is too high resolution. 4k on ~40" is too big/wide.
Also, for the same price, you can get the same 28" 4k Samsung monitor from Amazon (search for samsung u28d590d).
You mean Windows. KDE is iffy (though I use it just fine and was able to scale my fonts and frames up enough on my 150 DPI notebook), but GNOME in its latest release has fantastic high-DPI support.
It's an unmentioned consequence: Windows being a piece of shit with regard to DPI holds back the desktop display industry, whereas Android's inherent scalability let smartphones race ahead on pixel density.
*disclaimer: MS employee who is just enthusiastic about "viva la resolution."
I really don't need Retina-level resolution, though it'd be pleasant. Maybe in a year or two.
But having a single monitor this wide is really a treat, as I never could stand a dual-monitor setup. One of them is going to be your main monitor, facing you, and one of them will be on one side or the other, somewhat ruining the effect of the combined width.
I would prefer to have one wide monitor, but with a curve.
I'm not sure. I have a 30" widescreen at work I primarily use in isolation (the laptop next to it is pretty puny by comparison), and dualies in a laptop-monitor-monitor arrangement as in Example 2 at home.
The single monitor is wide and requires much scanning, but I think I prefer the experience to multi-monitor window management. I just end up not using the far right screen except in the most necessary situations (design + code + browser)
27" 1440p IPS 120Hz. $450.
Given that Dell's 5k 27" screen is priced similarly to these iMacs, I think the economies of scale could help push some prices down.
Walk through any Best Buy and that res will be far and away the most common.
I for one welcome our new high-DPI overlords.
In the cheapest LCD monitors, the brightness control just changes pixel values and doesn't affect the backlight. Hope this one's better, since the claimed brightness of 350 cd/m^2 is far too bright (you should have less than 120 cd/m^2 in a dark room) and is probably bad for your health.
I'd guess they've fixed it, but there's no way you could tell from their "tech specs", which just list the same huge numbers (1.07 billion colors!) as every other manufacturer's LCD.
It's a little worrying to see a Contrast control, too, since that makes no physical sense.
I've been using them for years for inexpensive yet high quality cables, I thought their reputation was pretty well established. Maybe I missed a scandal out there somewhere.
Which I believe is true, as in my experience they have the best return policy anywhere.
But OK, ask me again once I've been using these resolution monsters for a week :-). If that ever happens, I wish :-).
Many of these new monitors use PWM for brightness control, which gives some people headaches.
I understand this is a big plus for monitors. I have used a Dell UltraSharp 32" @ 3840x2160 over DisplayPort and it's fantastic. Larger workspace, better gaming, better multitasking!
I care a lot about text, because that's what I consume almost exclusively day in day out at work and in my spare time when I sit in front of a screen. And nothing benefits more from high contrast and high pixel density than text rendering.
I just inherited a Dell XPS 13 with a really nice Full HD screen from a coworker who quit. I've got Ubuntu running on it with a scaling factor of 1.5. Not quite Retina, but pretty sweet nonetheless. I was going to use it with an external screen, but after getting used to this laptop I couldn't stand looking at the external screen anymore and got rid of it. Work (programming and writing) is a lot more fun with this setup.
My spouse just got a 13 inch Retina MBP. She's pretty stoked as well. Even her half blind mom can see the difference.
High-resolution screens are great, and the sooner they become standard the better.
TCONs that can handle a single logical 4K display are just hitting the market, and Apple has one shipping today that handles 5K. A timing controller may be simple tech, but it's cool that they're so far ahead of the curve.
The only thing that irks me a little is the muscle memory for using the command key for cut-copy-paste in the UI vs the actual ctrl key (ctrl-c etc.) in a terminal...
But a shim interface to make one component work with another? That's the business of millions of companies out there.
Apple is actually pretty awesome like this.
They are better than anyone else at that simply because nobody else really sees the whole product as their own problem. I.e. each hardware/software/service maker is just optimizing their part.
Obvious in retrospect, but just another example of how Apple treats their product range as a complete ecosystem.
Few companies can get away with such a relatively narrow selection of hardware products. Dell, for example, has dozens of laptop models. Because Apple has only a few models, a big change to one of them puts that feature in a big chunk of the overall market.
Perhaps what happens is they just get resold as non-Apple displays. The displays with no dead pixels go into the premium Apple monitors, while the Class I panels get resold on the white-label market. (This is where you see those great deals on eBay.)
I have one on my 15" rMBP, and I do not notice it unless I put my face within 6" of the screen.
Once you notice where a dead subpixel is though, your eye goes straight to it every time it's visible.
Your conclusion does not follow from your premise.
Just because we can't identify individual pixels does not mean a single pixel has no influence. It shapes the overall picture.
If a single dead pixel is not visible at all, why have that pixel there in the first place -- working or not?
I just tested with my iPhone 6 (a 1334x750 px white picture with a single black pixel in the middle of the picture), and the single black pixel is definitely visible. A screenshot of the iPhone showing the picture in question confirms it's a single pixel on the screen.
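For anyone who wants to reproduce that test, here's a rough sketch using Pillow (the image size matches the iPhone 6; the filename is just my choice):

    # White 1334x750 image (iPhone 6 resolution) with one black pixel
    # in the middle. Requires Pillow (pip install Pillow).
    from PIL import Image

    W, H = 1334, 750
    img = Image.new("RGB", (W, H), "white")
    img.putpixel((W // 2, H // 2), (0, 0, 0))  # the lone "dead" pixel
    img.save("dead_pixel_test.png")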
I wonder if this is actually true; the whole "more pixels than photosensitive cells" thing sounds a bit shaky.
Second, the number of rods and cones has nothing to do with it.
The "retina" argument was about having smaller than discernible angular pixel sizes, which is true for the majority of people and normal viewing distances.
Going to 5K only about doubles the number of pixels, so it sounds like a solvable engineering problem.
Panels are ready and GPUs are ready; the link layer is why you can't go to Best Buy and buy a 5k monitor. DisplayPort and HDMI both suck.
I will give my firstborn for an 8k* monitor with a 120Hz refresh rate. He or she will probably have kids when I get that, though.
(4k is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)
...one would hope that you're not viewing the 32" 4K monitor at the same distance you would view your phone from. Comparing PPI of a 32" screen to a 6" one is rather meaningless as the viewing distance is going to be vastly different.
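To put rough numbers on that: what matters is pixels per degree of visual angle at your actual viewing distance, not raw ppi. A small sketch, with viewing distances that are just my assumptions:

    # Pixels per degree of visual angle: combines ppi with viewing distance.
    # The distances below are assumptions for illustration, not measurements.
    from math import atan, degrees

    def pixels_per_degree(ppi, distance_inches):
        # angle subtended by a single pixel, inverted
        return 1 / degrees(atan(1 / (ppi * distance_inches)))

    print(round(pixels_per_degree(326, 12)))  # iPhone 6 held at 12": ~68
    print(round(pixels_per_degree(140, 24)))  # 32" 4k viewed at 24": ~59
    print(round(pixels_per_degree(140, 36)))  # 32" 4k viewed at 36": ~88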
The ultimate high-water mark will be retina-level resolution for devices like the Oculus Rift, allowing you to move your eyeballs all around and see real-life-quality graphics.
The Chromebook Pixel has a nice resolution. That's 240ppi.
Which is where you demonstrated that you are a power user. Most consumer systems don't even come with discrete graphics cards anymore.
Consumer systems took a big backwards leap when integrated graphics became the norm again. They're running several years behind the discrete cards someone like you or I might buy.
Also, those games aren't too graphically demanding, tbh; L4D2 is the newest one and it's over 5 years old.
So I'd rather have the choice among powerful GPUs, with drivers kept up to date with the latest OpenGL versions (separately from OS versions), than have retina support.
I'm thinking in the same terms I used when I was into home audio. My boss had a set of $10K speakers. Sure, they sounded great, but then I listened to a client's $5K speakers and couldn't tell the difference between his and my boss's. It was as if my ears weren't finely tuned enough to tell.
Of course, if you asked my boss about the difference, he'd take an hour to list them all. To me, a layperson at the time, they were indistinguishable.
The X axis should go from 10 to 60, the Y axis should go from 1ft to 30ft, max.
As it is, it's not all that useful for figuring out what the density limits of a desktop monitor are. (Average desktop monitor sizes occupy maybe 1/12 of the X axis!)
But gee, one length (viewing distance) is in different units than the other length (screen diagonal size). Hilarious.
It doesn't really say, but I would guess it has to do with when you start to see individual pixels.
Then simple trig tells us that we are maxed out when:
(distance to screen) * sin(0.3 arcminutes) = pixel size
The new iMac has a 27 inch diagonal and a 5874-pixel diagonal, so its pixel size is 0.004596 inches. Plug that in above, and you can see you'd have to be 53 inches from the screen to max out your visual abilities. Anything closer (which is normal) and you could still benefit from higher resolution.
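In code, using the parent's 0.3-arcminute figure:

    # Distance at which one iMac pixel subtends 0.3 arcminutes.
    from math import hypot, radians, sin

    acuity_arcmin = 0.3
    pixel_inches = 27 / hypot(5120, 2880)  # ~0.004596 inches per pixel
    distance = pixel_inches / sin(radians(acuity_arcmin / 60))
    print(round(distance, 1))  # ~52.7 inches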
>did a custom controller for the display timing (Timing Controller (TCON))
Apple doesn't produce displays, and they don't design TCONs. A TCON is a PCB sitting directly behind the glass driving the individual crystals, and it's designed and manufactured by the same company making the display. Every TCON is also custom, built to drive a particular type of screen.
Yes, when Dan Riccio says "we manufactured this screen" he is LYING TO YOU. They bought the whole thing from LG or Sharp.
AMD doesn't support DP 1.3 yet, so the only way to get 5K resolution is bonding two 3K screens together; this is so innovative that IBM did it 15 years ago in the T220.
> retina type rendering
You mean scaling down? That's what GPUs do best, out of the box.
Yes, this screen is amazing. But don't act like it's something revolutionary touched by a Noodly Appendage.
You really think the largest company in the world (by market cap) doesn't actively collaborate with its source companies and do custom designs created by its internal engineers together with the supplier's engineers?
It's not a lie if "we" refers to Apple together with its partners or "manufacture this screen" refers to the combination of LG/Sharp's screen with Apple technology.
Is it a lie when Google says it created Android without mentioning the creators of Linux or Java? Is it a lie if Intel says it manufactures a CPU without mentioning the manufacturers of the silicon wafers?
Dude, it's marketing, which I'm not a fan of. But be fair.
You have really swallowed the Apple Haterade, it seems.
In other words, buy a Retina Cinema Display, get the computer to power it for free.
I can understand why Apple is not in a hurry to sell this as just a display. It likely couldn't be driven at all by their most popular Macs: all the MacBooks and the Mac mini. Maybe it would work with the Mac Pro, but that's it, and the Mac Pro is probably not the best-selling Mac, by a large margin. So they would basically be selling a display that only works with a Mac Pro.
I can understand how that might seem like a wasted effort until at least a couple more widely selling Macs support the display. Maybe when the MacBook Pros can do it they will start selling just the display.
With the iMac the advantage is that Apple is delivering it with a computer that can definitely push those pixels, so you dodge all the compatibility woes that will plague these high-res displays for a couple more years. It seems like they still had to get in there and do some weird custom stuff, and maybe that's just hard or impossible to do in a display that is hooked up to some random GPU from one of Apple's past Macs.
This was very nice when the video connection was DVI, but the current Thunderbolt connections limit it to other systems with Thunderbolt and the appropriate drivers. That pretty much means using the display with other Macs.
My quick review of Apple's tech specs for the new iMac doesn't say whether it has this feature.
"The new Retina iMac does not work as an external display."
Not sure about it having to be an Apple keyboard; it probably works with any keyboard as long as you know which key is mapped to command (usually the "OS" key).
DP 1.2, however, only supports up to 4k resolutions.
An artist friend of mine has what is essentially a white cotton muslin curtain around their work area, which keeps reflections down to near zero (without being in the dark), and they only use glossy displays. It would drive me nuts, but I see the attraction.
AR coatings on touch screens are problematic because skin oils make visible marks, but on non-touch glossy screens they should be standard. Indeed, back in the CRT era, it was routine for high-end CRTs to be coated.
I have noticed that Apple is using a coated screen for at least one model of MacBook Air, but for some reason they don't seem to have fully embraced coated screens.
I was really against buying the glossy screen, but the matte looks washed out compared to the glossy. If your display is at around 60% brightness or above, you really don't have a glare issue.
And if you're using a laptop outside of a home/personal office environment (which laptops are presumably built for), you don't have control over lighting either.
For tablets and mobile devices which generally aren't used for working at long stretches, I guess I can see the appeal. But it's really strange to me that glossy is popular on machines people do use for long-stretch work -- and often in high-glare overhead lighting open-plan offices.
(And not only popular, but increasingly the only available option.)
The annoying thing is that it's nearly impossible to find a glossy screen on the market.
It has a lot of compatibility problems, and is hacky as hell. I doubt Apple would adopt this as an official/first-class solution to anything.
Solving "compatibility problems" is what Apple do best by requiring you have an Apple Mac Pro and an Apple Cinema Display and not supporting any other configuration
When I made an exception and bought a notebook with a glossy display, I regretted it a short time later. Glossy display? Never again!
Funny thing: it was also the case for the first 27" iMac.
Update: it works at 60Hz @ 4K.
The scrolling transition at the top of this page really impressed me.
Monitors keep getting wider, web pages keep getting narrower and longer :-)
I also like how (about 3/4 down the page) they capture the background iMac image on the left and keep it "sticky" during scroll, but not completely stationary -- it still responds and gives slight feedback, which makes it 99% less annoying. Nice.
"4k monitor": 3824 x 2160
"5k monitor": 5120 x 2880
That's a lot of pixels (8.3 million vs 14.7 million).
It might be a good idea to be skeptical about spending >$1,500 on a 27-inch monitor in Q4 2014. It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440, so clearly the reason to upgrade to 5120x2880 is for the extra screen workspace. But unless you have very good vision, you're probably not going to be able to read text at 5120x2880 without zooming. What's the advantage?
For $1,000 you can buy two 27" 2560x1440 monitors, which is a huge amount of workspace. Also, a single $300 midrange GPU can drive both monitors at full resolution. A couple years ago, that was cutting-edge tech, but it cost ~$2600. Also, two monitors offer a better user experience than one monitor, since window management is a bit easier.
Would anyone mind explaining whether the pros of a 5k monitor outweigh the hefty pricetag?
Not once you are used to a retina display. I have both a retina Macbook Pro, and a 27" iMac. It is actually really obvious, especially when looking at text.
> you're probably not going to be able to read text at 5120x2880 without zooming
This is correct, unless you use the standard retina resolution, which is like 2560x1440 except pixel-doubled, so everything is the same size, just much nicer looking.
It would be very nice to be able to display full 4K detail or side-by-side 1080p videos for editing purposes. Photo editing will be great too, though you will have to resort to 'physical zoom' (i.e. moving your head closer) to do serious pixel peeping, rather than the old and probably superior technique of blowing up the pixels to larger than you'd see them normally.
For gaming and such.... I'm not really sure. Downsampling is now the name of the game, i.e. rendering above the display rez and dropping it down, which has nice effects on IQ. Rendering 14.7 million pixels is a hell of a task to start with, and then you start getting all manner of masks, per-pixel effects, etc... the demands really multiply. Plus there's also a focus on high framerates, like 120, which we're not going to see at 4K or 5K for a gooood long while. It'll be years before things catch up at the mid-tier level of processors and GPUs. And 8 GB of RAM, I hardly need add, is entirely insufficient.
tl;dr: for today's flat desktop purposes, probably not necessary but could be nice. for gaming and other purposes, probably reaching too far. but it's still a nice trend.
Although I'm not sure how the display quality compares; that'll be for the likes of AnandTech to test. And other 5k monitors are in the same price range as the iMac (except without the computer part).
On the new 5K iMac you won't perceive a bigger workspace^ than you had on the previous (2560x1440) model, because all UI elements will be scaled up by a factor of two, practically making everything look twice as sharp instead.
^: Actually, just like with the retina MacBooks, you'll have the option to use alternative resolutions up to 5K with and without scaling, effectively creating a larger virtual workspace.
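A rough sketch of how those scaled modes work out (the logical sizes are illustrative, not Apple's exact mode list; OS X renders at 2x the logical size and resamples to the panel when it doesn't match):

    # HiDPI scaled modes: the desktop renders at 2x a logical size, then
    # gets resampled to the panel's native 5120x2880 if it doesn't match.
    NATIVE = (5120, 2880)

    for logical_w, logical_h in [(2560, 1440), (2880, 1620), (3200, 1800)]:
        backing = (logical_w * 2, logical_h * 2)
        note = "exact 2x, no resample" if backing == NATIVE else "resampled to panel"
        print(f"looks like {logical_w}x{logical_h}, "
              f"renders at {backing[0]}x{backing[1]} ({note})")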
I never had this problem before getting the rMBP however, so maybe I'm just getting picky!
But if all you are doing is text editing, you may be right about the value.
On the other hand if you are doing 4K video editing or designing mobile apps, this is a great tool.
My prediction is that once we see Apple deploy Thunderbolt 3 / DisplayPort 1.3 (capable of 25.9Gbps usable bandwidth), we'll see the Apple Cinema Display Retina.
DisplayPort v1.2 can carry around 17.3Gbps (minus overhead), so you would need DP v1.3's effective rate of 26Gbps to support native 5k without compression: 5120 × 2880 × 24 bits × 60Hz ≈ 21.2Gbps, which is more than 17.3 but fits within 26.