iMac with Retina 5K display (apple.com)
615 points by davidbarker on Oct 16, 2014 | 415 comments



Thank you, Apple!

They put the "Retina" display in the iMac. This means people will buy it. Higher volume means whoever (LG, I think?) is manufacturing the screens will have to produce more, driving the cost down. That means they will sell variants. Then their competition will also sell competitive options because nobody will want 1080p on a computer screen anymore.

Monitor technology has been stalled for years. This is going to be a gigantic kick in the pants to the industry!


The main concern with consumer adoption of separate 4k/5k+ screens will be dual-link DVI and HDMI 2.0, since standard HDMI can only drive 4k at a 30Hz refresh rate.


Or you can use DisplayPort. Nothing wrong with that alternative, license-free standard.


Wait for DisplayPort 1.3 for this. It currently uses multi-stream transport to feed the display as if it's two half screens daisy chained together. Sometimes it works, other times it's a trainwreck. At least know what you're getting in to.

Dell's 4K screens came up here yesterday. Here's what I had to say about those: https://news.ycombinator.com/item?id=8459298

I like DP better than HDMI politically, but if you have a screen with HDMI 2.0 support that's a much safer bet than DisplayPort 1.2 is.


Actually, it's not intrinsic to DisplayPort 1.2 that 4k displays must be driven with MST; it's just that for a long time the electronics to decode the full signal were not available, hence the hack.

Starting with the Samsung U28D590D[1], many 4k displays use single stream transport on DP 1.2.

Also, for 5k you actually have to wait for DisplayPort 1.3 (if you want 60Hz), because there isn't enough bandwidth, multi-stream transport or not. There's a neat bandwidth calculator at [2].

4k at 60Hz is 11.94 Gbit/s without overhead; 5k is 21.23 Gbit/s. DisplayPort 1.2 and 1.3 provide 17.28 Gbit/s and 25.92 Gbit/s respectively.

This unfortunately means no 120Hz 5k displays even on DP 1.3 without compression (which is supported).

[1] http://www.pcper.com/reviews/Displays/Video-Perspective-Sams...

[2] http://emsai.net/projects/widescreen/bandwidth/
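
If you want to check those numbers yourself, here's a minimal sketch (assuming 24-bit colour and ignoring blanking and protocol overhead, so real link requirements are a bit higher):

  # Rough uncompressed video bandwidth: pixels per frame * refresh rate * bits per pixel.
  def video_gbps(width, height, hz, bpp=24):
      return width * height * hz * bpp / 1e9

  print(video_gbps(3840, 2160, 60))    # ~11.94 Gbit/s -> fits in DP 1.2 (17.28)
  print(video_gbps(5120, 2880, 60))    # ~21.23 Gbit/s -> needs DP 1.3 (25.92)
  print(video_gbps(5120, 2880, 120))   # ~42.47 Gbit/s -> exceeds even DP 1.3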


Oh, good to know. I'd assumed the MST screens were because they had to use it, but I guess I just jumped on the 4K train a few months before I should have...


So does that mean no 60Hz 5K displays for the 2013 Mac Pro?


Correct. Thunderbolt 2 includes support for DP 1.2[1], and in any case it only has 20 Gbit/s of bandwidth in total, which isn't enough for 5k even without taking overhead into account.

Which makes this announcement quite strange. If this iMac doesn't support Thunderbolt 3 (or whatever standard Apple is planning), it'll never be usable as an external monitor.

IMO, it would have been more Apple-y to launch this and refreshed Mac Pros simultaneously with TB 3, allowing Mac Pros to drive two (or more) 5k displays while the rest of the PC world is still tripping over 4k. But I guess this segment isn't a priority for them any more.

http://en.wikipedia.org/wiki/Thunderbolt_%28interface%29#Thu...


Not even if they use 2 Thunderbolt 2 channels to drive it?


The bandwidth would certainly be there, and there's precedent with monitors requiring dual HDMI to get to 60hz, but I'm not aware of the same thing being done with DisplayPort.

With DisplayPort 1.3 now released, I'd be pretty surprised if manufacturers messed around with dual DP 1.2 inputs for displays like Dell's upcoming 5k. Such things are tolerated when there is no alternative, but when all it takes is a new video card costing a tenth the price of the high-end display, requiring DP 1.3 is the only sensible option. This is extreme early adopter tech after all. 5k is yesterday's 4k.

And needless to say, if I'm speculating it's too messy for Dell, Apple won't touch it with a ten foot pole :)

Still, stranger things have happened. If the demand turns out to be there, the products will follow - it's certainly technically possible.


The 30" Cinema Display had 2 DVI connectors, which was nasty but necessary. So maybe Apple wouldn't be too disgusted by the thought of dual TB.

But if DP 1.3 is around the corner, and dual TB would be an ugly hack that is mutually exclusive with the new standard, I could see them steering clear from that.


>The 30" Cinema Display had 2 DVI connectors, which was nasty but necessary.

Not true. It has one DVI connector, albeit a dual-link one (the DVI connector supports two concurrent channels).

Source: Me looking at the machine my 30" cinema display is connected to.


Interestingly, the first one had one connector, with two DVI channels, _before dual-link DVI existed_. Remember this thing? http://en.wikipedia.org/wiki/Apple_Display_Connector

In principle, Apple could do something similar again, with an adaptor for two Thunderbolt 2 connectors (from separate buses!) In practice, they'll probably just wait for Displayport 1.3, tho.


Patently false claim. The DVI standard always included dual-link, even before Apple added USB and power connections, changed the connector, and passed it off as a unique invention of theirs.


Huh, weird. Don't know why I thought that.


According to a review[0] I read the new Dell 5k display has 2xDP1.2, so there is hope :)

[0]: http://www.maximumpc.com/dells_5k_monitor_pre_reviewed_2014


You don't have to use MST; it's just that most 4k panels are intentionally manufactured to mimic two panels, so they can be fed as two HDMI 1.4 displays that get composited into the whole screen.

I imagine any HDMI 2.0 4k screens won't use it, and even if they are still on DP 1.2, they would present themselves as a single display.


Sadly, in practice there appear to be many things wrong with DisplayPort. I’m writing this on a machine using a workstation-class graphics card to drive two high-end Dell monitors over DisplayPort. In some respects — starting with basics like what happens when you turn things on and off — it is the least satisfactory video set-up I’ve used in many years, despite being more expensive than the last several put together.


In my experience, for a 2560x1600 Samsung monitor, DisplayPort works well. Admittedly, the cable seems to stop working after about 6 months, since I plug and unplug from my laptop twice every day, but a replacement cable fixes it.


In my experience DP doesn't work reliably @ 4k, at least on AMD & Intel GPUs. If you do a bit of Googling you'll see that there are many problems with DP at super high resolutions. I personally went through 5 different brands of DP cables, a DP MST hub and all sorts of other remedies and fixes, and nothing works reliably. I have 3 ASUS PB278Q and the only input that works reliably is DVI.


Except the heinous physical connector.


I just bought a GTX 980 because it supports HDMI 2.0, I am now running my HTPC at 4096x2160 @ 60hz over HDMI. As of right now the only graphics cards you can buy that support HDMI 2.0 are the GTX 970 & 980.


Just built myself a rig for the first time in a decade, with a pair of 980s SLI'd... it kicks like a mule... although unfortunately my AV receiver seems to support the "HDMI 2.0 standard", rather than HDMI 2.0, which is a kick in the nuts.

Now I just need to wait for the consumer version of the rift.


Where did you find a GTX 980 in stock?


If you are lucky enough to live near a MicroCenter -- they seem to have gotten good stocks and still have a few kicking about.


Yeah, that's what I plan on doing this weekend. Wasn't planning to upgrade my GTX680 for a while as it was plenty for GPU acceleration in Adobe apps and for gaming at 2560x1440 but the Rift pushes it to its limit in several applications. Thought about just getting another 680 to put in SLI but my stupid past self didn't buy an SLI motherboard and it's not worth the hassle to buy another mobo, another 680, and rebuild the thing now that the 980's are out.


Got mine from B&H Photo, went for the ASUS.


What display/tv are you using?


LG 65UB9500 with WebOS, it's a pretty great TV, I really like the WebOS interface.


This was my first thought too... considering I had to hack my $6000 D700 Mac Pro into displaying 4k at 56Hz instead of 60Hz to avoid ridiculous noise/tearing on half the screen... I personally wouldn't touch 5k for another year at least. There's no way of knowing because it's not mentioned in the specs, but I'm guessing this new Mac is limited to 30Hz.


Why would you even think this, since they aren't limited to DVI or DisplayPort?


HDMI 2.0 can drive 4k at 60Hz max; it won't be able to drive this 5k iMac.


I'm still waiting for a good non-retina ~30" 3k monitor. I'm currently using a 2560x1440 (HP ZR2740w), and horizontal space is a little tight for dual windows. Retina doesn't really matter for me (I got the retina MBP, but use it at 1680). The only options seem to be the ones linked below, but there aren't any reviews of the 2880 ones, and the 3440 is too wide and only 1440 high, not ideal for dual windows.

1. http://www.amazon.com/dp/B00KG5LWM4 - eq278c, 27"@2880x1620 = 122ppi, $765

2. http://www.amazon.com/AURIA-EQ308C-30-2880-1620/dp/B00KG5LSB... - eq308c, 30"@2880x1620 = 110ppi, $1075

3. http://www.amazon.com/gp/product/B00JR6GCZA - LG 34UM95, 34"@3440x1440 = 110ppi, $999
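
In case it helps, here's roughly how those ppi figures fall out of resolution and diagonal - a quick sketch you can reuse for other panels:

  import math

  # Pixel density from native resolution and diagonal size in inches.
  def ppi(width_px, height_px, diagonal_in):
      return math.hypot(width_px, height_px) / diagonal_in

  print(round(ppi(2880, 1620, 27)))   # ~122 (eq278c)
  print(round(ppi(2880, 1620, 30)))   # ~110 (eq308c)
  print(round(ppi(3440, 1440, 34)))   # ~110 (LG 34UM95)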


Bear in mind you may have set the display's virtual resolution to 1680, which affects apparent font and icon sizes etc., but the physical resolution it's being rendered at is still the full retina fidelity. You shouldn't be seeing jaggies on curves the way you would on a real 1680 display.

I'm in the market for a retina iMac but expect to turn down the virtual resolution a few notches to make the UI a bit more readable. I find the system text on the non retina 27" machines a bit small, but scaling it up on those panels makes it look crap due to scaling artefacts that just shouldn't happen on a retina panel.
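
For what it's worth, my rough mental model of how the scaled HiDPI modes work (the 15" rMBP figures here are the commonly reported ones, so treat this as a sketch rather than gospel): everything is rendered at 2x the chosen virtual resolution and then downsampled to the physical panel.

  # OS X scaled ("looks like") HiDPI modes, roughly: render at 2x the virtual
  # resolution, then downsample the whole framebuffer to the physical panel.
  def scaled_mode(virtual_w, virtual_h, panel_w, panel_h):
      backing_w, backing_h = virtual_w * 2, virtual_h * 2
      ratio = backing_w / panel_w   # >1 means the panel gets a supersampled image
      return backing_w, backing_h, round(ratio, 2)

  # 15" rMBP panel is 2880x1800; the "looks like 1680x1050" mode renders at
  # 3360x2100 and scales that down, so curves stay smooth (no 1680-native jaggies).
  print(scaled_mode(1680, 1050, 2880, 1800))   # (3360, 2100, 1.17)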


>makes it look crap due to scaling artefacts that just shouldn't happen on a retina panel.

Just to clarify: the scaling artefacts are still there; they're just half the size (and one quarter the area).


Quite right, I should have said "shouldn't be visible".


Like you, I'd prefer to turn down the virtual resolution a few notches to make system text and UI elements bigger. I cannot help but notice that 1706 by 960 would be enough virtual resolution for me[1], which on the new iMac would put 9 actual pixels into every virtual pixel. It would be nice if Apple included system fonts and other UI elements adapted specifically for such a 1:3 ratio of virtual size to actual size, which we might call "pixel tripling" in contrast to the "pixel doubling" that Apple has already implemented (a.k.a., HiDPI mode) in which each virtual pixel is made up of a square of 4 actual pixels.

Apple is very unlikely to go through the trouble of implementing "pixel tripling" just to accommodate people like you and me, but wouldn't it be sweet? :)

[1]: A rectangle 1706 pixels by 960 pixels contains .93 as many pixels as one 1680 by 1050 contains.
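
A quick sanity check of those numbers, in case anyone wants to try other virtual sizes:

  # "Pixel tripling": each virtual pixel maps to a 3x3 block of real pixels.
  panel_w, panel_h = 5120, 2880                    # new Retina iMac
  virtual_w, virtual_h = panel_w // 3, panel_h // 3
  print(virtual_w, virtual_h)                      # 1706 x 960 (rounding down)

  # Compare that virtual workspace with a plain 1680x1050 display:
  print(round((virtual_w * virtual_h) / (1680 * 1050), 2))   # ~0.93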


Monoprice is selling UHD 4K 3840x2160 60Hz monitors for $500-600... I've seen similar ones from Samsung on Amazon. http://www.monoprice.com/Product?c_id=113&cp_id=11307&cs_id=...


I want 3k, not 4k. I haven't heard of any 4k monitors that can downsize to 3k without aliasing and/or without first upscaling to a higher resolution.

EDIT: 4k on ~30" is too high resolution. 4k on ~40" is too big/wide.

Also, for the same price, you can get the same 28" 4k Samsung monitor from Amazon (search for samsung u28d590d).


> 4k on ~30" is too high resolution.

What?


He's saying that OS controls don't scale well on the desktop.


> OS controls

You mean Windows. KDE is iffy (but I use it just fine and was able to scale my fonts and frames up enough on my 150 DPI notebook), but GNOME in its latest release has fantastic high-DPI support.

It is an unmentioned consequence of Windows being a piece of shit with regard to DPI that it holds back the desktop display industry, whereas Android's inherent scalability let smartphones race ahead on pixel density.


Ah, Windows* is catching up. But the ecosystem (third-party and some first-party applications) has admittedly been quite slow in making its applications HiDPI friendly. Metro/Modern apps for use on smartphones and tablets are already quite good; desktop apps are catching up.

*disclaimer: MS employee who is just enthusiastic about "viva la resolution."


Why not first upscaling? Your rMBP (and mine) does the same thing, and that makes 1680 look a lot better than native 1680.


I've been using the LG 34UM95 for a month or so attached to a rMBP, and find the ultra-wide screen (3440) delightful. The 1440 height is the same as the current Apple T'bolt Cinema Display.

I really don't need Retina-level resolution, though it'd be pleasant. Maybe in a year or two.

But having a single monitor this wide is really a treat, as I never could stand a dual-monitor setup. One of them is going to be your main monitor, facing you, and one of them will be on one side or the other, somewhat ruining the effect of the combined width.


Actually, having a second monitor to the side at an angle does wonders for neck strain. A single flat monitor, in contrast, means that the edges are going to be pretty far away from your head. Of course, that could be fixed with a curved monitor, which are coming, though the curves aren't very dramatic yet.


Well, having a single wide monitor means I move my neck very little from side to side (maybe 15 degrees each side?). Even my old (60) eyes have no trouble viewing the edges.


We can visualize some geometry here using some highly appropriate ASCII art (-, /, and \ are screens, * is the user):

  -------
     
     *
vs.

    ---
   /   \
     *
The first one requires you to move your neck much more than the last option; the angled setup also provides a better peripheral view. I have a corner desk, so my setup looks like:

    ---
   / 
     *  
     
More or less; I also have a chair that pivots, so I can change my focus quickly without moving laterally. What is really painful is when, to reduce horizontal span, the monitors are put into portrait; then you have up-down neck strain... which is very painful! I actually decided against another 24" as my second monitor for this reason (my secondary is a 21" widescreen in portrait, nice for reading PDFs and email).

I would prefer to have one wide monitor, but with a curve.


Here you go: http://www.lg.com/au/it-monitors/lg-34UC97

I'm not sure. I have a 30" widescreen at work I primarily use in isolation (the laptop next to it is pretty puny by comparison), and dualies in a laptop-monitor-monitor arrangement as in Example 2 at home.

The single monitor is wide and requires much scanning, but I think I prefer the experience to multi-monitor window management. I just end up not using the far right screen except in the most necessary situations (design + code + browser)


I really don't see how the second option changes anything for your neck; on the contrary, it's pretty clear from the picture that it won't. However, the second setup will let you stay roughly perpendicular to the screen over a wider area, reducing the average angle between your eye and the screen surface. It might be more comfortable for some people from a perspective point of view, but not for the neck.


I'd like to second that - I've been using the curved version (LG 34UC97-S, €1,099) with a mid-2014 rMBP and it is awesome. I'm not sure if the curving is necessary, but it certainly is nice.


Overlord x270 OC.

27" 1440p IPS 120Hz. $450.


Not so sure about this. We still mostly have laptops with shit 1366x768 monitors despite "retina" or at least "full hd" screens being around for quite a while...


I bet it's a bigger deal for external monitors. A ton of people buy laptops just saying "Oh, 15 inch screen for cheap, that sounds perfect!" But the sort of consumer who buys an external monitor tends to know what "resolution" means. There's demand for better ones because nobody buys a 27" 1366x768 display.

Given that Dell's 5k 27" screen is priced similarly to these iMacs, I think the economies of scale could help push some prices down.


Plenty of people seem to buy 27" 1080p displays though, which is arguably worse than 768 on the 13-14" screen of a typical cheap laptop.


I like small laptops, and this is the bane of my life; there are so few powerful small laptops out there with good screen resolutions.


I feel like laptop resolution progress stalled in the early 2000s. Maybe mass knowledge of 1080p HD set us back. I guess consumers started thinking "1920 x 1080 is really good resolution" so they were willing to accept lower resolutions as "good enough" for a long time. I had a 1400 x 1050 14" Dell laptop in 2005 but it wasn't until the Retina Macbook Pro in 2012 that laptop resolutions got significantly better. 768 is an absurdly bad vertical resolution. Many serious content creation apps do not work well at all at such a low resolution.


It doesn't help that a large number of Windows apps don't handle high DPI well.


Well and it doesn't help that in the Win7-Win8 timeframe, Microsoft was pushing 1366x768 as THE resolution that machines should be using.

Walk through any best buy and that res will far and away be most common.

I for one welcome our new high-DPI overlords.


Blame Intel for that. When Intel's chips represent a whopping 40 percent of the machine's BOM, there really isn't that much room left for a high-resolution IPS display.


I can't wait to see these available from off-brand Korean resellers. I wonder how much cheaper they'll be.


Similar sized 4K UHD 60hz monitors have been available between $500-$600 for a while now, Monoprice for example: http://www.monoprice.com/Product?c_id=113&cp_id=11307&cs_id=...


Previous Monoprice monitors didn't even have backlight controllers: http://www.anandtech.com/show/7240/monoprice-zerog-slim-27-i...

In the cheapest LCD monitors, the brightness control just changes pixel values and doesn't affect the backlight. Hope this one's better, since the claimed brightness of 350 cd/m^2 is far too bright (you should have less than 120 cd/m^2 in a dark room) and is probably bad for your health.

I'd guess they've fixed it, but there's no way you could tell from their "tech specs", which just list the same huge numbers (1.07 billion colors!) as every other manufacturer's LCD.

It's a little worrying to see a Contrast control, too, since that makes no physical sense.


Brightness is not even close to sufficient on modern displays. Looking at a 700-nit display outside is still at most only half as bright as looking at a sheet of paper in the same light conditions. 350 nits is nowhere near being bad for your health. I can only recommend trying out a high-luminance display.


Or if you want one from a more reputable manufacturer and retailer: http://www.amazon.com/Philips-28-Inch-Monitor-resolution-288...


Is monoprice considered disreputable?

I've been using them for years for inexpensive yet high quality cables, I thought their reputation was pretty well established. Maybe I missed a scandal out there somewhere.


I have nothing except good things to say about them, and have two of their UHD monitors.


I think the post is referring to UHD vs Phillips and not Monoprice vs Amazon. I've used Monoprice a few times in the past and they've been very good.


CrystalPro is the "brand" here, UHD is the resolution UltraHD: https://en.wikipedia.org/wiki/Ultra-high-definition_televisi...


I never said Monoprice was disreputable. I said Amazon was "more reputable",

which I believe is true, as in my experience they have the best return policy anywhere.


Monoprice is universally loved, in my experience.


The problem is, most GPUs can't drive that many pixels, which wasn't the case with the 1440p ones. So there is now a new set of issues with displays like these being used standalone.


I hope some of those displays will be as matte as the old TFTs. I won't buy even an 8K display if it is glossy. For the same reason I won't buy the 5k iMac.


Sitting in front of a Dell U2713HM, perfectly happy. Sharpness, font size, movies... it all seems perfect.

But OK, ask me again once I've been using one of these resolution monsters for a week :-). If ever; I wish :-).


My dual 2412Ms (noticeably less resolution than yours, but still) sit beside a 15" Retina MBP. It is painful to switch between the screens. First world problems, but the good kind.


Same here, plus brightness near zero with no flicker, because there's no PWM.

Many of these new monitors use PWM for brightness, which gives headaches.


And I assume you also mean content providers will be forced to shoot their videos in 4k/5k resolution? (I don't think so.)

I understand this is a big + for monitors. I have used a Dell UltraSharp 32" @ 3840x2160 with DisplayPort and it's fantastic. Larger workspace, better gaming, better multitasking!


I don't care that much about video. Full HD is plenty fine with me.

I care a lot about text, because that's what I consume almost exclusively day in day out at work and in my spare time when I sit in front of a screen. And nothing profits more from high contrast and high pixel density than text rendering.

I just inherited a Dell XPS 13 with a really nice Full HD screen from a coworker who quit. I've got Ubuntu running on it with a scaling factor of 1.5. Not quite Retina, but pretty sweet nonetheless. I was going to use it with an external screen, but after getting used to this laptop I couldn't stand looking at the external screen anymore and got rid of it. Work (programming and writing) is a lot more fun with this setup.

My spouse just got a 13 inch Retina MBP. She's pretty stoked as well. Even her half blind mom can see the difference.

High-resolution screens are great, and the sooner they become standard the better.


I am really curious about the technology behind the 5k iMac. I am not sure there is any off-the-shelf GPU out there that can drive that display using retina-type rendering. It's interesting that they did a custom controller for the display timing (the Timing Controller, or TCON). They must have had to do deep customizations to use the AMD R9 M290X (comparable to the Radeon HD 7870) to drive it. If this is not innovative, I am not sure what is, from an engineering standpoint.


It's very impressive Apple made the effort to do this now instead of waiting for 'off the shelf' parts but the technology behind this is probably nothing to write home about. The biggest limitation of display bandwidth is currently external HDMI/DP standards which are slowly ratified and adopted. For all-in-ones and laptops additional channels can be added and/or clocked higher for additional bandwidth. Apple's TCON is probably a fairly standard part souped up a bit for higher performance to handle the additional channels / clock speeds. I'd be surprised if they are doing anything fancier than this because the jump from 4K to 5K isn't big enough to require it. Close enough that the 'old/current technology' can be stretched. On the video card side as long as the 7870 has enough VRAM to store frames desktop 2D / light 3D compositing isn't that demanding. The trade off here is heavy 3D (modern gaming) at native 5K or even down to 4K is mostly not happening on this machine. Probably a good choice since 2560x1440 at 120Hz+ is a better choice for gaming than higher resolutions at 60Hz.


The jump from 3840x2160 to 5120x2880 is nearly 2x, in pixel count. That is not just a 'tweak existing technology' kind of jump.

TCONs that can handle a single logical 4K display are just hitting the market, and Apple has one shipping today to handle 5K. A timing controller may be simple tech, but it's cool that they're so far ahead of the curve.


It's a big jump in pixel count but a 2X increase in bandwidth can be brute forced / tweaked out of existing technology similar to DL DVI or DP MST. IMO that doesn't diminish it at all because lots of technical problems have somewhat obvious solutions but actually making it happen is still difficult and expensive. Even more so if you want to avoid making any ugly trade offs in the process.


I suppose the custom hardware means there is little chance of getting this to run Linux any time soon?


Not sure, though I really don't mind OSX, and with homebrew, it's pretty close to a 1:1 for most of my use on OSX...

The only thing that irks me a little is the muscle memory for using the option key for cut-copy-paste in the UI vs the actual ctrl key (ctrl-c etc) in a terminal...


You can easily remap it in System Preferences: http://support.apple.com/kb/PH13742


But, then I'm pressing command+c in the terminal window.


That's easy to fix. Use iTerm2, and tell it to remap modifier keys (to unflip them). If you use mission-control shortcuts (like move workspace), configure those keys in iTerm2 to "don't remap modifiers"


You want Karabiner https://pqrs.org/osx/karabiner/


Karabiner is amazing. I love that I can finally recommend it to people and they don't say "I have an iMac…"


@leephillips - good news is that linux installs fairly painlessly on the iMac 5K. I'm running Fedora 20 and it's going really well. Some slight funkiness with sound but I'm sorting that out now. Need hardware accel for gnome shell so required this workaround for the ATI drivers: https://bugzilla.redhat.com/show_bug.cgi?id=1054435


The custom hardware sits in front of an apparently stock standard AMD Radeon video card chip. I doubt it requires any software driver implementation.


VirtualBox is really good, and free.


Well, what is innovative is building an actual 5k display. Or Intel processors at ever-smaller process sizes.

But a shim interface to make one component work with another? That's the business of millions of companies out there.


It may not be innovative, but it's something good that makes Apple consistently leapfrog everyone else on everything other than raw processing power and value. I mean, who else would have done this? Dell? Nope.

Apple is actually pretty awesome like this.


Exactly. Apple tries to do a lot of firsts when they release a new product category, but most of their updates are about putting the effort into the smallest details that matter to users, even if the users don't know about it.

They are better than anyone else at that simply because nobody else really sees the whole product as their own problem. I.e. each hardware/software/service maker is just optimizing their part.


My favorite example of this: a MacBook's headphone socket has full support for iPhone earbuds. The little headset microphone is used and the volume/pause buttons operate exactly as on an iPhone.

Obvious in retrospect, but just another example of how Apple treats their product range as a complete ecosystem.


I think the issue is less that only Apple is capable of making these technological leaps, and more that only Apple is capable of pushing them to a wider audience.

Few companies can get away with such a relatively narrow selection of hardware products. Dell for example has dozens of laptop models. By only having a few models, Apple making a big change to one results in a big chunk of the overall market including that feature.


I would argue that producing the first 5k display that will be purchased by the 'masses' is the real innovation here.


I'm not sure. Why doesn't everyone produce them? The PC industry could drive a lot more innovation because there are so many more Windows PCs. Unfortunately, the race to the bottom means margins are too thin to add an extra $10 port, for example.


I'm most impressed that they can manufacture 15 million pixels on a single panel without a single dead pixel. They must be wasting/repurposing a ton of panel square footage when cutting.


What makes you think they don't have dead pixels? Hasn't Apple had a policy of not replacing panels with dead pixels unless they fell in Class III or Class IV of ISO 13406-2? (http://www.tested.com/tech/1337-we-uncover-the-dead-pixel-po...)


So you mean people buy a Mac that has dead pixels and Apple tells them to go to hell? They give you a no-questions-asked 16 days to return it, or a bit longer than that, and I don't think they'd question a return on an already broken one. You're thinking of people trying to get a warranty repair after a long time of usage. The OP was referring to the fact that when you produce LCD displays you have limitations on panel size, because you get defects every so often and have to throw away some produced panels because of them.


Right - but I'm questioning whether Apple (or any monitor vendor) would throw away a panel that was ISO 13406-2 Class I levels. I.E. a few dead pixels, depending on their nature, and where they are, don't mean you throw away a panel.

Perhaps what happens is they just get resold as Non-Apple Displays. So, the no dead pixel displays go into the premium Apple monitors, but the Class I devices get resold in the white-label market. (This is where you see these great deals on eBay).


I know it's hard to swallow, but it becomes less and less a problem the higher resolution the screen and the smaller the pixel.

I have one on my 15" rMBP, and I do not notice it unless I put my face within 6" of the screen.


I hope you have seen the below from jcheng: I literally just came from my local Apple Store where I got my Retina MBP serviced for two clusters of hot pixels. The Genius had to look up the policy to see if it was covered, and came back saying that Retina MBP screens will be replaced for even a single defective pixel. We were both surprised.


Wow. That is excellent to know. I'll give it a shot!


Would you actually notice a single dead subpixel though at that resolution?


It's not dead pixels that you notice; I have a hot pixel on my Retina MBP, and it's very noticeable. But Apple doesn't warranty a single hot/dead pixel.


I literally just came from my local Apple Store where I got my Retina MBP serviced for two clusters of hot pixels. The Genius had to look up the policy to see if it was covered, and came back saying that Retina MBP screens will be replaced for even a single defective pixel. We were both surprised.


Where are you? Apple's warranty policies often vary from region-to-region.


Oooh, I might have to look into that, thanks. I have 3 or 4 and am still within AppleCare time :-)


If you can get to the panel itself, hot pixels can often be massaged out. Really.


I had an issue with a single dead pixel on a 3-month-old rMBP earlier this year and Apple replaced it (the whole machine!) without any fuss. The Apple tech guy said (and this is pretty much the exact quote) "the rMBP is all about the screen, a dead pixel is not acceptable". Say what you will about Apple, but their support is the best I have ever experienced. If I were you I would book an appointment at an Apple Store and get it fixed. I doubt you will have any problems. Back up everything first and be prepared to do a wipe and get a new machine if that isn't too inconvenient.


As others have said, Apple will absolutely replace the screen for any dead pixels. I had my first crappy LG screen replaced because of two dead pixels, and they told me they'd do it even if it was only one.


When looking closely and displaying a solid color, probably. In practice, maybe not.

Once you notice where a dead subpixel is though, your eye goes straight to it every time it's visible.


There are 14.7 million pixels on the iMac's screen. The definition of a retina display is you can't discern individual pixels at normal viewing distances, so unless you're sitting with your nose against the screen or are using a loupe, you won't be able to see a single dead pixel.


> The definition of a retina display is you can't discern individual pixels at normal viewing distances, so unless you're sitting with your nose against the screen or are using a loupe, you won't be able to see a single dead pixel.

Your conclusion does not follow from your premise.

Just because we can't identify individual pixels does not mean a single pixel has no influence. It shapes the overall picture.

If a single dead pixel is not visible at all, why have that pixel there in the first place -- working or not?

I just tested with my iPhone 6 (a 1334x750 px white picture with a single black pixel in the middle of the picture), and the single black pixel is definitely visible. A screenshot of the iPhone showing the picture in question confirms it's a single pixel on the screen.
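
If anyone wants to repeat the test, here's a minimal sketch using Pillow to generate the image; 1334x750 is the iPhone 6's native resolution, so swap in your own display's numbers otherwise:

  from PIL import Image

  # White image at the panel's native resolution with one black pixel dead
  # centre -- handy for checking whether a single pixel is visible at all.
  W, H = 1334, 750
  img = Image.new("RGB", (W, H), "white")
  img.putpixel((W // 2, H // 2), (0, 0, 0))
  img.save("single_pixel_test.png")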


>The definition of a retina display is you can't discern individual pixels at normal viewing distances

I wonder if this is actually true, the whole "more pixels than photosensitive cells" thing sounds a bit shaky.


Yeah, it's nonsense. There are 90-120 million rods and 5-7 million cones in the human retina.


For one it was never about "more pixels than photosensitive cells".

Second, the number of rods and cones has nothing to do with it.

The "retina" argument was about having smaller than discernible angular pixel sizes, which is true for the majority of people and normal viewing distances.


Put a pattern up. 1 pixel by 1 pixel, white, black, white. You'll see just fine. It won't be "grey".


That's true on conventional panels, but the whole point of retina is that the individual pixels are smaller than the human eye can resolve. I'd be interested to see it in practice. I suppose it depends how far your eye needs to be from the screen for it to count as retina in that way.


The inability to discern an individual pixel with the same color as its neighbors doesn't mean that the same pixel with a vastly different color won't be seen. A single bright green pixel in a white window will still be visible, even if the edges of the pixel itself can't be seen otherwise.


Yes. My rMBP had a dead pixel that stuck out like a sore thumb to me. Whenever you have a solid color on the screen you notice it.


That was my first thought as well. I wonder if Apple has limited the refresh rate to something like 30hz or if they are banking on most apps simply using the downsized resolution instead of the full 5k. I can't imagine that card trying to play any modern game at anything close to that resolution for example.


Of course not. You don't even run recent graphics-intensive games at full resolution on the highest-end MBP with 1/3rd the pixels. We're still some ways from any consumer system taking full advantage of retina displays for high-end gaming.


Are you sure about that? I have a 4K monitor and initially tested it with 3D games at full resolution and they played fine at 30Hz. I switched to an nVidia card and was able to play at 60Hz just fine.

Going to 5K only about doubles the number of pixels, so it sounds like a solvable engineering problem.

Panels are ready and GPUs are ready; the link layer is why you can't go to Best Buy and buy a 5k monitor. DisplayPort and HDMI both suck.


4K gaming is currently reserved for the most high-end desktop GPUs if you want to play any recent/modern games. To get >60fps you'd need two in SLI/Crossfire. An M290X is not even close to that. But even on a 4K display you can easily play at 1080p because it scales down nicely; for 5K you'd probably have to play at 1440p, which is a bit much for the M290X.


Good luck getting >60fps worth of 4k pixels over DisplayPort or HDMI.

I will give my firstborn for an 8k* monitor with a 120Hz refresh rate. He or she will probably have kids when I get that, though.

(4k is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)


> (4k is not that compelling. At 32" it's only 140ppi, which is nothing approaching "retina" levels.)

...one would hope that you're not viewing the 32" 4K monitor at the same distance you would view your phone from. Comparing PPI of a 32" screen to a 6" one is rather meaningless as the viewing distance is going to be vastly different.


But your phone, at retina distances, does not fill your field of view, and putting your face close to a large screen does.

The ultimate high-water mark will be retina-level resolution for devices like the Oculus Rift, allowing you to move your eyeballs all around and see real-life-quality graphics.


Yes, 1080p in the DK2 is worse than I had imagined. But Oculus apparently demoed a device with a 1440p screen, which is already a lot better from what I heard. Still, 4K and above have a totally valid use case for VR.


I sit a little bit more than an arm's length away. It's not enough resolution to turn off anti-aliasing on fonts and still have them look nice. (And I use big fonts.) Circles still look blocky.

The Chromebook Pixel has a nice resolution. That's 240ppi.


I agree. While 32" makes native 4K usable in terms of desktop real estate, 8K in retina mode (basically viewable 4K) would be the endgame for usable monitors, I guess. However, since we are already starting with 5K, I don't think we have to wait as long as you imagine.


> I switched to an nVidia card

Which is where you demonstrated that you are a power user. Most consumer systems don't even come with discrete graphics cards anymore.

Consumer systems took a big backwards leap when integrated graphics became the norm again. They're running several years behind the discrete cards someone like you or I might buy.


All I'm saying is that I think Apple might be able to make it work.


I have a hard time believing this, what games were you playing and what graphics cards are you using? And by 30hz/60hz do you mean your actual frame rate or the monitors refresh rate?


I play Minecraft, WoW, TF2, and L4D2 at 4k@60Hz on the D700 Mac Pro. I set everything to ultra and limit the framerate to 60fps -- it never drops below 50fps.


Alright, I can believe that; the Mac Pro is not a machine that an average user owns, let alone can afford. You have two graphics cards that are most likely more powerful than a single card that an average PC gamer has.

Also, those games aren't too graphically demanding tbh; L4D2 is the newest one and it's over 5 years old.


I don't think we're that far away from consumer systems being able to take advantage of high res displays. This build here (http://pcpartpicker.com/p/HZQm23) can already drive 4k on almost all games that are currently out, and it's only $1600. Large manufacturers like Dell wont be far behind with desktops sporting similar hardware. I'd expect consumer (as opposed to enthusiast) 4k gaming on the desktop to start happening mid next year.


That would require a new GPU generation, while the current one that is 4K-capable (albeit only really smooth in SLI) has just been refreshed.


If you mean extremely demanding games sure, the rMBP is not a gaming machine. But it plays many games quite well. Examples: Borderlands 2 @ 1920x1200 with high settings (Windows), Fallout 3 @ full res and high settings (Windows), Diablo III, Minecraft (more CPU demanding than GPU), Left 4 Dead 2 @ full res with high settings.


OSX as an OS is weaker than Windows at games performance. For example, Valve officially supports Mac & Windows in-house, and running the same game (e.g. TF2) on the same hardware in windows will see framerate increases of 20%-100%.


Why is this? Any technical reasons?


Trying to play any game on the MBPr is a nightmare.


The problem is that the Apple ecosystem tends to be quite limiting in what we can do for gaming compared to other systems.

So I'd rather have the choice among powerful GPUs, with drivers kept up to date with the latest OpenGL versions (separately from OS versions), than have retina support.


Couldn't they do basically what is being done with current 4K monitors? I.e. treat the monitor like a dual screen using two connections?


Most 4k displays do that due to bandwidth restrictions of current display cables. There is unlikely to be any GPU benefit to treating it as two separate displays since you still have to push the same number of pixels.


Except of course that both outputs could come from their own separate GPU. Plenty of dual GPU cards.


So anyone know what the actual native res is?


5120x2880


Serious question here. At what point is the human eye unable to tell the difference between such high res displays?

I'm thinking in the same terms I used to think in when I was doing home audio. My boss had a set of $10K speakers. Sure, they sounded great, but then I listened to a client's $5k speakers and couldn't tell the difference between his and my boss's. It was as if my ears weren't finely tuned enough to tell.

Of course if you ask my Boss about the difference, he'd take an hour to tell all the differences. To me, the lay person at the time, I couldn't tell.



That chart should really be zoomed to about 1/5 its size... it goes up to 150 inch? Who has displays that big?

The X axis should go from 10 to 60, the Y axis should go from 1ft to 30ft, max.

As it is, it's not all that useful for figuring out what the density limits of a desktop monitor are. (Average desktop monitor sizes occupy maybe 1/12 of the X axis!)


The chart wasn't made for desktop monitors. It was made for TVs. It's from this article: http://www.cnet.com/news/why-ultra-hd-4k-tvs-are-still-stupi...


Well, with a 4K display you're able to sit closer to larger ones because you won't detect the pixels as easily. I know many people think you want more distance with a larger set, but with 4Ks they are just fine, as long as you don't have to move your eyes too much to see the entire screen.


That is a very nice chart.

But gee, one length (viewing distance) is in different units than the other length (screen diagonal size). Hilarious.


I'm not entirely sure I follow why you feel this is significant. At least in the US, we tend to measure viewing distance in feet and diagonal screen size in inches, and those are both measures of length. It's not like the two axes are "feet" and "fluid ounces."


Even if it were fluid ounces, that's still not a problem - a line chart of my account balance would have euros on the y-axis and months on the x-axis. It's still a meaningful chart, though...


What does "full benefits" mean?


Here's the full article: http://carltonbale.com/1080p-does-matter/

It doesn't really say, but I would guess it has to do with when you start to see individual pixels.


Thanks!


For a normal viewing distance when using a desktop PC, 4k on 24" or 5K on 27" is pretty much the sweet spot, and I am pretty sure Apple won't go above that anytime soon because the benefits are diminishing.


A typical generous estimate of human visual acuity is 0.3 arcminutes. As in, any pixel density beyond that is probably not visible to us. It's not an exact measurement since there's disagreement over how to properly measure it. But regardless, most estimates are in the 0.3-0.4 range.

Then simple trig tells us that we are maxed out when:

(distance to screen) * sin(0.3 arcminutes) = pixel size

The new iMac has a 27 inch diagonal, and 5874 pixel diagonal, so its pixel size is 0.004596 inches. Plug that in above, and you can see that you would have to be 53 inches from the screen to be maxing out your visual abilities. Anything closer (like normal) and you actually could still benefit from higher resolution.
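
Here's the same arithmetic as a small script, if you want to plug in your own display or acuity estimate:

  import math

  # Distance at which one pixel subtends the given visual-acuity angle;
  # beyond this distance, finer pixels stop being resolvable.
  def max_useful_distance(diag_in, res_w, res_h, acuity_arcmin=0.3):
      pixel_size = diag_in / math.hypot(res_w, res_h)   # inches per pixel
      angle_rad = math.radians(acuity_arcmin / 60.0)
      return pixel_size / math.sin(angle_rad)

  # Retina iMac: 27" diagonal at 5120x2880 -> ~0.0046" pixels, ~53" distance.
  print(round(max_useful_distance(27, 5120, 2880), 1))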


Depends on how far away the monitor is.


wow, you swallowed a lot of bullshit marketing :(

>did a custom controller for the display timing (Timing Controller (TCON))

Apple doesn't produce displays, and they don't design TCONs - a TCON is a PCB sitting directly behind the glass driving the individual crystals, and it is designed and manufactured by the same company making the display. Also, every TCON is custom, built to drive a particular type of screen.

Yes, when Dan Riccio says "we manufactured this screen" he is LYING TO YOU. They bought the whole thing from LG or Sharp.

AMD doesn't support DP 1.3 yet, so the only way to get 5K resolution is bonding two 3K screens together; this is so innovative that IBM did it 15 years ago in the T220.

> retina type rendering

You mean scaling down? This is what GPUs do best, out of the box.

Yes, this screen is amazing. But don't act like it's something revolutionary touched by the Noodly Appendage.


Apple has had its own display engineers since the '80s. Yes, they use several off the shelf components, but they're also known to create their own chips and oversee specialized manufacturing for particular models/challenges. Case in point was the 21" Apple CRT from the late '90s that featured a Sony Trinitron but also had custom Apple chips that auto calibrated as the display aged based on Apple's ColorSync technology. This looks like a similar collaboration.

You really think the largest company in the world (by market cap) doesn't actively collaborate with its source companies and do custom designs created by its internal engineers together with the supplier's engineers?


None of which addresses the points raised by the parent? And yes, we all know Apple has a big market cap and lots of engineers...


The points raised by the parent don't need to be addressed because they are just innuendo.


Then why respond to parent at all if you're not actually responding?


Because the response reveals the innuendo. There is no need to credit it with some kind of point by point analysis.


> Yes, when Dan Riccio says "we manufactured this screen" he is LYING TO YOU. They bought the whole thing from LG or Sharp.

It's not a lie if "we" refers to Apple together with its partners or "manufacture this screen" refers to the combination of LG/Sharp's screen with Apple technology.

Is it a lie when Google says it created Android without mentioning the creators of Linux or Java? Is it a lie if Intel says it manufactures a CPU without mentioning the manufacturers of the silicon wafers?

Dude, it's marketing, which I'm not a fan of. But be fair.


Okay nobody seriously looks at Apple as a panel manufacturer, unless you (and Apple) are referring to the fact that the company has a stake in Sharp. The plant and expertise required to build panels is clean room stuff that AAPL just does not do. It does industrial design and software. Not "manufacturing of screens" so the point is clear. Android's skin is distant enough from Java and Linux that it can credibly be called a creation of Google. Just as Audi can say it built the car without having to credit the inventor of the internal combustion engine, and instructing a supplier to build a part to its specifications is hardly "manufacturing" it. It is "specifying" it. As every Apple product owner knows: "Designed in California, MADE in China". In this case, made in Japan/korea. As such, "we manufacture" is skirting close to the edge of a lie if "we" is spoken at an Apple branded press conference with no mention whatsoever of partners.


As someone who has worked in Apple's hardware engineering for years, you're pretty much 100% wrong in everything you've said.


Yeah, they do in fact make TCONs.


They didn't manufacture it, they designed it. How many tech companies design and manufacture all of their own stuff? Any? They didn't "buy the whole thing from LG or Sharp"; they designed and developed it and then used them as a manufacturing partner to produce it.

You have really swallowed the Apple Haterade, it seems.


Yes, just like they designed the iPad Retina screen, except it's designed and manufactured by Sharp.


An auto-correlated downvote trend on a perfectly reasonable point: AAPL is not very innovative, as Dell is launching the same monitor. Sure, the driver logic might have been the subject of some consulting, but if anybody thinks it will not appear within weeks on competitor brands they're kidding themselves. Nothing about this display is proprietary AAPL tech.


The Dell monitor is apparently four 2560x1440 panels (a fairly common approach to early-stage 4k monitors/TVs as well). Anandtech were told that the iMac is controlled as a single unit, so it's probably rather different to the Dell, and yes, they probably _do_ have a proprietary controller.


As is discussed above, though, even if such displays appear, who will have anything to plug them into? Yes, some people will be able to go find a specific kind of video card to plug into their homebuilt rig to drive it, but who will be offering something like this that normies (95%+ of the market) will comfortably and confidently be able to just go and buy? That counts for something.


Any card with two outputs will do, just like with the IBM T220 from around 2000.


Perfectly reasonable speculation/opinions, but there are no technical details available at this point about the internals of Dell's upcoming display that uses the same or a similar panel, or of the iMac Retina. You might be getting downvoted for your reasonable speculation/opinions because you presented them as fact. Online discussions can be of higher quality when everyone agrees on the difference between fact and opinion.


Don't be so condescending.


FWIW, the only other 5k monitor I can find is from Dell. It's also $2500.[1]

In other words, buy a Retina Cinema Display, get the computer to power it for free.

[1]: http://www.maximumpc.com/dells_5k_monitor_pre_reviewed_2014


The 5k display would be more versatile, however. Well, I guess that’s relative to …

I can understand why Apple is not in a hurry to sell this just as a display. This display likely could not be driven at all by any of their most popular Macs: the MacBooks and the Mac mini. Maybe it would work with the Mac Pro, but that's it. And the Mac Pro is probably not the best-selling Mac, by a large margin. So they would basically be selling a display that only works with the Mac Pro.

I can understand how that might seem like a wasted effort until at least a couple more widely selling Macs support the display. Maybe when the MacBook Pros can do it they will start selling just the display.

With the iMac the advantage is that Apple is delivering it with a computer that definitely can push those pixels, so you dodge all the compatibility woes that will plague these high-res displays for a couple more years. It seems like they still had to get in there and do some weird custom stuff, and maybe that's just hard or impossible to do in a display that is hooked up to some random GPU from Apple's past Macs.


Prior 27" iMacs have had the ability to run as external displays in Target Display Mode.

This was very nice when the video connection was DVI, but the current Thunderbolt connections limit it to other systems with Thunderbolt and the appropriate drivers. This limits it to pretty much using the display with other Macs.

My quick review of Apple's tech specs for the new iMac doesn't say whether it has this feature.


https://twitter.com/panzer/status/522840005677293568

"The new Retina iMac does not work as an external display."


I had the worst time getting Target Display Mode to function sanely. I had thought that it was some kind of hidden systems feature or something, but at least at the time it required an Apple keyboard be attached and a specific key sequence be pressed during startup to make it happen.


On my mid 2011 iMac ⌘ + F2 enables target display mode once something is plugged in to the mini display port. Nothing needed at startup.

Not sure about having to be an Apple keyboard- probably works with any keyboard as long as you know which key is mapped to command. (Usually the "OS" key).


FWIW, Apple's support for Target Display Mode has not been updated to exclude the new Retina iMac: http://support.apple.com/kb/ht3924#4


Thunderbolt is backwards compatible with DisplayPort.

DP 1.2, however, only supports up to 4k resolutions.


You can use a Thunderbolt computer (which are rare) with a DisplayPort display, but you cannot use a DisplayPort computer with a Thunderbolt display!


While you can use a DisplayPort display with a Thunderbolt iMacs, you cannot use target display mode on a Thunderbolt iMac with a DisplayPort 1.2 PC (or pre-thunderbolt mac).


Can you buy the monitor separately? It looks like a dream display to me, assuming that it's glossy, which is usual for Apple. It's very difficult to find glossy monitors these days. I just can't go back to matte after getting spoiled with glossy.


Honest question: Why do you prefer a reflective surface over a matte on a screen? In my experience it only reflects the room and makes the content harder to see.


You get more saturation on glossy displays. Matte surfaces work by diffusing the light that is incident on the screen but that diffuser also attenuates the light coming out of the display. Less light, less contrast, and less saturation.

An artist friend of mine has what is essentially a white cotton muslin curtain that goes around their work area which keeps reflections down to near zero (without being in the dark) and only uses glossy displays. It would drive me nuts but I see the attraction.


Glossy requires you to exercise better control of room lighting, but the matte coating causes white backgrounds to sparkle and text to be hazy. Especially on a retina-level display where the text is otherwise so sharp. I used a 2414Q in a Microsoft store, and text looks nowhere near as sharp as it does on my Macbook Pro Retina, despite similar pixel density.


Microsoft uses different text rendering algorithms than Apple. You should compare a picture with text generated on OS X, not text as seen in applications.


In my experience, colors just look better on glossy displays. Since glossy screens are ubiquitous on phones and tablets, I would figure most people are used to the reflection issue by now. Personally, I don't notice reflections unless the screen is off, but it is obviously going to depend on your room.


I have a different question: why hasn't Apple been using optical anti-reflective coatings on all their non-touch glossy displays -- especially the Thunderbolt display? And: is this 5K iMac AR coated?

AR coatings on touch screens are problematic because skin oils make visible marks, but on non-touch glossy screens they should be standard. Indeed, back in the CRT era, it was routine for high-end CRTs to be coated.

I have noticed that Apple is using a coated screen for at least one model of MacBook Air, but for some reason they don't seem to have fully embraced coated screens.


I've used a Mac with a glass screen since 2011 and it really hasn't given me any issues. Sometimes at the office I may have issues with glare, or if I sit with a window behind me. If I adjust the angle of my monitor the issue is immediately resolved.

I was really against buying the glossy screen, but the matte looks washed out compared to the glossy. If your display is at around 60% brightness or above then you really don't have a glare issue.


If you can control the light, a glossy display will have minimal glare and none of the downsides of a matte display.


Of course, if you work in an open-plan office -- and let's face it, despite the evidence that it's an unforgivably stupid way to set things up, most of us do -- you can't control the light.

And if you're using a laptop outside of a home/personal office environment (which laptops are presumably built for), you don't have control over lighting either.

For tablets and mobile devices which generally aren't used for working at long stretches, I guess I can see the appeal. But it's really strange to me that glossy is popular on machines people do use for long-stretch work -- and often in high-glare overhead lighting open-plan offices.

(And not only popular, but increasingly the only available option.)


The color and contrast is way better than any matte screen I've seen. Reflections are not a big problem for me, but of course this depends on your computer setup and where your windows are located etc.

The annoying thing is that it's nearly impossible to find a glossy screen on the market.


Different strokes for different folks, but I find it funny that you think your preference is the one the market is ignoring lately. Every display Apple makes is glossy only, as is nearly every TV these days. I really wish I could get a matte rMBP and a matte 60" HDTV, but they just don't make them anymore...


I don't see how you can reach that conclusion. I want to buy a glossy monitor, not a glossy television screen. The only well known option is a monitor manufactured by Apple, but they're really expensive compared to typical quality matte screens. It's really hard to find even a single alternative. I found a single HP monitor which maybe has a glossy screen, but they don't state that clearly.


I agree that it is harder to see, especially if you have a backlight. If you have a well-lit room, with light coming from the top in a non-reflective angle, the display is vastly better than the matte one. (Source: I have both Apple Thunderbolt Display and old-school Apple Cinema Display with matte finish)


Not yet it seems. They haven't released an update to their Thunderbolt Display in years (since 2011), but it should be right around the corner with the release of this 5K iMac.


They would have announced it today if that were the case. Currently there isn't a display transport capable of pushing that many pixels. DisplayPort 1.2 can't do it over a single port, and that's what Apple ships on its MacBooks and the Mac Pro.


Don’t both MacBook Pro and Mac Pro have multiple Thunderbolt ports? Don’t those work as separate DisplayPorts? At least on the Mac Pro?


Yes, and that's already being used by the first batch of 4K monitors and is called MST. This models the monitor as 2 lower-resolution displays and drives them with two separate signals.

It has a lot of compatibility problems, and is hacky as hell. I doubt Apple would adopt this as an official/first-class solution to anything.
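As a rough illustration of why (my example, with assumed numbers for a typical 4K MST panel), the display exposes itself as two tiles, each driven by its own stream, and the OS has to stitch them back together:

  // Illustration only (assumed numbers): an MST 4K display presents two tiles,
  // each fed by a separate DisplayPort stream.
  type Tile = { x: number; y: number; w: number; h: number };
  const tiles: Tile[] = [
    { x: 0,    y: 0, w: 1920, h: 2160 },  // "left monitor"
    { x: 1920, y: 0, w: 1920, h: 2160 },  // "right monitor"
  ];
  // Anything that assumes one display per connector (mirroring, hotplug, sleep/wake)
  // now has two half-displays to reason about -- hence the compatibility problems.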


Exactly. They can get away with it in the iMac 5K because it's all baked into one tight enclosure.


> It has a lot of compatibility problems

Solving "compatibility problems" is what Apple does best: by requiring you to have an Apple Mac Pro and an Apple Cinema Display, and not supporting any other configuration.


They only have one bus.


The Mac Pro has two buses, but the MacBook Pro has only one – so the fact that there are two Thunderbolt ports doesn't really help you in this case.


It depends on the application. For games, photos, and videos, glossy may be fine. But if you work a lot with text (as a developer, etc.), then matte displays are far better for your eyes.

When I made an exception and bought a notebook with a glossy display, I regretted it a short time later. Glossy display? Never again!


> In other words, buy a Retina Cinema Display, get the computer to power it for free.

Funny thing, that was also the case for the first 27" iMac.


I think it's a supply issue. Once that's worked out, I'm sure they will release just the 5K monitor. I don't think they want to keep telling customers who ask for a 5K monitor to go buy a Dell.


I can't imagine it'll stay that way for long. 4k displays are already down to $300-500 at Amazon.


Shitty 4k displays have been cheap for a while. It's not a reasonable comparison. (My no-name brand Korean 27" IPS display was great until it frizzed out after 9 months.)


Dell's 24" UP2414Q is an IPS 4K monitor with 99% AdobeRGB and 100% sRGB coverage, and its 185 PPI panel is, I'd argue, already in the "retina" range. At ~€555 it's not exactly in the $300-$500 range, but it's close, so I wouldn't call all 4K monitors around this price "shitty" at all.

Update: it works at 60Hz @ 4K.


It was twice as expensive at launch, though. I suspect the new Dell model will follow suit.


You can buy a good 24" display for a lot less than a similar 27" display. That's a whole different discussion.


Sure, but we weren't discussing screen sizes, we were discussing 4K monitors in a certain price range.


You could buy quite a few shitty 4k displays for the price of one of these, though. Enough to get one that escapes the bathtub curve, certainly.


Typically 30Hz ones, so not suitable for general computer use. 60Hz panels still tend to be fairly expensive.


IPS at 60fps?


Yes, via DisplayPort 1.2.


What I mean is: does he know of high-quality IPS 4K displays that can do 60fps for $500?


I'm normally firmly in the "I hate websites that mess with the normal scrolling of a page" crowd.

The scrolling transition at the top of this page really impressed me.


Interestingly done - the huge image is broken up into 15 separate canvas elements and then the containing DIV is scaled with CSS transforms as you scroll. Not immensely clever, but a solution which performs pretty well in Chrome on my Mac, at least.
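A minimal sketch of that technique (the element name and scroll distances are mine, not Apple's actual code): scale a container with a CSS transform as the page scrolls, so the browser can composite it on the GPU instead of repainting the large image.

  // Hypothetical scroll-linked zoom in the spirit of the effect described above.
  const hero = document.querySelector<HTMLElement>('.hero-tiles'); // container holding the canvas tiles
  window.addEventListener('scroll', () => {
    if (!hero) return;
    const progress = Math.min(window.scrollY / 600, 1);     // 0..1 over the first 600px of scrolling
    hero.style.transform = `scale(${1 + 0.5 * progress})`;  // transform only: no re-layout or repaint of the tiles
  });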


Zooming the whole page out to 33% shows an interesting perspective on the design.

Monitors keep getting wider, web pages keep getting narrower and longer :-)


This is because your eyes never get wider, so it makes sense that the content would never get wider either. I feel like web apps are the only real use case (aside from experiments) where a fluid width (the content grows with the screen) is appropriate.


I agree, this is the first one I've seen that didn't annoy me, including some of Apple's past efforts (e.g. the Mac Pro page).


You'll probably like this then: http://acko.net/


You were correct. That is really fun to play with.


It actually has meaning, unlike most scroll captures.

I also like how (about 3/4 down the page) they capture the background iMac image on the left and keep it "sticky" during scroll, but not completely stationary -- it still responds and gives slight feedback, which makes it 99% less annoying. Nice.


Very impressed too, pretty stunning design. Really memorable.


I was curious what the exact resolutions were. A quick Google search claims:

"4k monitor": 3840 x 2160

"5k monitor": 5120 x 2880

That's a lot of pixels.

It might be a good idea to be skeptical about spending >$1,500 on a 27-inch monitor in Q4 2014. It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440, so clearly the reason to upgrade to 5120x2880 is for the extra screen workspace. But unless you have very good vision, you're probably not going to be able to read text at 5120x2880 without zooming. What's the advantage?

For $1,000 you can buy two 27" 2560x1440 monitors, which is a huge amount of workspace, and a single $300 midrange GPU can drive both at full resolution. A couple of years ago that was cutting-edge tech, but it cost ~$2,600. Two monitors also offer a better user experience than one, since window management is a bit easier.
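For scale, here's a quick back-of-the-envelope comparison of raw pixel counts (my arithmetic, using the commonly quoted resolutions):

  const pixels = (w: number, h: number) => w * h;
  pixels(5120, 2880);      // ~14.7 million (5K)
  pixels(3840, 2160);      // ~8.3 million (4K UHD)
  2 * pixels(2560, 1440);  // ~7.4 million (two 1440p monitors side by side)

So a single 5K panel has roughly twice the pixels of the dual-1440p setup, though as other comments note, by default those extra pixels buy sharpness rather than extra workspace.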

Would anyone mind explaining whether the pros of a 5k monitor outweigh the hefty pricetag?


> It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440

Not once you are used to a retina display. I have both a retina MacBook Pro and a 27" iMac. It is actually really obvious, especially when looking at text.

> you're probably not going to be able to read text at 5120x2880 without zooming

This is correct, unless you use the standard retina resolution which is like 2560x1440 except doubled, so everything is the same size just much nicer looking.


I would say the pros are not there just yet; it would certainly look nicer, but not a whole lot nicer. Moving from 1366x768 on laptops to 2560x1560 (or whatever it is, can't remember off the top of my head) was a great shift. This isn't quite as big a deal.

It would be very nice to be able to display full 4K detail or side-by-side 1080p videos for editing purposes. Photo editing will be great too, though you will have to resort to 'physical zoom' (i.e. moving your head closer) to do serious pixel peeping, rather than the old and probably superior technique of blowing up the pixels to larger than you'd see them normally.

For gaming and such.... I'm not really sure. Downsampling is now the name of the game, i.e. rendering above the display rez and dropping it down, which has nice effects on IQ. Rendering 14.7 million pixels is a hell of a task to start with, and then you start getting all manner of masks, per-pixel effects, etc... the demands really multiply. Plus there's also a focus on high framerates, like 120, which we're not going to see at 4K or 5K for a gooood long while. It'll be years before things catch up at the mid-tier level of processors and GPUs. And 8 GB of RAM, I hardly need add, is entirely insufficient.
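To put the gaming point in numbers (rough arithmetic of my own, counting nothing but raw pixel throughput):

  const mpixPerSec = (w: number, h: number, fps: number) => (w * h * fps) / 1e6;
  mpixPerSec(1920, 1080, 120);  // ~249 Mpix/s -- a "fast 1080p" target
  mpixPerSec(5120, 2880, 60);   // ~885 Mpix/s -- 5K at just 60fps, roughly 3.5x the fill rate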

tl;dr: for today's flat desktop purposes, probably not necessary but could be nice. for gaming and other purposes, probably reaching too far. but it's still a nice trend.


For $450 you can buy a 28" 4k monitor: http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c...

I'm not sure how the display quality compares, though; that'll be for the likes of AnandTech to test. And other 5K monitors are in the same price range as the iMac (except without the computer part).


>It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440, so clearly the reason to upgrade to 5120x2880 is for the extra screen workspace.

On the new 5K iMac you won't perceive a bigger workspace^ than you had on the previous (2560x1440) model, because all UI elements will be scaled up by a factor of two, which in practice just makes everything look twice as sharp.

^: Actually, just like with the retina MacBooks, you'll have the option to use alternative resolutions up to 5K with and without scaling, effectively creating a larger virtual workspace.
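A rough sketch of how that works out numerically (my illustration, with assumed mode values; macOS generally draws HiDPI UI at 2x the "looks like" size and, for non-default modes, resamples the result to the panel):

  // Hypothetical illustration of HiDPI modes on a 5120x2880 panel.
  const backing = (looksLike: { w: number; h: number }) =>
    ({ w: looksLike.w * 2, h: looksLike.h * 2 });  // UI is drawn at 2x the point size

  backing({ w: 2560, h: 1440 });  // 5120x2880 -> matches the panel 1:1 (the default mode)
  backing({ w: 3200, h: 1800 });  // 6400x3600 -> downsampled to 5120x2880: more workspace, slightly softer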


"It's difficult to notice any pixelation on a 27" screen at a resolution of 2560x1440" - wouldn't say that's true - they screens are good, but when you switch between any phone, any tablet, a rMBP and the current iMacs you really do notice the low resolution.

I never had this problem before getting the rMBP however, so maybe I'm just getting picky!


It might be difficult for you, but it's very clear to me.

But if all you are doing is text editing, you may be right about the value.

On the other hand if you are doing 4K video editing or designing mobile apps, this is a great tool.


I would think for graphic design work the extra resolution would be well worth the extra price.


This might be off-topic, but I instantly recognized the waterfall image at http://www.apple.com/imac-with-retina/ as being http://en.wikipedia.org/wiki/Sk%C3%B3gafoss / https://www.google.com/search?q=Skógafoss&tbm=isch :)


Ah, well spotted :) Iceland's waterfalls seem very popular. "Dettifoss" is another one, featured in the opening scenes of Prometheus.


OS X 10.11, Skógafoss! :-)


Hopefully they'll release an updated 5K version of the Thunderbolt Display. They haven't updated it in years; I was really hoping for it today.


As I mentioned in the other thread, there's just not enough bandwidth to drive 5K at 60fps over a single bus / single cable with DisplayPort 1.2 without compression (DP v1.2 gives about 17.3Gbps of usable bandwidth, which is just not enough). The iMac lets them bypass that bottleneck by interfacing directly with the graphics card.

My prediction is that once we see Apple deploy Thunderbolt 3 / DisplayPort 1.3 (capable of 25.9Gbps of usable bandwidth [1]), we'll see the Retina Apple Cinema Display.

[1] http://www.extremetech.com/electronics/190130-displayport-1-...
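The arithmetic behind those numbers, roughly (my calculation, assuming 24-bit color and ignoring blanking/protocol overhead):

  const gbps = (w: number, h: number, hz: number, bitsPerPixel = 24) =>
    (w * h * hz * bitsPerPixel) / 1e9;

  gbps(3840, 2160, 60);  // ~11.9 Gbit/s -- fits within DP 1.2's ~17.3 Gbit/s
  gbps(5120, 2880, 60);  // ~21.2 Gbit/s -- needs DP 1.3's ~25.9 Gbit/s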


That's why many people were expecting the iMac to come first. Since it's an integrated machine, the panel can be connected directly to the graphics card via LVDS (or whatever the current version is), bypassing the limitations of the external cabling that the Mac Pro or (for an external monitor) a MacBook Pro have to deal with.


Thunderbolt and DisplayPort 1.2 can't push that many pixels. Need to wait for DisplayPort 1.3.


Can their current Thunderbolt connections drive a display of this resolution?


Not 60fps over a single cable / single bus without compression.

DisplayPort v1.2 can carry around 17.3Gbps (that's after overhead), so you would need DP v1.3 with an effective rate of 26Gbps to support native 5K without compression [1].

[1] http://www.extremetech.com/electronics/190130-displayport-1-...


What about dual cables in MST?


This seems like a rather large oversight.


They have to update their notebooks with Thunderbolt 3 before they can release 5K Thunderbolt Display. Thunderbolt 2 doesn't have enough bandwidth to push the pixels.


It's a technical limitation, not an oversight. No current Mac can output that much data over a single port.


With the amount of custom chips they needed for the display, it wouldn't surprise me if there are issues getting it to work as an external display that they haven't solved yet.


I think they might be waiting for a bigger refresh of the Mac Mini to power it.

