

The 5K Retina iMac’s screen runs at 60Hz at 5K resolution - lelf
http://arstechnica.com/apple/2014/11/yes-the-5k-retina-imacs-screen-runs-at-60hz-at-5k-resolution/

======
shurcooL
I'm very glad to see Ars Technica using the Blur Busters tool for testing the
monitor's refresh rate!

It was initially a Win32 exe that I created 5 years ago [1], built around a
simple idea: display very distinct frames. Back then, people were using all
kinds of silly, less accurate methods like timers to do these things. Blur
Busters did a great job of making the tool more accessible to more people
(visit a website vs. download and run an unknown exe). They were even
respectful enough to ask for my permission before adding it to their website
(which I was of course happy to give, and not that I would mind too much if
they hadn't asked; it's a very simple idea after all).

[1]
[http://hardforum.com/showthread.php?t=1423433](http://hardforum.com/showthread.php?t=1423433)
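
For anyone wondering what "display very distinct frames" means in practice,
here is a minimal browser sketch of the idea (illustrative only; it is neither
the original Win32 tool nor the actual TestUFO code): light exactly one square
of a grid per animation frame, then photograph the screen.

```typescript
// Minimal frame-skipping test sketch (illustrative, not the TestUFO source).
// Each requestAnimationFrame callback lights the next square in a grid, so
// every displayed frame is visually distinct. Photograph the screen with a
// short shutter: gaps in the lit sequence mean the display skipped frames.
const COLS = 10;
const ROWS = 6;
const SIZE = 40;

const canvas = document.createElement("canvas");
canvas.width = COLS * SIZE;
canvas.height = ROWS * SIZE;
document.body.appendChild(canvas);
const ctx = canvas.getContext("2d")!;

let frame = 0;

function drawFrame(): void {
  // Clear, then light exactly one square for this frame.
  ctx.fillStyle = "black";
  ctx.fillRect(0, 0, canvas.width, canvas.height);

  const index = frame % (COLS * ROWS);
  const x = (index % COLS) * SIZE;
  const y = Math.floor(index / COLS) * SIZE;
  ctx.fillStyle = "white";
  ctx.fillRect(x, y, SIZE, SIZE);

  frame++;
  requestAnimationFrame(drawFrame);
}

requestAnimationFrame(drawFrame);
```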

~~~
jmpeax
I don't quite understand how it works. My guess is that it doesn't actually
perform any test itself; you're the one deciding whether the refresh rate is
actually 60Hz. The big green "VALID" doesn't say your monitor is at 60Hz, but
that it is drawing at 60Hz to the frame buffer, and you can then observe the
motion to decide for yourself whether it is smooth enough that the refresh is
running at 60Hz?

~~~
ojbyrne
The comment at the bottom adds the key piece of information - you need a
camera whose shutter speed you can set to 1/60th of a second.
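
Rough arithmetic behind the 1/60th of a second figure, as a toy sketch
(illustrative only, not part of the actual test, and the helper names are
hypothetical): with the shutter open for exposure seconds, a display
refreshing at refreshHz shows roughly exposure * refreshHz consecutive frames,
so the photo should contain that many consecutively lit squares with no gaps;
a gap means a skipped frame.

```typescript
// Toy sketch of reading the photo (illustrative; hypothetical helper names).
function expectedLitSquares(exposure: number, refreshHz: number): number {
  // Number of refreshes that fit in one camera exposure.
  return Math.round(exposure * refreshHz);
}

function hasFrameSkips(litIndices: number[]): boolean {
  const sorted = [...litIndices].sort((a, b) => a - b);
  // Any jump larger than 1 between consecutive lit squares is a skipped frame.
  return sorted.some((v, i) => i > 0 && v - sorted[i - 1] > 1);
}

console.log(expectedLitSquares(1 / 60, 60)); // 1 square for a true 60Hz panel
console.log(hasFrameSkips([14, 15, 17]));    // true: frame 16 was skipped
```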

------
Sephr
> Even when using SwitchResX to force the display out of HiDPI mode and into a
> non-scaled 1:1 5120x2880 resolution, SwitchResX continues to show 60Hz (I’d
> include a screenshot, but it looks identical to the previous one).

So they used Blur Busters multiple times with multiple display configurations
and not a single time did they actually think to read any of the bright red
instructions.

This is not what I've come to expect from Ars Technica.

~~~
kenferry
Where are the bright red instructions? I'm interested in this, but I'm having
trouble finding an explanation of how to use
[http://www.testufo.com](http://www.testufo.com), or of what it's measuring.

~~~
tyilo
See this for the specific test used in the article:
[http://www.testufo.com/#test=frameskipping](http://www.testufo.com/#test=frameskipping)

The instructions are at the top.

~~~
kenferry
Thanks!

------
staunch
The test used:

[http://www.testufo.com/](http://www.testufo.com/)

Created by the person running Blur Busters, who is doing really neat stuff.

[http://www.blurbusters.com/](http://www.blurbusters.com/)

The forums have some fascinating discussions:

[http://forums.blurbusters.com/](http://forums.blurbusters.com/)

I think a lot of other HN'ers would like it.

------
jrockway
Yes. The reason laptops and all-in-ones get the high-resolution panels is that
the manufacturer gets to build the source, sink, and transport layer, and isn't
stuck with what a standards committee can make for them. (Details here:
[https://news.ycombinator.com/item?id=8549629](https://news.ycombinator.com/item?id=8549629))

But hey, at least the standards committees are working on new forms of DRM so
the NSA can't tap your video cable and see your screen. Or something.

~~~
webkike
Lost me at the end, but good point in general, I think?

~~~
potatolicious
I think he's referring to HDCP, which is a standard meant to prevent tapping
into an HD video output for the purposes of recording.

It's of pretty dubious effectiveness, and quite honestly has screwed me in the
past more than a few times even when I was doing something 100% legitimate
(like renting a movie on iTunes and trying to use my MacBook's HDMI out to the
TV... and it refusing to play).

Not to mention there are many, many more ways to record HD content from a
source than tapping the video output.

~~~
revelation
Well, it's certainly of no effectiveness whatsoever, since the HDCP master key
leaked and anyone has been able to trivially decrypt an HDCP stream ever since.
The NeTV
([http://www.kosagi.com/w/index.php?title=NeTV_Main_Page](http://www.kosagi.com/w/index.php?title=NeTV_Main_Page))
does this in realtime.

That doesn't mean they have stopped preventing people from watching content
they just bought on a projector or display without HDCP support. Oh no, that
stuff continues right now.

In the history of DRM, this is probably one of the most bizarre failures. You
can reasonably assume it never stopped anyone from making illegitimate copies
(capturing very high-bandwidth interfaces like HDMI is decidedly non-trivial
and simply not worth the time investment when there are much easier sources),
while it denies people who just seconds ago shelled out cash for your product
access to it, for a reason they will not understand and will certainly not
appreciate.

~~~
beagle3
> The NeTV does this in realtime.

No, the NeTV does NOT do any kind of decryption. What it does do is encrypt
its own image using the same key in parallel, so that it can overlay its own
display on top of the incoming display stream.

It does include an implementation of HDCP that could be used for decryption
if you work hard enough at it (and I would guess someone has worked hard
enough at it), but as it comes out of the box, the NeTV cannot be used to
strip off HDCP.
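
(For the curious: HDCP 1.x is essentially a keystream XORed with the pixel
data, so a device that can reproduce the keystream can splice its own pixels
into the link without ever recovering the original ones. A toy sketch of that
XOR property follows; the names and values are made up and this is nothing
like the real NeTV firmware.)

```typescript
// Toy illustration of overlay-without-decryption on an XOR stream cipher.
// HDCP 1.x is roughly of this form; the real protocol is far more involved.

// Hypothetical keystream: in reality derived from the negotiated HDCP
// session, here just fixed bytes for illustration.
const keystream = new Uint8Array([0x3c, 0xa5, 0x77, 0x10]);

const originalPixels = new Uint8Array([10, 20, 30, 40]);     // source's video
const overlayPixels  = new Uint8Array([200, 201, 202, 203]); // our overlay

// What arrives on the wire: keystream XOR original pixel.
const cipherIn = originalPixels.map((p, i) => p ^ keystream[i]);
console.log(cipherIn); // never "decoded" by the overlay device

// Overlay device: emit keystream XOR overlay pixel in place of the incoming
// ciphertext. Note that it never computes originalPixels from cipherIn.
const cipherOut = overlayPixels.map((q, i) => q ^ keystream[i]);

// The sink decrypts as usual and sees the overlay, not the original.
const decoded = cipherOut.map((c, i) => c ^ keystream[i]);
console.log(decoded); // Uint8Array [200, 201, 202, 203]
```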

~~~
lelandbatey
Is there anything that _does_ decrypt HDCP in real time? Something where you
plug an encrypted cable into one end and it outputs an unencrypted source on
the other? One that maintains things like audio?

~~~
1offi
Sure: all SiI9187-based HDMI splitters do, e.g. amazon.com/dp/B005HXFARS

~~~
jrockway
The Amazon pages say they don't decrypt anymore.

That's a fragile attack anyway, as I imagine the key they're using would be
revoked. The only attack that works is one that uses the master key to
generate keys that haven't been revoked, which I doubt their hardware does.
(Illegal and all that. Just being an HDCP endpoint is easy; take the chips
from your TV product and stick them in the splitter. "Oops, sorry." But then
you get revoked.)

------
pdknsk
I don't know why this is news. Did anyone really think Apple would release a
display at 30Hz? I very much doubt it. The real question is what PWM frequency
the backlight runs at! (If any.)

~~~
r0s
Most 4K screens on the market have limited refresh rates compared to
lower-resolution ones.

That's why it's news.

~~~
pdknsk
You're mistaken. There probably isn't a single 4K monitor that doesn't support
60Hz. Unfortunately some people buy 4K TVs like the infamous Seiki to use as
monitors.

~~~
lreeves
The one I've seen talked about the most is the Dell 28 Ultra HD 4K (1), and it
is indeed 30Hz.

1 -
[http://accessories.us.dell.com/sna/productdetail.aspx?c=us&c...](http://accessories.us.dell.com/sna/productdetail.aspx?c=us&cs=19&l=en&sku=210-ACHO).

~~~
pdknsk
Wow, you're right. That's a terrible monitor, with a TN panel to match. From a
reputable(?) brand no less. Anyway, I think my original point still stands.
Apple is no Dell.

~~~
FireBeyond
So that's a budget 4K monitor. What did you expect for just over $400?

Dell makes plenty of high-end monitors; they just don't give them away at
loss-generating prices.

The 3008WFP-HC (not 4K) is a 30", 2560x1600 monitor with a 30-bit S-IPS panel.

I'm not sure why anyone is surprised that a $400 4K monitor is not running at
60Hz. It's not like Apple would offer that? You're right. But their solution
would also not be anywhere near the same price.

Hell, they haven't even updated the Thunderbolt Display. Even today, it
requires a MagSafe to MagSafe 2 adapter to work with ANY MacBook currently on
sale (seriously, Apple? You can't replace the connector with MagSafe 2?) and it
doesn't support USB 3.

~~~
robin_reala
Recent Thunderbolt Displays at least ship with the adaptor in the box. If
you're going to be compatible with both and you've already engineered a
MagSafe 1 connector and a MagSafe 2 adaptor, that's a reasonable compromise.

~~~
FireBeyond
Eh. If you're manufacturing units now (which they are - my TB Display has a
March 2014 date of manufacture), perhaps the "reasonable compromise" should be
to ship with the connector that has been standard for over two years now and
an adapter for the old standard.

------
joemaller1
We got several of these at work and the screens are just phenomenal. I want
one at home too.

~~~
kmfrk
It's really quite incredible. I also just got a new - plain, boring - Dell
monitor, and the background lighting (or whatever it is) really makes the
difference between that and older, dimmer screens.

------
psp
That frame rate analysis trick of taking a photo with a phone camera is just
amazing!

------
nrzuk
Nice to see an independent review to back this up! I keep meaning to nip down
to the Apple Store and have a look, but I'm a little concerned I'll be coming
home with one!

But until we have DP 1.3 and a matching external 5K Thunderbolt display, this
really isn't for me. The author's comments regarding the previous Thunderbolt
display - "However, the Retina display does make things on the other 2560x1440
displays look… a little grody." - really do put me off.

~~~
cerberusss
> Keep meaning to nip down to the Apple store

If you're due for an upgrade anyway, I'd skip the Apple Store. Instead, I'd
simply order it and get that "OMG" moment when you unpack it :-)

------
danielweber
How has burn-in been the past 2 years? I've got a two-year-old MacBook and I
can read my mail after I log out.

~~~
arrrg
That’s image retention, not burn in. It’s not permanent, though still very
annoying.

Do you have the 2012 15" Retina MacBook Pro? Some of those (those with LG
screens) were suffering horribly from image retention. I got the screen
exchanged twice on mine. Now I luckily have the Samsung screen.

There are currently no known cases of image retention with this iMac. It looks
as though they got the issue under control and this was really a first-gen
(large screen) Retina tech problem. (Retina iPads were, or maybe still are,
suffering from image retention too, though strangely I care much less about it
on my iPad 3, for whatever reason. It's also a bit milder.)

------
mmphosis
Uses SwitchResX to get the iMac to actually display in 5K?

~~~
function_seven
No, they used it to turn HiDPI mode off, so that each physical pixel is
addressed as a logical pixel, instead of the normal 4-physical-to-1-logical
mapping.
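
Roughly speaking, the default HiDPI mode renders a 2560x1440 logical space
with a 2x scale factor, so each logical pixel covers a 2x2 block of physical
pixels on the 5120x2880 panel. A tiny sketch of that mapping (illustrative
only; the helper is hypothetical, not anything from SwitchResX or OS X):

```typescript
// Illustrative only: HiDPI maps each logical pixel to a scale x scale block
// of physical pixels (scale = 2 in the Retina iMac's default mode).
interface Rect { x: number; y: number; w: number; h: number; }

function logicalToPhysical(x: number, y: number, scale = 2): Rect {
  return { x: x * scale, y: y * scale, w: scale, h: scale };
}

console.log(logicalToPhysical(2559, 1439));
// { x: 5118, y: 2878, w: 2, h: 2 } -> bottom-right of the 5120x2880 panel
```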

