
HDMI 2.0 officially announced - nreece
http://www.engadget.com/2013/09/04/hdmi-2-0-official-4k-60fps-32-channel-audio/
======
rootbear
This race for more pixels is misguided. The change I most want to see is
deeper pixels, at least 10 bits per channel, preferably more. I'm getting
really tired of all the banding I see on what should be smooth gradient
images. If a landscape-aspect image is P pixels wide, then to display a
grayscale gradient from black to white you need

ceil(log2(P))

bits per pixel. So a "2K" display needs 11 bits, "4K" needs 12, etc. Then each
column of pixels gets a distinct value and there is no banding.
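
For concreteness, here's a quick sketch (Python, just my own illustration of
the arithmetic, nothing from the spec):

    import math

    # Bits per channel needed so every column of a P-pixel-wide
    # black-to-white ramp can get its own distinct gray level.
    for name, width in [("1080p", 1920), ("2K", 2048), ("4K", 3840), ("8K", 7680)]:
        bits = math.ceil(math.log2(width))
        print(f"{name}: {width} px wide -> {bits} bits "
              f"(8 bits leaves bands ~{width // 256} px wide)")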

So sure, give me a 4K screen. I've seen them and they are sweet, but you MUST
also increase the pixel depth at the same time, or there will be artifacts.

And don't get me started on the horrid compression artifacts on basic cable. A
crime against quality imagery.

~~~
Daiz
Dithering can help a lot when it comes to displaying large smooth gradients at
lower bit depths. Obviously it has its downsides, though - for one, you need
to actually implement it wherever smooth gradients are used, and in video
games this basically never happens, so you end up with noticeable banding
instead.
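
To illustrate the idea (a rough numpy sketch of my own, just for the concept,
not from any actual renderer): adding a little noise before quantizing spreads
the error out as grain instead of hard steps.

    import numpy as np

    width = 3840
    ramp = np.linspace(0.0, 1.0, width)        # ideal smooth gradient

    # Naive quantization to 8 bits: only 256 levels across 3840 columns,
    # so each level repeats for ~15 pixels -> visible bands.
    banded = np.round(ramp * 255)

    # Dithered: perturb by up to half a step before rounding, so the
    # banding turns into fine noise that the eye averages out.
    dithered = np.round(ramp * 255 + np.random.uniform(-0.5, 0.5, width))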

Dithering is pretty common in video, though. But it has a pretty large problem
there as well - since dithering is essentially noise, it takes a lot of
bitrate to compress well, and if you don't have bitrate to throw at your
source, the encoder will most likely crush it and you just end up introducing
banding anyway. Blu-ray is pretty much the main avenue where you have enough
bitrate to spare for proper dithering in 8-bit video. Anything less than that,
though... well, let's just say that House of Cards on Netflix suffered from a
lot of banding.

Banding is actually one of the biggest reasons why anime fansubs these days
are generally encoded in 10-bit H.264. Anime tends to have a lot of large,
smooth color surfaces, and banding was pretty much the hardest thing to avoid
with regular 8-bit video - 10-bit, on the other hand, makes it an almost total
non-issue. And for non-10-bit displays, it moves the necessary dithering to
the playback end, which is obviously a much nicer alternative since you don't
have to compress any of that noise into the video itself. And beyond the
gradients, 10-bit H.264 actually gives you better compression quality in
general, which just makes it even better.

Now, it obviously comes with the downside of not being supported by hardware
decoders anywhere, so you basically need a decent CPU to decode 10-bit video.
For fansubs and the people who make them this isn't much of an issue, though,
since the advanced subtitles they use have also been poorly supported by
hardware players for a long time.

Next-generation video formats may actually bring higher bit depths to hardware
decoders as well, though - H.265/HEVC has a Main 10 Profile intended for
consumer applications.

~~~
Annnnnnon
> wasting time on HN instead of finishing Chuu2

As expected of Daiz.

------
kayoone
Honest question: does 4K and up really make sense for private use? I am pretty
sure that from the distance you usually watch 1080p content, you won't see any
difference with 4K content at all.

Of course for big presentations and conference setups I see the use case, just
not for the consumer mainstream. 4K would also set back game console
performance by a huge margin, if consoles ever adopt it.

So I am not really convinced that the mainstream will go up to 4K, 8K, etc.,
as it also means that all production costs increase while the benefit for the
consumer is marginal.

~~~
papercrane
> Honest question, does 4K and up really make sense for private use ?

For big monitors I think it does. Even if 4K doesn't make sense, the increased
bandwidth is a win: the previous HDMI spec couldn't drive a 2560x1440 or
2560x1600 monitor at 60 Hz, and those resolutions definitely make sense to me
for computer monitors.

~~~
mbell
> as the previous HDMI spec couldn't drive a 2560x1440 or 2560x1600 monitor at
> 60hz

HDMI has actually supported that since 1.3, which came out around 2006.

2560×1600p75 @ 24bit is about the highest standard resolution you can pump
through HDMI 1.3.
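
Rough back-of-the-envelope numbers (my own, assuming CVT reduced blanking)
showing that resolution just squeaks under HDMI 1.3's ~340 MHz pixel clock
ceiling:

    # Approximate pixel clock for 2560x1600 @ 75 Hz with reduced blanking.
    h_active, v_active, refresh = 2560, 1600, 75
    h_blank, v_blank = 160, 48          # rough CVT-RB blanking intervals
    pixel_clock_mhz = (h_active + h_blank) * (v_active + v_blank) * refresh / 1e6
    print(pixel_clock_mhz)              # ~336 MHz vs the ~340 MHz limit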

~~~
papercrane
That's interesting, because most of the high resolution monitors I've seen
explicitly say you need to use DL-DVI to drive them at full resolution (for
example the 27" Dell Ultrasharps.)

~~~
mbell
Many companies cheap out on the supporting electronics. In particular single-
link DVI is electrically compatible with HDMI but is limited to ~1920 x 1200 @
60Hz. Basically instead of properly implementing HDMI they route a single-link
DVI transceiver through an HDMI connector and call it good.

------
rartichoke
I'm not too impressed, honestly.

120 fps is the minimum for me when it comes to playing games. Then again, I
was used to 120 Hz CRT monitors in the early 2000s with zero input lag.

I would only ever consider 4K for normal desktop usage, but there's no way I'd
pay for a screen capped at 60 fps, and we all know 4K monitors will be
stupidly expensive for at least 10 years.

~~~
sillysaurus2
It always seemed strange that monitors aim for 60 fps. That seems like a
recipe for input lag. They could aim for 90 fps, because when they fail to
meet that, at least they'll still achieve 60.

Then again, the audience of hardcore gamers who would notice that sort of
thing is probably small, so I guess it makes economic sense.

~~~
rartichoke
This is also why most LCD manufacturers use low-quality TN panels. A majority
of people just can't tell the difference, or don't care about things like
viewing angles.

LCD manufacturers in general are really crooked. They have been accused of,
and found guilty of, price fixing on multiple occasions.

They also love to release a monitor with a high-quality panel initially so it
gets good reviews, then quietly swap in a garbage panel without letting
consumers know, while still selling it at the same price under the same model
name.

~~~
coldpie
I'm in the market for a decent monitor. Are there any brands that you /like/,
or is it simply a matter of choosing the least terrible?

~~~
hoka
I have an HP LP2475W and a Dell U2412M; generally, among the IPS class, the
consensus seems to be that the HPs have more inputs and a better UI. Some of
them are wide-gamut, however, which turns off a few purchasers. Research
whether wide gamut is important or detrimental to you.

I'd say go for 24s or a 30; if you get the 27 you'll just wish you had the 30
:-p But definitely go IPS, they are fantastic. I haven't read much about the
Korean monitors, as they were nonexistent when I researched the market 4-5
years ago.

~~~
sillysaurus2
Actually, the Dell U30xx is worse than the U27xx. The color fidelity isn't as
good. (This only matters if you calibrate your monitor.)

------
clicks
I wish it had 8K support, given Japan is actually already planning on starting
broadcast trials for 8K in roughly two years:
[http://www.futureleap.com/2013/03/japan-plans-8k-broadcasts-...](http://www.futureleap.com/2013/03/japan-plans-8k-broadcasts-in-2016-2-years-ahead-of-schedule/)

~~~
zach
I am glad it does not have 8K support. That would be over-engineering.

"Yeah, this hardware will cost more, but it has to support a bandwidth level
that won't be in use anywhere in the world in the next two years."

Seems totally reasonable to have a version in step with the emerging
generation, rather than future experiments.

~~~
nailer
> Seems totally reasonable to have a version in step with the emerging
> generation, rather than future experiments.

The pixel density of the current MBP Retina display, scaled up to a 27-inch
monitor, would be more than 4K.

Conveniently, however, it's less than 8K - as is a 32-inch Retina monitor.

------
Shish2k
Anyone know how this compares to DisplayPort? I've always heard (perhaps
wrongly) that DP is technically better and patent-free, so I wonder why HDMI
still gets so much more attention...

~~~
asdfs
I've found that the biggest problem with DisplayPort is that even if it is
technically superior (unfortunately I'm not qualified to know whether it is or
not), active DisplayPort/HDMI/DVI adapters are incredibly expensive.

Since HDMI and DVI came first, this limits the use of DisplayPort;
manufacturers are unwilling to create a device that requires a $100 adapter to
work with what most people own.

------
ixnu
The failure to redesign the connector form factor is a major disappointment.
Racking or repositioning a snug receiver with multiple HDMI inputs and outputs
almost guarantees that one will bend or break. This is especially true for
low-gauge (thick) runs over 50 feet.

~~~
djrogers
The obvious positive side of this is that existing cables will support the new
spec without being replaced... Says so right there in the doc.

I'd rather stick with the same form factor and cables than replace everything
at some insane cost. Not to mention the awkwardness of supporting 2 types of
inputs on a receiver/amp/switcher to maintain legacy compatibility.

~~~
ixnu
Agreed that it's a positive that existing cables can be used, but a dongle
could always have been included, giving the best of both.

Besides, many people already use port savers (really, run savers), because
replacing a bent connector on a 75-foot run is an effective lesson.

------
zanny
Disappointing that this means we won't see 4K 120 Hz screens any time soon,
and that most consumer hardware in the next 5 years will have compatibility
issues with them. I love my pixel density _and_ my refresh rate!

~~~
anonymfus
The next DisplayPort version will probably support them.

------
akandiah
As someone who isn't familiar with audio technology: is there any practical
use for 32 channels of audio?

~~~
StavrosK
Having 32 speakers around you, I would imagine. It was probably a case of
"since we're standardizing, we might as well future-proof this".

~~~
johnward
I accept that they future-proofed sound, but it doesn't look like they did
that with video.

~~~
MichaelGG
I'm sure if it was trivial to offer a 100+ Gbps link and do 120Hz 8K video,
they would have.

------
martingordon
I suspect not, but I can't find any details on whether this new spec provides
more power than the current standard, so that low-power devices (like the
Chromecast) don't need a separate power cord.

------
delsarto
Hmm, how will Monster Cable and the like sell their stuff when the standard
makes clear that existing cables can already handle the greatly increased
bandwidth? I look forward to the spin!

------
jevinskie
> "dynamic auto lipsync"

Isn't the audio and video _already in_ sync?

~~~
josephlord
If you connect your source through a surround processor, it helps to be able
to delay the audio to match the delay of the TV, which may vary depending on
TV settings (e.g. game mode may have lower delay but less picture processing).
I assume this spec allows the TV to continuously inform the surround processor
of the current picture delay.

~~~
nitrogen
Once upon a time I hacked something like that into TVTime
([http://tvtime.sf.net](http://tvtime.sf.net)) that constantly updated a delay
effect I'd modified on my SB Live 5.1's emu10k1 DSP.

------
vermontdevil
Anyone know if it includes any specs about closed captions? They were a source
of frustration at times with the previous spec.

~~~
MichaelGG
I know "closed captions" are a special thing, but really, shouldn't the
rendering device (TV decoder, GPU, software, whatever) be rendering the
subtitles instead of the display device? I watch almost everything with
subtitles, and can't imagine not being able to adjust subtitles (size/depth)
and also enjoy the high-quality rendering my PC can perform.

------
pbreit
Any word about power?

------
Nux
Fsck 4k. I want holodecks!

~~~
pkroll
The closest you'll get to those for a while is VR like the Oculus Rift, and 4K
screens (tiny, phone-sized ones) would help that a LOT. :)

------
fiorix
tl;dr, but hopefully they've added power over HDMI as well, otherwise things
like the Chromecast will still need USB for that :D

~~~
Terretta
> _tl;dr_

It's five sentences.

HN, as a site for stories that gratify intellectual curiosity, may not be the
right site for you.

------
Yhippa
Cool, now I can go repurchase all my cables. At least they're cheap, right?

~~~
kunil
You will need gold-plated cables with resonance crystals around them to reduce
polarisation.

~~~
pkroll
You'll also need to plug them in correctly, as they're one way: plug them in
wrong and your video will play in reverse.

