
HDMI 2.1 Announced: 10Kp120, Dynamic HDR, New Color Spaces, New 48G Cable - dmmalam
http://www.anandtech.com/show/11003/hdmi-21-announced-supports-10kp120-dynamic-hdr-new-color-spaces-new-48g-cable
======
ChuckMcM
Wow that is a lot of bits. Seriously, that is a LOT of bits. It's almost like
they said, "We're sick and tired of people complaining the cable is holding
them back; here, beat that, suckers." :-) I guess the margins on 4K televisions
have done better than expected; certainly as a 'feature' driving upgrades they
have outperformed '3D'.

For me it starts to push up against the question of whether monitor resolution
has hit the top of the s-curve. A 9600 x 5400 (10K) screen at 200 PPI is 48" x
27" -- that's a 55" high-DPI screen. And as glorious and amazing as that would
be, I'm wondering whether it would be "mainstream" any time soon.
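
Back-of-the-envelope (just a quick sketch, not from the article), the geometry
works out like this:

    # Rough sanity check of the figures above: physical panel size from
    # pixel dimensions and pixel density.
    import math

    def panel_size(width_px, height_px, ppi):
        """Return (width_in, height_in, diagonal_in) for a resolution and PPI."""
        w = width_px / ppi
        h = height_px / ppi
        return w, h, math.hypot(w, h)

    w, h, diag = panel_size(9600, 5400, 200)
    print('%.0f" x %.0f", ~%.0f" diagonal' % (w, h, diag))  # 48" x 27", ~55"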

~~~
jerf
I think 4K has ended up being a bit of a whiff in most respects. I just bought
a 60" TV for my living room, where we are about 7 feet from it, which is
pretty close (it's the narrow dimension of my rectangular living room), and I
can't tell much difference between putatively-4K content and clearly-1080p
content (from Blu-ray). It's hard to be sure, because all my putatively-4K
content is streaming-based, and there aren't that many sources willing or able
to ship enough bits to clearly distinguish between 1080p and 4K.

That is, in a 1080p vs. 4K streaming contest, if you dedicated the same
(larger) number of bits to the 1080p content as to the 4K content, you'd get a
better 1080p signal too. With lossy compression, pixels aren't directly
comparable; they have a quality, and comparing low-quality 1080p pixels to
low-quality 4K pixels makes it hard to tell which differences come from the
format and which are just low quality in general.
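
As a rough illustration (the bitrates below are made up for the example, not
measurements), the per-pixel bit budget falls off quickly when you spread the
same stream over four times the pixels:

    # Illustrative bits-per-pixel comparison; the bitrates are hypothetical.
    def bits_per_pixel(bitrate_mbps, width, height, fps):
        return bitrate_mbps * 1e6 / (width * height * fps)

    stream = 16.0   # Mbps, a plausible streaming-service video bitrate
    bluray = 35.0   # Mbps, a plausible 1080p Blu-ray video bitrate

    print("1080p stream:  %.3f bits/pixel" % bits_per_pixel(stream, 1920, 1080, 24))
    print("4K stream:     %.3f bits/pixel" % bits_per_pixel(stream, 3840, 2160, 24))
    print("1080p Blu-ray: %.3f bits/pixel" % bits_per_pixel(bluray, 1920, 1080, 24))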

(In other news, I consider the idea that streaming is obviously superior to
discs to be a bit crazy. What discs have is _higher quality pixels_, and I
expect that to continue to be true for a while. I've seen Blu-ray 1080p streams
that are clearly superior to putatively-4K over-the-internet streams. And I've
got DVDs that upscale to a higher quality than putatively-1080p streams of the
same content.)

Far more important is the increased color gamut and HDR support, which _does_
provide a noticeable increase in image quality. Even from those aforementioned
dubious-quality streaming sources; the higher-gamut HDR images are a visible
improvement over the NTSC-sunglasses we were all wearing without realizing it.

~~~
wazoox
Amen. Like about 50% of the population, I wear glasses. I absolutely can't
tell the difference between 2K and 4K at normal viewing distance. Most cinemas
nowadays are 2K only, and nobody cares, because most people older than 40
simply can't see the difference.

Heck, for 10 years I've seen people watching 4:3 TV on 16:9 screens, horribly
stretched. People are almost blind, seriously. They'd tolerate about anything.

HFR and HDR, on the other hand, are great. I've seen HDR/HFR demos in 2K at
IBC in Amsterdam, and they look much better (much more realistic) than
standard 4K.

8K? NHK makes 8K demos, and they can only show almost-still images; otherwise
you just get extremely detailed motion blur at ordinary framerates. They even
give you a magnifying glass to look at their 8K screen. That's impressive, but
I don't plan on watching movies through big lenses to catch the details anyway.
What's the point, apart from planned obsolescence?

~~~
witty_username
If the glasses are the correct ones for you, you should have normal eyesight
(that's the point of glasses).

~~~
myrion
It's also almost never 100% true. Even if the glasses are perfect, they also
have to be perfectly clean or you get a bit of blur anyway -- small enough not
to be noticeable, but strong enough to blur away the benefits of such a high
resolution.

------
jerf
"Among other things, the new standard will support the game mode variable
refresh rate (GM VRR). Nowadays, AMD already supports FreeSync-over-HDMI using
a custom mode and with the HDMI 2.1 everything gets a little easier."

Oh, that'll be nice. Now that almost every interface is using 3D rendering
tech under the hood, regardless of how it looks, I've noticed tearing has
spread from something only gamers saw much of to something I'm seeing
everywhere. Hopefully this helps drive the tech everywhere, and tearing will,
in perhaps 10 years or so, be a thing of the past.

~~~
digi_owl
Silly thing is that tearing is a non-issue as long as one has vsync enabled.
But games often disable vsync to get that "lovely" raw FPS number...

~~~
eropple
I don't disable VSync for FPS. I disable VSync because it reduces input lag.

~~~
matt_wulfeck
The input lag can be a serious issue with certain monitors and video cards.
It's very noticeable to me when playing an FPS.

~~~
eropple
Agreed. Overwatch in particular has felt a lot more snappy since disabling it.
I'd been out of FPSes for a while, but that game made me acutely aware of it.

~~~
ChoGGi
You should try vsync with triple buffering, and make sure to set the flip
queue size (AMD) / max pre-rendered frames (Nvidia) to 1.

~~~
eropple
Most games I play these days are DirectX 10+, which seems to preclude the use
of the pre-rendered frames setting.

------
jedberg
A somewhat related question -- why is the industry able to agree on a single
standard for how data moves over a wire from a device to a display, yet can't
come up with a single standard for doing it _without_ a wire?

I can take my Mac laptop or my PC laptop and roll up to my TV and with a
single cable make pictures and sound come out of the TV.

Yet neither one can talk to the TV without a cable, unless I get a special
device (which is different for each and, ironically, uses the same cable).

And even worse, if I had a Samsung phone, _it could_ connect to the TV
wirelessly.

Why can't we have a single wireless display standard? Is there a technology
problem I'm missing, or is it really just walled gardens?

~~~
israrkhan
Miracast is such a standard. Thousands of devices support it. However, two
major players (Apple and Google) don't. Apple has traditionally been a closed
system, so it is not a big surprise. What surprises me is that Google also
removed support for Miracast from Android and started promoting their
proprietary casting protocol.

~~~
Sargos
Miracast is heavily broken on many devices and rarely works at a reasonable
level. Netflix and Google worked to create a good streaming standard named
DIAL. Google took the idea and ran with it to create the Cast system, which
ended up working really well.

~~~
jedberg
But the Cast system doesn't send arbitrary bits to the TV. It's a protocol for
handing off authorized streams which are then processed by the TV (or in most
cases the device connected to the TV by, ironically, HDMI).

So it's not really a method to stream arbitrary visual and audio data to a TV.

~~~
Ajedi32
Well, you _can_ do screen mirroring with Cast, at least on Android. In my
experience though it has fairly high latency and will occasionally freeze for
a few seconds at a time.

------
jd007
Does this mean TB3 (with "only" 40 Gbps bandwidth) won't be able to support all
the features/resolutions of HDMI 2.1? This is of course assuming that the HDMI
2.1 spec is designed to fully utilize the hardware-level bandwidth.

Also, I wonder what exactly makes the 48G cable special. Is it using
additional pins while maintaining backwards compatibility?

------
sametmax
And here I was, hoping HDMI would disappear and be replaced by USB-C.

~~~
protomyth
I guess the HDMI folks thought USB-C insufficiently ambitious on the bandwidth
front. 48 Gbps might hold us for a while on the video front.

[edit] I too wish USB-C were the single connector, but it is such a confusing
standard, with all the Alternate Modes and different cables on the market.

~~~
sametmax
You don't say. Right now I'm trying to buy a new cable to charge my XPS 15;
I've already bought two that don't work.

~~~
protomyth
It's friggin' unreal. I'm trying to do a cable order and have no clue what the
heck to buy. I just want something that works. I swear these alternate modes
and charging-only cables are going to be the end of me.

~~~
thisisblurry
It might be worthwhile to check out Benson Leung's (Google engineer) reviews
on USB-C cables: [http://amzn.to/2hX3cMZ](http://amzn.to/2hX3cMZ)

Previous post from last year on the subject about his work:
[https://news.ycombinator.com/item?id=10508494](https://news.ycombinator.com/item?id=10508494)

~~~
protomyth
Things have changed a bit; now I have to worry about Thunderbolt 3.

------
schmichael
I wonder which causes more pain for consumers: new cables with new connectors
(USB-C) or new cables with backward compatible connectors (HDMI 2.1).

While the former definitely makes consumers unhappy in the short term, it
seems like in the long term the latter's confusion around what cable you have
vs what cable you need is worse.

~~~
Pxtl
USB-C is actually the latter because there are several different flavours of
USB-C cable that support different protocols.

This is going to suck.

~~~
Ajedi32
Yeah, it's gotten to the point where I'd rather just buy Thunderbolt cables
(despite the extra expense) because it's the only easy way to be sure that I'm
getting a cable which fully supports all the features of the USB standard.
(Thunderbolt cables are compatible with USB-C connectors, USB 3.1, and USB
Power Delivery.)

~~~
nicpottier
Even THAT isn't good enough, since not all Thunderbolt 3 cables support the
necessary wattage (for example,
[https://www.amazon.com/Belkin-F2CD081bt1M-Certified-
Thunderb...](https://www.amazon.com/Belkin-F2CD081bt1M-Certified-Thunderbolt-
Compatible/dp/B01CEFESRO/ref=sr_1_3?ie=UTF8&qid=1483647457&sr=8-3&keywords=thunderbolt+3+cable))

I'm really hoping this is just a transitional period and that soon all USB-C /
TB3 cables will be created equal. When type-A started being used for charging
there was a time with similar inconveniences.

~~~
digi_owl
> When type-A started being used for charging there was a time with similar
> inconveniences.

Meh, that problem is still there. On top of how Apple has their own take on
signaling that the charger is "dumb"...

~~~
nicpottier
I remember in the early days you could fairly easily fry a USB port on your
computer if you plugged in the wrong thing. I think they are far more tolerant
these days, though the recent USB Killer doodad that's been going around shows
that extreme abuse can still break things.

It's one thing for things not to work when you plug them in, but when shit
gets broken by doing so, that's just inexcusable.

~~~
digi_owl
I don't think I have ever seen a USB port get permanently fried by having a
device draw too much. I have, however, managed to fry a cheap thumb drive by
reversing the connector on the motherboard...

------
kimburgess
Can somebody please introduce the HDMI Forum to semver? While I'm looking
forward to what this will enable, it's a little overzealous for a point
release.

~~~
profmonocle
Agreed. As a consumer, it's a bit odd that HDMI 1.x and 2.0 used the same
cables, but 2.1 requires a new cable. That seems like a good reason to bump
the major version number.

~~~
saghm
It's the same with USB; USB 2.0 and 3.0 were compatible, but USB 3.1 (more
commonly known as USB C) is not compatible with either of them.

EDIT: Actually, from consulting Wikipedia[1], it's even worse than I
remembered. Apparently there's USB 3.1 gen 1 and USB 3.1 gen 2, where USB 3.1
gen 1 is literally just USB 3.0, and USB 3.1 gen 2 is what you get with a USB
C cable (where the letter evidently denotes the port size/shape and the
version number denotes the rate of transfer).

[https://en.wikipedia.org/wiki/USB_3.1](https://en.wikipedia.org/wiki/USB_3.1)

~~~
Ajedi32
> and USB 3.1 gen 2 is what you get with a USB C cable

Nope, it's even more confusing than that. USB-C is just the connector type.
Some USB-C cables only support USB 2.0.
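
Roughly, as I understand the naming (the speeds are from the USB specs; the
connector shape is independent of all of them):

    # The naming mess in brief -- connector shape and data rate are orthogonal.
    USB_SPEEDS_GBPS = {
        "USB 2.0":             0.48,
        "USB 3.0 / 3.1 Gen 1": 5,
        "USB 3.1 Gen 2":       10,
    }
    # A USB-C plug can carry any of these (plus alternate modes like DisplayPort
    # or HDMI); the shape alone tells you nothing about what a cable supports.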

~~~
saghm
> Some USB-C cables only support USB 2.0.

I don't even want to know why somebody thought that was a good idea...

~~~
seanp2k2
Apple does it with the USB-C cable that needs to be purchased separately from
the 87 W power adapter for the 15" tbMBP. The cable is only USB 2, but since
it's meant for power only, you'd probably never notice. It's quite annoying,
though, that it's a separate accessory and won't function as a "normal" TB3
cable or even a USB 3 cable, and likewise most normal USB-C or TB3 cables
won't work with the 87 W power adapter at full power delivery rates.

Such a mess.

------
mtgx
I've always thought 2.0 was a relatively disappointing upgrade over the
previous standard. It felt like it should've had at least twice the bandwidth
it ended up having. This is looking much better.

Also, considering how disappointing 2.0 was, yet it still got the "2.0" name,
this should be at least a 3.0, especially with a new cable coming. The only
reason I see them using "2.1" is because they don't want TV and laptop makers
to yell at them for making their devices look awfully obsolete just a few
years later. But it's probably going to confuse users and it's going to hurt
adoption of 2.1-enabled devices.

------
themihai
Intel has some nice cables delivering 30GB per FRAME[0]. [0]
[https://www.cnet.com/uk/videos/intel-demos-worlds-first-
walk...](https://www.cnet.com/uk/videos/intel-demos-worlds-first-walk-around-
vr-video-experience-ces-2017/)

~~~
comboy
I think he said 3GB per frame, not 30GB, but that's still pretty insane.

------
digi_owl
That bit about different functionality when using different cables etc. has me
worried.

~~~
ashark
Good thing we're all switching to USB-C for everything, apparently, since it
doesn't have that problem.

[http://www.informationweek.com/mobile/mobile-devices/usb-
typ...](http://www.informationweek.com/mobile/mobile-devices/usb-type-c-not-
all-cables-are-created-equal/a/d-id/1323293)

Oh.... :-/

~~~
lstamour
Yeah, my head started to hurt when you add Thunderbolt 3 to the mix. You've
got Type-C for USB 2, 3, and 3.1 -- and 3.1 gen 1 is really 3.0 re-branded. You
need to look for "USB 3.1 gen 2" for 10 Gbps support in the cable, and don't
forget charging requirements -- the Belkin cables they sell at the Apple Store
only support up to 60 W, but others, such as
[https://plus.google.com/+BensonLeung/posts/TkAnhK84TT7](https://plus.google.com/+BensonLeung/posts/TkAnhK84TT7)
claim support for 100 W or more.

Then you've got Thunderbolt 3, where you can theoretically hit 40 Gbps, but the
cables that support those speeds are basically 0.5m, except one Belkin that's
2m but doesn't support DisplayPort or USB 3.0. And finally, the white USB
Type-C cable Apple sells and ships with its power adapters is called a Charge
Cable and doesn't support data, e.g. connecting two MacBooks together to run
Migration Assistant.

But that's okay, because even if you do connect two computers together with a
0.5m Belkin TB3 cable, you won't hit 40 Gbps. Using Migration Assistant,
copying 256 GB off a disk capable of 3.1 Gbps should take about 11 minutes, but
it always takes 2 hours when using cables between two 2016 MBPs -- Ethernet
gave me the same speed as that 0.5m TB3 cable -- and it's about a half hour
shorter if you use Wi-Fi for a direct connection. Given that, I can't wait for
the day when we replace all these confusing cables with new directional
wireless standards...
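
(The 11-minute figure is just this back-of-the-envelope calculation, assuming
the disk is the bottleneck:)

    # Ideal-case transfer time: 256 GB over a 3.1 Gbps link.
    size_gb = 256
    rate_gbps = 3.1
    minutes = size_gb * 8 / rate_gbps / 60
    print("~%.0f minutes" % minutes)  # ~11 minutes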

~~~
digi_owl
One of the missed opportunities of Wi-Fi Direct was that they didn't specify a
transfer protocol as part of the spec...

------
rocky1138
Why do we still have or still need HDMI when we've got USB 3.1 Type-C?

~~~
ak217
USB 3.1: 10 Gbit/s

DisplayPort 1.4: 32 Gbit/s

HDMI 2.1: 38 Gbit/s

Thunderbolt 3: 40 Gbit/s

~~~
Pharaoh2
HDMI 2.1 is 48 Gbits/s

------
MaxLeiter
It keeps mentioning improved frame rates (but says very little about refresh
rates?) -- will HDMI now be able to support 144 Hz monitors?

~~~
hocuspocus
DisplayPort works very well for high-resolution and high-refresh-rate monitors
(and also supports things like FreeSync), so why would you want HDMI?

~~~
pjc50
For some reason all my kit comes with HDMI rather than DisplayPort. It also
works with "televisions" (which these days seem to run Android).

------
pbreit
Didn't see any mention of power. Are the TV sticks going to be able to ditch
the plugs?

~~~
MBCook
Powered HDMI existed before 2.0, didn't it? I know it wasn't in 1.0.

Since these things tend to be backwards compatible I'd assume 2.1 would still
provide power.

I suppose it may not be mandatory.

------
Unklejoe
As for the new cable business, is there a way for the devices to detect the
version of the cable (maybe resistor strapping or something)?

The reason I ask is that HDMI cables are passive, so any cable with all of the
pins populated should work, provided it can satisfy the bandwidth
requirements. In other words, I'm wondering if these new HDMI 2.1 cables are
just regular HDMI cables with better electrical characteristics. If so, then I
bet older HDMI cables will work as long as they're kept short.

I also wonder how this variable refresh rate thing relates to the recent AMD
FreeSync over HDMI thing...

~~~
j-walker
I imagine it is like PCI-Express, where devices link up at the lowest
bandwidth possible (x1 in the case of PCIe) and then keep attempting higher
bandwidth modes (x4, x8, x16) until the transceivers can no longer link, then
step back down one mode.
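
Something like this loop, conceptually (a sketch only -- the rates and the
try_link() probe are stand-ins; real link training happens in hardware):

    # Climb through link rates until one fails, then settle on the last rate
    # that trained cleanly. All names and rates here are hypothetical.
    RATES_GBPS = [10.2, 18, 24, 32, 40, 48]  # lowest first

    def try_link(rate_gbps, cable_limit_gbps=18):
        """Stand-in for the hardware probe: pretend the cable tops out somewhere."""
        return rate_gbps <= cable_limit_gbps

    def negotiate():
        best = None
        for rate in RATES_GBPS:
            if not try_link(rate):
                break        # this rate failed; fall back to the last good one
            best = rate      # trained cleanly, keep climbing
        return best

    print(negotiate())  # 18 with the made-up cable limit above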

~~~
flukus
Is this what windows was trying to do when I hooked it up to my TV the other
day? For some apps it kept trying to cycle through resolutions (and never
found a stable point).

------
pasbesoin
Just from the title, it seems this is aimed at the emerging 8K television
segment -- with a little overhead.

As such, I don't find it "wow" at all. Maybe more "just in time."

------
tbrock
I find it really annoying that cables with the same connectors at the ends and
the same basic standard now come in versions.

Imagine if back in the day it was like: "ohhh headphone jack 1.3 eh? Well
these headphones only work on 1.4 or greater"

Or: "Wow you only have RCA 7.6 you'll have to get a TV with rev 8 to hook up
your NES"

It's a friggin' cord with some connectors and some pins; how did we wind up
with different versions even being possible?!

~~~
ryanplant-au
The cables aren't versioned. They're just rated in categories for attenuation,
exactly like copper phone wiring is, and named for the connector type just like
USB-A, USB-C, micro-USB, etc. are. You don't have HDMI 1.3 and HDMI 1.4 cables,
though; cables bought years before an HDMI version is announced can still work
with any of that new version's features that fit within the bandwidth the
cable can carry. The version is for the HDMI spec which the receiver must
support. HDMI isn't just a cable; it's also the hardware encoding and decoding
signals, and the different versions define new and different ways of encoding
and decoding digital media.

As for 'headphone jack 1.4' and 'RCA jack 7.6' -- back in the day we had
analogue audio and video signals and invented new types of jack, connector,
and cable to carry them. You had RCA jacks, and 3.5mm jacks, and 1/4" audio
jacks, and component video jacks, and S-video jacks, and SCART jacks, and coax
jacks. Now we have digital audio and video and consistently use HDMI to carry
it; instead of inventing new connectors we update the HDMI spec and say
"systems meeting the 1.3 spec can transmit 1080p over their HDMI cables, and
systems meeting the 1.4 spec can transmit 4K over those same cables." Isn't
that better, and simpler? Instead of 20 types of cable with different
features, we just use one type of cable and add features to it over time.
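
For what it's worth, the cable-side ratings boil down to a short list (the
bandwidth figures below are the commonly quoted ones; treat them as
approximate):

    # HDMI cable ratings and their bandwidth -- the rating describes the
    # cable's signal integrity, not which HDMI spec version it "is".
    CABLE_RATINGS_GBPS = {
        "Standard":           4.95,   # Category 1
        "High Speed":         10.2,   # Category 2
        "Premium High Speed": 18.0,
        "48G (HDMI 2.1)":     48.0,   # the new cable announced here
    }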

------
ak217
So they are announcing variable data rate streams and Display Stream
Compression support. I thought until now HDMI had a fixed data rate except for
the control channel, unlike DisplayPort which is fully packetized. Can someone
more knowledgeable chime in on whether HDMI will now be packetized too, or how
they support variable data rate modes?

------
webmaven
Hmm. To make actual use of this, it seems you would need to be sitting right
on top of a ginormous (120"?) screen.

You might actually need to use a whole wall, which leads into immersive
environments.

------
SimeVidas
How does this relate to Apple’s advanced USB connectors on the new MBPs? I
thought they were supposed to replace all other types of connectors for
video. Will we have HDMI for TVs and USB for monitors?

~~~
djrogers
They're not "Apple’s advanced USB connectors" \- they're the USB-C standard.
They support (per the standard) what are called alternate modes, HDMI being
one of them. HDMI 1.4b is currently supported over HDMI alt mode, which is
unsurprising given that USB-C has been in the wild for a while and HDMI 2.1 is
brand new.

~~~
IshKebab
Well they do support Thunderbolt 3 too, so they're not _just_ USB.

~~~
MBCook
Thunderbolt 3 is standardized on using the same USB-C connector.

Much like Apple used the Mini DisplayPort connector before for Thunderbolt 1
and 2.

~~~
IshKebab
I know. But a USB 3 Type C port won't support Thunderbolt 3 by default. It has
to be a Thunderbolt 3 port.

------
shmerl
Still no daisy chaining though, unlike with DisplayPort, which uses packetized
data. Why isn't DP replacing it faster? Is it too expensive to become
ubiquitous?

~~~
sprayk
It's a matter of targeted use case. DP is for PCs, while HDMI is for point-to-
point AV. There aren't many non-PC HDMI sources that support multiple displays
(mirroring or expanding), and this is by design. Chaining makes no sense in
the home theater or building-wide video distribution use cases.

~~~
shmerl
The point is, DP offers a superset of the functionality and is also supposedly
free of license fees (from what I've heard), while HDMI requires the
manufacturer to pay for a license. So why isn't everyone ditching HDMI for
DisplayPort, which is better on all counts? Maybe manufacturing cost plays a
role, but I have no idea.

------
XorNot
This will be great for VR basically.

------
transfire
I wish companies would start rolling with HD-BaseT.

~~~
kimburgess
Their current chipsets top out at 10.2 Gbps (HDMI 1.4a) for the video signal,
which is somewhat limiting today.

It's also worth remembering that while historically it's done a pretty decent
job of assisting with long-distance transport, it's a 'standard' made by
Valens to sell their chips. The use cases for it also make up a minuscule
portion of the overall display source/sink device market, so there's little
benefit to manufacturers in rolling the additional cost into every device they
make.

------
transfire
Just keep making things more complicated for no good reason.

    
    
         4K   4096 / 2160 = 1.8962962962962964
         5K   5120 / 2880 = 1.7777777777777777
         8K   7680 / 4320 = 1.7777777777777777
        10K  10328 / 7760 = 1.3309278350515463
    

Thanks a whole lot.

~~~
cnvogel
> 4K 4096 / 2160 = 1.8962962962962964

What most people call "4K" is 3840:2160 (the typical 16:9 = 1.77… aspect
ratio). At least that's the number of pixels in most TVs/monitors I've seen.
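
For comparison, the two common "4K" variants:

    # Consumer UHD vs. DCI 4K aspect ratios.
    for name, (w, h) in {"UHD / consumer 4K": (3840, 2160),
                         "DCI 4K (cinema)":   (4096, 2160)}.items():
        print("%-18s %dx%d = %.3f" % (name, w, h, w / h))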

------
1_2__3
One of the more ridiculous arguments in favor of the new MBPs was a refrain of
"HDMI is on its way out", "HDMI is old tech", "New equipment won't use HDMI".

~~~
danudey
I've never heard this argument in favour of the new MBP.

In fact, one of the best arguments in favour of the new MBP that I've seen is
that USB-C can replace every port we're using right now, often with the same
number of dongles as before, or fewer. For example, USB-C supports 'HDMI
alternate mode', which is a simple way to provide an HDMI signal over a USB-C
port/cable:

[http://www.hdmi.org/manufacturer/HDMIAltModeUSBTypeC.aspx](http://www.hdmi.org/manufacturer/HDMIAltModeUSBTypeC.aspx)

~~~
givinguflac
Exactly. Dongles can suck, but it's pretty sweet that you can buy a new dongle
to upgrade to a later spec or a different port. I'm sure we'll see TB3-to-HDMI
2.1 dongles.

