QMS is unlikely to help with that. It appears to be targeted at allowing a single device to vary its refresh rate or resolution without requiring a complete disconnect/reconnect cycle.
The delay you're seeing is more likely caused by HDCP handshaking and key negotiation. Depending on your situation, there are external switchers that handle key caching so you can switch cleanly - complete overkill for a personal user, though, as they are not cheap.
> Supporting the 48Gbps bandwidth is the new Ultra High Speed HDMI Cable. The cable ensures high-bandwidth dependent features are delivered including uncompressed 8K video with HDR.
This is interesting, because I've seen several companies promoting "visually lossless" compression (which is apparently a euphemism for lossy compression that the vendor doesn't think you'll notice) to reduce bandwidth requirements for cabled 8K video transmission. Does this represent some kind of defeat for them, or were they targeting applications where Ultra High Speed HDMI wasn't going to be realistic in the first place (e.g. the various systems that transmit video over UTP cable)?
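For a rough sense of why compression keeps coming up: a back-of-the-envelope calculation (my own numbers, assuming 10-bit RGB at 60 Hz and ignoring link encoding overhead) puts uncompressed 8K above even the new cable's 48Gbps:

```python
# Back-of-the-envelope: raw pixel data rate for uncompressed 8K video.
# Assumptions: 10 bits per color channel, 3 channels (RGB), 60 frames/sec.
width, height = 7680, 4320
bits_per_pixel = 10 * 3     # 10-bit RGB
fps = 60

raw_gbps = width * height * bits_per_pixel * fps / 1e9
print(f"Uncompressed 8K60 10-bit RGB: {raw_gbps:.1f} Gbps")  # ~59.7 Gbps
```

At ~59.7 Gbps, that's over the 48Gbps link, which is presumably why chroma subsampling or DSC still enters the picture for the highest modes. Treat this as an illustration, not the spec's own accounting.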
Also, does anyone know how USB 3.1 SuperSpeed+ cables stack up? Obviously they're specified for 10Gbps for USB devices, but that's in the context of USB's requirements for the attached transceivers, so I'm not sure what it implies about the performance of the cable itself.
It also supports DSC[1]. "Visually lossless" is defined as no distinguishable difference in A/B comparisons of a set of predetermined challenging images. You can check the images they used and run the test yourself if you so desire.
Comparing static images is not really an appropriate test for a video compression algorithm, for obvious reasons. You can write lots of stuff that works on a single frame but looks like garbage on a video stream.
Can you give a link to video source files that could be used to compare the two? The link I turned up looks like it needs specialized testing hardware...
Isn't the high bandwidth path for HDMI uni-directional? Not half-duplex, but actually one way only, as in the receiver doesn't have the hardware to transmit back (on those lines). I'm not terribly knowledgeable about HDMI but I think that's basically the case, someone please correct me or elaborate if they know more.
There are some other bi-directional communications channels for control and other features like DRM (HDCP) or even Fast Ethernet but they don't get nearly that much bandwidth.
Those are usually quite expensive. If HDMI requires 48Gb/s capable transceiver chips to be built into every TV, they will get really cheap very fast. It might not be a viable solution in the end, but it could work for some applications.
Over typical HDMI distances, you don't really need an expensive transceiver. 100GbE can be done over direct attach cables that cost less than many current "premium" HDMI cables. Much of the cost beyond that is related to the MAC and host interface being able to do something interesting with the packets.
If you mean switching, you don't need a switch for simply connecting two computers. Presumably that was the use-case supported by "HDMI-as-network-layer".
If you shop around, you can get Mellanox/Voltaire rack switches for $200 or less. QDR gear is "obsolete" and often surplused at very attractive prices, since everyone else has moved on to FDR or EDR speeds.
Could anyone provide a guess as to how "quick frame transport" works?
As someone who likes games, the variable frame rate seems like one of the best things in here. I wonder if it would be possible to implement on the current consoles.
I hope the new products announced at CES support this.
I read somewhere that the adaptive refresh rate will work like Nvidia G-Sync. A 4K, HDR, 75-inch gaming monitor without frame rate hiccups is quite an exciting prospect!
Right. My understanding is it's basically the same as G-Sync, or whatever AMD's name for it is (FreeSync).
I'm kind of curious to see what a game running at, say, 50 frames per second looks like if the frame pacing is uneven. I imagine it looks worse than 50 frames per second with even pacing, but better than 50 frames per second with glitches because you're supposed to be outputting 60.
G-Sync and AMD FreeSync do the same thing, but not in the same way. FreeSync is just a brand name for DisplayPort's Adaptive-Sync specification. G-Sync is proprietary, and requires the display manufacturer to add special hardware to the unit, which increases costs a bit.
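To make the frame pacing question concrete, here's a toy simulation (not any vendor's actual algorithm, and the 21 ms render time is just an illustrative assumption): frames that finish off the vsync grid get held to the next 60 Hz boundary, producing uneven presentation intervals, while an adaptive-sync display presents each frame as it's ready.

```python
import math

VSYNC = 1000 / 60                            # fixed 60 Hz refresh interval, ms
render_done = [i * 21 for i in range(1, 7)]  # frame ready every 21 ms (~48 fps)

# Fixed refresh: a finished frame is scanned out at the next vsync boundary.
fixed = [math.ceil(t / VSYNC) * VSYNC for t in render_done]
# Adaptive sync: the display refreshes as soon as the frame is ready.
adaptive = render_done

print([round(b - a, 1) for a, b in zip(fixed, fixed[1:])])        # uneven: judder
print([round(b - a, 1) for a, b in zip(adaptive, adaptive[1:])])  # even 21 ms pacing
```

The fixed-refresh intervals alternate between ~16.7 ms and an occasional ~33.3 ms double-frame, which is exactly the stutter people notice; the adaptive-sync intervals are uniform.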
USB 3.2 (and current USB-C with it) doesn't support the 48GBps that HDMI 2.1 and 32GBps that display port 1.4 call for. USB currently supports a max of 20GBps. It's possible you can get more bandwidth using an alternate mode in the connector, but the spec says you can only guarantee up to the 20GBps over those modes currently. This means the higher resolutions and higher bit depths (10-bit, HDR, etc.) may not be possible to push over the connector.
This is why Thunderbolt 3 requires special active cables to get the additional bandwidth: they amplify the signal to get around the bandwidth limitation of the connector in a passive configuration.
> It's possible you can get more bandwidth using an alternate mode in the connector, but the spec says you can only guarantee up to the 20GBps over those modes currently.
20Gbps per what? One lane? Two lanes? I haven't heard of this, where can I find more info?
(If it's actually 20GBps with a capital B then we're nowhere close.)
> This is why Thunderbolt 3 requires special active cables
> the 48GBps that HDMI 2.1 and 32GBps that display port 1.4 call for
This part isn't right.
Thunderbolt 3 can run 40Gbps over short passive cables. And that's bidirectional. It's the equivalent of 80Gbps for a display cable.
DisplayPort 1.4, at 8.1Gbps per lane, is actually slower than USB 3.1 Gen 2.
HDMI 2.1, at 12Gbps per lane, is slightly faster but still well short of Thunderbolt.
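Multiplying out the per-lane figures (four lanes in each case; these are raw link rates, ignoring line-code overhead):

```python
# Raw aggregate link rates, ignoring encoding overhead (8b/10b, 16b/18b, etc.)
lanes = 4
dp14_per_lane = 8.1      # Gbps per lane, DisplayPort 1.4 (HBR3)
hdmi21_per_lane = 12.0   # Gbps per lane, HDMI 2.1 (FRL)

print(f"DisplayPort 1.4 total: {lanes * dp14_per_lane:.1f} Gbps")   # 32.4
print(f"HDMI 2.1 total:        {lanes * hdmi21_per_lane:.1f} Gbps") # 48.0
print("Thunderbolt 3:         40 Gbps each direction")
```

Which matches the thread's point: per lane, DP 1.4 trails USB 3.1 Gen 2's 10Gbps and HDMI 2.1 slightly exceeds it, but both aggregate links fall short of Thunderbolt 3's bidirectional 40Gbps.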
I would favor monitors, displays, and projectors having both USB-C and Ethernet. At the moment, all types of displays still have DVI, DisplayPort, HDMI, or even legacy connectors like VGA and SCART.
USB-C and Ethernet are capable of 10+ Gbit speeds, and the cables are cheap, without the premium one pays for HDMI 2.x cables. Also, CAT6+ Ethernet cables are found in most buildings. If you want to connect a big flat-screen TV dozens of meters away, or a projector 50 meters away, you need repeater devices and fragile cable arrangements, while with Ethernet cables it would be no problem.
The real reason why both continue to be developed in parallel is industry politics. HDMI is the product of consumer electronics companies while DisplayPort is from PC hardware manufacturers. For various reasons these two groups rarely cooperate. One big point of contention is royalties. The consumer electronics companies like standards they can charge royalties for, while PC companies like royalty free standards.
Since DP is royalty free, why wouldn't most consumer electronics companies push for it as well? Only a minority would collect royalties, while the majority would be forced to pay them. So why not ditch the patent-encumbered standard if the majority would benefit?
The consumer electronics industry is dominated by a handful of companies. Those companies also happen to be the ones who own the patents on HDMI. They control enough of the market that the smaller players are forced to pay up to be able to interoperate with the dominant companies' products.
Monoprice is selling 10' DP cables for $5 and 10' HDMI cables for $6. Maybe you need to find a new place to buy cables. Amazon Basics has a 10' DP cable for $12.
"Optional 8-channel audio with sampling rates up to 24 bit 192 kHz, encapsulation of audio compression formats (including Dolby TrueHD and DTS-HD Master Audio from v1.2)"
DisplayPort might be cheap, as it's royalty free, but one has to buy a dongle to convert it, as displays rarely feature a DisplayPort input.
None of my business monitors, TVs, or projectors has DisplayPort, but every notebook and discrete graphics card came with one - so the first thing is to buy another dongle to convert DisplayPort to HDMI or DVI.
Because native plain DisplayPort is only available on HP and some other Windows laptops (not sure about the state of the desktop GPU market), and it's rare as an input.
The rest of the market - consumer Windows laptops, Apple pre-USB-C-crap-series laptops, gaming consoles, cable TV boxes, other home theater stuff on source side, as well as TVs and projectors on the sink side - speaks HDMI only.
DisplayPort is widely available on business-class laptops, gaming laptops, all modern discrete graphics cards, all modern integrated video chipsets (although not all motherboards/devices will actually provide a port), etc.
You're actually diametrically opposite of reality here: the only devices which do not widely support DisplayPort are cheap crap intended for the low-end consumer market, and stuff exclusively targeted at the living room market like consoles and media-stick PCs.
Even something as pedestrian as my old Thinkpad from 2010 supports it.
Every modern desktop graphics card has DisplayPort connectors. A common configuration is 3 DisplayPort + 1 HDMI, making HDMI the legacy connector. My five-year-old laptop has DisplayPort. It's not as rare as you think.
At least on desktop GPUs, you tend to get a lot more DisplayPorts than HDMI. You're lucky to get more than a single HDMI port, but three or more DPs are common.
No, the ports on the Macbook are PCIe + Displayport. The Apple displays are PCIe only. The parent was specifically talking about Thunderbolt. You aren't using anything Thunderbolt when you plug into a DP monitor.
For the graphics cards that these laptops have... they aren't like super powerful. I can't, for example, game at 60 FPS on a 4k Monitor on a MacBook Pro. So the use case for needing more than 60 FPS, or more than a 4k display, seems more like an edge case to me.
Mostly what I want is to be able to plug into projectors when I'm with a client so I can share a pitch deck. Instead... I've got this massive USB-C to VGA / USB dongle, and another for USB-C to HDMI, and yet another for USB-C to Thunderbolt for my work monitor. Anywhere I go I've gotta lug around a laptop bag now, and it's full of dongles.
The OLD 15" MBPs were perfect. Good mix of ubiquitous ports, and Apple's proprietary Thunderbolt ports. I have yet to find anything other than dongles that use USB-C. Just like I have yet to find anything other than Apple Thunderbolt Displays that actually use Thunderbolt.
I've found the mini-DP ports tend to get worn down quickly. I've seen that on a desktop and an Intel NUC so far. I don't know if I've just gotten unlucky, since at least on the NUC I don't disconnect it often.