HDMI Forum does not allow an open source implementation of the HDMI 2.1 spec (arstechnica.com)
290 points by Thomashuet on March 1, 2024 | 163 comments




Luckily, with mid- and high-end monitors, DisplayPort is the standard. I don't understand why TVs don't come with even a single DisplayPort, though. Maybe they are paid off by the HDMI Forum?


What I always liked about HDMI is that it's basically backwards compatible all the way to DVI, just using passive adapters. I think that's the reason why HDMI dominates on laptops, where you're expected to move around and use projectors, which could then be very old and only have DVI inputs, and it "just works" with a plain old adapter. You could even go the other way round from a DVI output to a modern HDMI projector using the very same adapter, but I think laptops pretty much skipped DVI and went from VGA to HDMI. So in practice it doesn't really matter, especially with DP++ being supported by almost every device, which effectively gives you HDMI output using an almost passive adapter.

But still, I like it. :-)


I recently discovered you cannot pass 75Hz over DVI -> HDMI -> DVI!

For some reason the bandwidth is too low?

So for 60 Hz there is interop, but for 75 Hz you need a thick DVI cable.


HDMI currently supports up to 144 Hz. Even HDMI 1.4 (2009) could reach 120 Hz.

However, compatibility is hit and miss in general. You may have luck replacing one or more cables or adapters, but switching to DisplayPort is likely to get the best results anyway.


HDMI supports >200 Hz if you've got compatible hardware (HDMI 2.1).

It's admittedly hard to get this hardware unless you're willing to pay for the top-of-the-line versions, but it does exist.

Budget hardware usually works better with DisplayPort, though; that's true.


Yep, bandwidth limit of the digital channel.

I had that same issue with a 1920x1200@60 display on an old HDMI connection and with a 1920x1080@60 TV on DVI-HDMI.

Lowering the refresh rate slightly (to 59.9 Hz) made it work.
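
For reference, a rough sketch of the arithmetic (the mode totals below are approximate CVT/CVT-RB timings, and 165 MHz is the single-link TMDS clock limit shared by DVI and early HDMI):

    # Sketch: pixel clock = total pixels per frame (incl. blanking) x refresh rate,
    # compared against the 165 MHz single-link DVI / early-HDMI limit.
    # Timing totals are approximate CVT / CVT-RB figures, not exact EDID values.
    SINGLE_LINK_MHZ = 165

    def pixel_clock_mhz(h_total, v_total, refresh_hz):
        return h_total * v_total * refresh_hz / 1e6

    modes = [
        ("1920x1200@60 CVT",     2592, 1245, 60),  # ~194 MHz -> too fast for single link
        ("1920x1200@60 CVT-RB",  2080, 1235, 60),  # ~154 MHz -> fits
        ("1920x1080@60 CVT",     2576, 1120, 60),  # ~173 MHz -> too fast
        ("1920x1080@60 CEA-861", 2200, 1125, 60),  # 148.5 MHz -> fits
    ]

    for name, ht, vt, hz in modes:
        clk = pixel_clock_mhz(ht, vt, hz)
        ok = "fits" if clk <= SINGLE_LINK_MHZ else "exceeds single link"
        print(f"{name:24s} {clk:6.1f} MHz  {ok}")

Reduced blanking (which a slightly lower refresh rate often triggers) is what squeezes the same resolution back under the limit.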


HDMI can only carry single-link DVI, as far as I recall.


When I was working at a place that makes TVs, we were trying to reduce the number of physical ports to cut costs. I imagine HDMI just works, and everything that has a DisplayPort output probably also has an HDMI output.


Most people don't connect a computer to their TV, assuming they even have a computer at all.

And assuming they do, they all have HDMI anyway.

As for everything else most people would connect to their TV, they all have HDMI and precisely none have DisplayPort.

Given that, supporting DisplayPort is an unnecessary expenditure on bill of materials and labor for TV manufacturers.


I'm not talking about computers specifically, but about DP as an HDMI replacement everywhere. It shouldn't be more expensive; I'm not suggesting adding an additional port, but replacing one of the 4+ HDMI ports with a DP port. I think it's cheaper, in fact, because it has no royalties. That's why GPUs pushed heavily for DP, I believe. Nowadays GPUs usually come with 3 DPs and 1 HDMI.

Just throw one in, and if the ecosystem grows (like it did on PCs) then you keep replacing HDMI ports with DP ports.


Except that most laptops today natively output only DP, and basically all USB-C → HDMI adapters are really USB-C → DP alt mode → HDMI adapters.

And outside of gaming consoles, it is becoming increasingly rare to connect anything to the TV outside of the luxury segment (even built-in sound is often better than any external sound up to the price region where the lower luxury segment starts, so buying a slightly better TV without an external sound system is often better than a cheaper TV plus external sound).

And many of the "killer features" of HDMI (like network over HDMI) are semi-dead.

And DP is royalty free, HDMI isn't. So gaming consoles probably would love going DP only.

So as far as I can tell the only reason physical HDMI interfaces are still everywhere is network effect of them always having been everywhere.

I.e. if there is a huge disruption HDMI might go from everywhere to dying off.

And USB-C is perfectly suited for such disruption (long term).

I mean, everything from data centers to the high-speed interconnects of consumer PCs is currently moving to various forms of "PCIe internally", USB4 and NVMe being just two examples. So it might just be a matter of time until that reaches the console/TV space, and then USB-C with USB4+ or Thunderbolt would be the way to go.


Generally agree, except for the sound part. TV speakers are awful. No matter the TV you should always have an external system.


I thought so too, until I bought a TV ~1.5 years ago and regretted having bought a sound system, as it added hardly any benefit (and it wasn't a cheap or low-quality sound system either).

Though I do not have the space to have any speakers behind the couch, so it's only 3.1 either way.


It’s niche I’ll admit, but a while back the lack of support for DisplayPort in home theater equipment bit me because I had an Oculus Rift (original) hooked up to the same machine, using the only HDMI output on the computer’s GPU which meant hooking it up to the TV (through a receiver) required a DisplayPort → HDMI adapter.

Thing is, DP → HDMI adapters all suck when you’re using them to send anything but a basic 1080p picture. They nearly all fail or struggle with uncompressed 4K. I tried several different cables and adapters and despite marketing claims, they all had trouble. The best was one that was externally powered via USB, but even it exhibited quirks.

I no longer have the Rift hooked up to that machine which freed its HDMI port up, but I too wish TVs and receivers had even just one DisplayPort.


At least a few active DP → HDMI 2.0 adapters are okay at 4kp60. For example, these[1][2] do 4:4:4/RGB 4kp60 fine, though they're limited by HDMI 2.0 to 8 bits per color channel at 4kp60 (higher bit depths work fine at lower resolutions and frequencies, or with chroma subsampling).

In my testing (several years ago), at least half a dozen other similarly-priced DP → HDMI 2.0 adapters purchased from Amazon were limited to chroma subsampled output (4:2:2 or 4:2:0) at 4kp60, which is obviously unacceptable for desktop use, so I do see your point.

I've used the linked adapters successfully now for several years with both a 2013 Mac Pro and a pair of DP-only AMD GPUs (WX 3200 and WX 4100), all connected to a 2019 LG TV, and, while testing, confirmed all claimed signal outputs using an HDFury Vertex.

[1] https://www.amazon.com/dp/B00S0BWR2K

[2] https://www.amazon.com/dp/B00S0C7QO8
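
To see why those adapters cap out at 8 bits per channel for 4kp60, here's a rough sketch (assuming the usual HDMI 2.0 figures: 3 TMDS data lanes at a 600 MHz max character rate with 8b/10b coding, and standard CTA 3840x2160@60 timing):

    # Sketch: HDMI 2.0 video payload vs. what 4kp60 RGB/4:4:4 needs at various bit depths.
    PAYLOAD_GBPS = 3 * 600e6 * 8 / 1e9        # 3 lanes x 600 MHz x 8 useful bits = 14.4 Gbit/s

    pixel_clock_hz = 4400 * 2250 * 60          # CTA 4k60 timing -> 594 MHz pixel clock

    for bpc in (8, 10, 12):
        needed_gbps = pixel_clock_hz * 3 * bpc / 1e9   # 3 components per pixel (RGB / 4:4:4)
        verdict = "fits" if needed_gbps <= PAYLOAD_GBPS else "does not fit"
        print(f"4kp60 RGB {bpc:2d} bpc needs {needed_gbps:5.2f} Gbit/s -> {verdict}")
    # 8 bpc: ~14.26 Gbit/s (just fits); 10 and 12 bpc exceed 14.4 Gbit/s, hence chroma
    # subsampling or lower refresh rates for deep color over HDMI 2.0.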


It's complicated, but the tldr is: use a DP++ type 2 adapter that advertises 4k compat.

DP++ adapters tell the GPU to output an HDMI signal instead (DP is an entirely different protocol), and then just level-shift the signal. Type 1 adapters are limited to a 165 MHz TMDS clock, which means no more than roughly 1080p60. Type 2 adapters contain a tiny 256-byte ROM that tells the GPU their maximum supported bandwidth.

Other adapters are active: they convert the signal and thus add latency, and they often need external power, so they can get quite hot.
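
A small sketch of what that means in practice (assuming the commonly cited dual-mode limits of a 165 MHz TMDS clock for Type 1 and 300 MHz for Type 2; pixel clocks below are approximate):

    # Sketch: which class of DP -> HDMI adapter a given mode needs.
    TYPE1_MHZ = 165   # passive DP++ Type 1
    TYPE2_MHZ = 300   # passive DP++ Type 2 (limit advertised via the adapter's ROM)

    def adapter_needed(pixel_clock_mhz):
        if pixel_clock_mhz <= TYPE1_MHZ:
            return "passive Type 1 is enough"
        if pixel_clock_mhz <= TYPE2_MHZ:
            return "needs passive Type 2"
        return "needs an active DP -> HDMI converter"

    for name, clk in [("1080p60", 148.5), ("1440p60", 241.5), ("4kp30", 297.0), ("4kp60", 594.0)]:
        print(f"{name:8s} ~{clk:5.1f} MHz: {adapter_needed(clk)}")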


I haven't noticed any perceptible latency from any active DP → HDMI adapter I've used, and I'd honestly be surprised to see any inexpensive active DP → HDMI adapter introducing latency, if only because doing so would require a frame buffer, which drives up costs.

What I have seen is DP → HDMI adapters that only support chroma subsampled pixel formats at 4kp60, and TVs that introduce many frame times' worth of latency due to various post-processing effects.

My hunch is as follows: given that consumer video devices (DVD/Blu-ray, set top box) almost exclusively output chroma-subsampled 4:2:2/4:2:0 YCbCr formats, TV post-processing pipelines may only support these formats, causing RGB (and possibly 4:4:4 YCbCr) signals to bypass post-processing, similar to the "PC" or "Game" modes present on some TVs that do the same.

In other words, if adapter → 4:2:2/4:2:0 and 4:2:2/4:2:0 → TV post-processing → latency, then adapter → latency, even if the adapter itself introduces no significant latency.

If I'm correct, the solution is an adapter which supports RGB output at the desired resolutions and frame rates and/or a TV with a PC/Game mode that bypasses post-processing for all source types, both of which are highly desirable for "monitor" use in any case.


It’s a recently popular thing to use a TV as a monitor.


Which all still have HDMI ports. It's pretty rare for a laptop to have a DP port rather than HDMI. Even USB-C using DP alt mode is easier and cheaper to convert to HDMI. You really have to hunt for a hub with DP.


>> It’s a recently popular thing to use a TV as a monitor.

I picked up a 55" curved 4K about 5 years ago for use as a monitor. Now I keep my desk in the living room, so I use it as a TV too (have to move the chair). Curved TV is kind of dumb, but huge curved "monitors" are awesome. You can't find curved TVs any more, and wide curved monitors don't have the height to double as a TV.


It’s a recently popular thing to use a TV as a monitor.

Only if you define "recently popular" as "more than yesterday, but still a rounding error."


If "yesterday" means the last 5+ years, sure.

For people who mainly play games but aren't obsessed with having the most expensive, fastest hardware ever, using a TV as a screen has been popular for a very long time. It was already popular when I was in 12th grade, and that was well over 10 years ago.


YouTube is littered with videos about it, and has been for over 2 years.


If I were to do this I think I’d try to find a BFGD (big format gaming display), which is a type of screen that’s sized like a TV but behaves more like a monitor and usually has a DisplayPort.


It sounds like there aren't weird Hollywood content protections on DisplayPort.

My guess is that it's not a technical or even a user-experience issue. It's probably a money issue with the deals TV manufacturers make with the media industry.

i.e. Netflix won't allow their app on a device that can circumvent HDCP.


I have 3 monitors, all connected via DisplayPort, and the nvidia control panel says they are HDCP capable. But I'm not sure I've ever used anything that requires HDCP to confirm it (and hope never to).


Quite a few still come with input configurations like 2xHDMI, 1xDP. I wasn't able to find a 4K monitor with 2xDP inputs. Really annoying.


Probably because nothing that plugs into TVs has DisplayPort? Blu-Ray players, receivers, Apple/Android TV, etc.


You can actually blame DVI for this one.

DVI connectors offered analog VGA for years. This meant that graphics card vendors could put one port on their card that did both, huzzah, and a passive adapter got you VGA out of DVI.

DVI predates DisplayPort by 8 years. The DMCA passes and HDCP becomes a Thing. Many card vendors do put DisplayPort on their cards, since it's the "professional" standard for video, but that isn't until 2008 or so. DisplayPort would not be widely adopted until 2012-ish.

Fast forward: VGA dies. DisplayPort and dual-link DVI are there. Dual-link DVI is forwards-ish compatible with the upcoming HDMI displays for TVs, as pushed by the MPEG-LA and DVD makers. In 2009, less than 5% of devices shipped had DisplayPort on them. DisplayPort at this time also cannot handle the two most popular color spaces in use: sRGB and Adobe RGB 1998.

Part of the issue was a perception thing: DisplayPort was widely adopted by Apple early on, and consumer understanding of Mini DisplayPort was that it was an Apple standard rather than an open standard, and this further pushed the port out of the limelight.


If I understand it correctly, you can do that, but you cannot call it "HDMI compliant" or otherwise use the HDMI trademark.


Someone without contracts with the HDMI Forum can probably release a working driver. However, AMD obviously needs to license the logo for their GPU boxes, and their legal team has decided this prevents them from releasing any source code based on the private specification, regardless of what the code is called.


> AMD obviously needs to license the logo for their GPU boxes

This poses an interesting question; maybe some of the hobby lawyers on HN would like to chime in and post their theories :o)

Let's assume they print that logo on their box and call it HDMI in their Windows drivers, but don't do so in their Linux drivers, while it's still a spec-compliant implementation. Would that pose a potential legal problem, and if so, why?

If the problem is that they have access to the official HDMI 2.1 spec, implemented it, but call it something else (which I could imagine is forbidden in some contract), would things change if some random hacker with too much time on their hands reverse-engineered the protocol by sniffing it and implemented it for the AMD driver (again without calling it HDMI)?

Too bad the HDMI forum doesn't feature an email address on their home page, I'd have loved to tell them what I think of them.


You don't just sniff a 48 Gbit/s protocol as a random hacker. That's not what happens in the real world. A real-time scope that could do that is in the $1 million range.

At best a random hacker will reverse engineer the binary driver enough to make something work in some capacity.


Yes, you actually can. It's known that HDMI is TMDS, that the fastest frequency on any given pair is 680 MHz, and that there are a total of 13 data pins (4 pairs + I2C + CEC pin + hot plug detect pin + a reserved pin for some special features). A digital logic analyser that can sample at that rate over all 13 pins is going to cost less than a grand. If you stub in some hardware to convert the differential pairs back to single high/low signals and drop the optional features of the reserved pin, you can cut that down to 8 signals (or fewer if your analyzer has dedicated clock signal pins). A DSLogic U3Pro16 is $299 and can sample 8 signals at 1 GHz in buffer mode, or 3 pins at 1 GHz indefinitely in streaming mode.

If you know roughly what you are looking for, you can set triggers to start sampling when the event you care about starts; that's more than enough to be able to reverse engineer even the most intensive parts of the existing HDMI spec.

Given that a lot of these graphics cards cost substantially more than $300, it's not unreasonable to expect a logic analyzer capable of digesting HDMI to be within their price range.
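
For a sense of the capture budget at those rates, a quick sketch (using the figures quoted above: 8 channels sampled at 1 GHz, 1 bit per channel per sample; these are the comment's assumptions, not measured specs):

    # Sketch: raw memory needed for a triggered capture window.
    channels = 8
    sample_rate_hz = 1_000_000_000               # 1 GHz

    bytes_per_second = channels * sample_rate_hz / 8   # 1 GB/s for 8 one-bit channels
    for window_ms in (1, 10, 100, 1000):
        capture_mb = bytes_per_second * window_ms / 1000 / 1e6
        print(f"{window_ms:5d} ms window -> {capture_mb:8.1f} MB of raw samples")
    # A hardware trigger plus a short window (a frame of video is ~17 ms at 60 Hz)
    # is essential; continuously streaming every lane is impractical on hobbyist gear.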


A lot of those $1 million scopes run Linux and connect to fast, high-resolution displays; some of them are made by companies not always in love with US licensing.


Sure, but this is a hypothetical scenario, focusing on the legal aspects.


Legal aspects of something that can never happen? Why?

Try legal aspects of reverse engineering a binary driver. That's more realistic.


But the question is not whether it's legal to reverse engineer a binary. The question is whether someone with no affiliation with either the HDMI Forum or AMD could contribute the corresponding code to the AMD driver, which is part of the Linux kernel, without legal implications.

Reverse engineering of binaries is pretty much a solved problem from the legal standpoint.


It's hypothetical. There is no difference between those two scenarios, we could also just assume the hacker pulled it out of their ass...


Fine, but it's useless then.


I mean, $1M is crowd-sourceable if there were a trustworthy person to do this.


There is a contract between AMD and the HDMI Forum covering the licence to use the logo. We don't know the wording, and I am sure it is thousands of pages of dense legalese, but it could literally say: "We promise not to release a Linux driver compatible with the spec."


> Too bad the HDMI forum doesn't feature an email address on their home page, I'd have loved to tell them what I think of them.

They actually do at the bottom of [1].

[1] https://hdmiforum.org/about/


Thanks, I emailed them with the subject "Solutions for the growing open source user base":

"Hello, I am an AMD user, who uses Linux, and I currently use a small HDMI screen but I am looking to upgrade to a larger screen and I am concerned about the forum's decision to deny AMD's open source HDMI implementation. This decision has led to criticism of the entire HDMI enterprise as rent-seeking, and encourages non-trademarked implementations that weaken the HDMI brand. Linux is only growing in popularity, as people are tired of the advertising baked into Windows, and don't always want a Mac either. I highly suggest a re-think that looks at the rising popularity of open source with Linux, Android, Chrome and so on, to protect the HDMI brand.

Sincerely, Luke Stanley"


Also, someone without contracts would probably need to call it something different from "HDMI", because the term itself is trademarked and only licensed to Members (like "Wi-Fi" and "Bluetooth").

So a different term would be needed (the way "WLAN" or "BT" is used by non-members).


As much as we can talk about HDMI here, since it's a name in the public domain, wouldn't someone else be able to use that kind of speech to make an "HDMI-compatible" implementation? "IBM compatible" seemed to be a designation, back in the day, that didn't run up against legal trouble.

Maybe that's what you already meant by "calling it different than".


HDIY


Could then just call it: HDMIY, no?

High Definition Make It Yourself


Trademark law always hinges on the question of whether a reasonable consumer with reasonable knowledge could be expected to distinguish those trademarks or not.

HDMIY would most probably not fly; HDIY probably would.

not yet a lawyer, though.

edit: I'm also fairly certain that the HDMI forum would fight HDIY tooth and nail, so whoever goes public with that absolutely needs deep pockets.


I am pretty sure that the term being a trademark doesn't stop me from mentioning it when describing compatibility. Do you think Apple could sue me if I release a program and say "This software is compatible with macOS"? As long as I am not impersonating them, using the logo, etc., they probably cannot stop me.


Of course they could sue you, and so could I. Neither of us would win, though. I because I don’t hold the trademark, they because they explicitly allow you to do that.

Like any big company, they have guidelines outlining permitted use. Apple’s page (https://www.apple.com/legal/intellectual-property/guidelines...) says:

“2. Compatibility: Developers may use Apple, Macintosh, iMac, or any other Apple word mark (but not the Apple Logo or other Apple-owned graphic symbol/logo) in a referential phrase on packaging or promotional/advertising materials to describe that the third party product is compatible with the referenced Apple product or technology, provided they comply with the following requirements.

a. The Apple word mark is not part of the product name.

b. The Apple word mark is used in a referential phrase such as “runs on,” “for use with,” “for,” or “compatible with.”

c. The Apple word mark appears less prominent than the product name.

d. The product is in fact compatible with, or otherwise works with, the referenced Apple product.

e. The reference to Apple does not create a sense of endorsement, sponsorship, or false association with Apple or Apple products or services.

f. The use does not show Apple or its products in a false or derogatory light.”


Ok sure, they could sue me, but they would not win. Do you think they would win if they had not explicitly allowed that usage? I don't have a source, but I am very sure that you can't stop someone from using your trademark; the trademark only protects the use in business, i.e. me selling an OS called "macOS". Just mentioning the name is no trademark infringement.


I'm pretty sure your legal analysis here is overly-simplistic, but we don't need to get into that.

Going back to the original topic, if somebody is trying to release open source drivers for HDMI (or a protocol compatible with HDMI), obviously trademark infringement is a possibility, since both are in the exact same business. So even if you're right (which I don't think you are), your argument doesn't apply.


Being in the same business does not make it trademark infringement. Ubuntu could also advertise "This OS can run software for Windows by using WINE" without infringing on Microsoft's trademark, because they aren't acting under the trademark. Trademarks exist so you can't sell your product under the established name of another company and profit from their reputation or ruin it by selling bad products. Just mentioning the name to state a fact is not forbidden. Just writing a driver and distributing it with the claim "HDMI-compatible" is not trademark infringement. The problem in OP's article is that AMD has a contract with the HDMI Forum.


My comment does not contradict anything you say here.


Do you think it would be trademark infringement if I code a HDMI driver and then publish it on GitHub with the description "This is a HDMI-Compatible driver"?


I do know, for example, that sports talk radio shows aren’t allowed to use the term “Super Bowl”


I'm pretty sure Apple would and could sue you if you did that.

But it really depends on how much attention you get.

Gotta keep an eye on the Nintendo emulator lawsuit... the war on general-purpose computation carries on.


Saying "compatible with <trademark>" is if anything one of the best established examples of nominative use. Like PC-compatibles which IBM had no way of stopping. Apple could try to sue, but the case would get dismissed immediately.


Any time I've seen that on actual packaging, it does come along with "<trademark> is a registered trademark of <corporation>". Though usually those same boxes use official published logos from <corporation>.


Apple actually explicitly states that you may do this: https://www.apple.com/legal/intellectual-property/guidelines...

Though even if they didn't they wouldn't have a legal leg to stand on if they sued.


I checked and found this info from the Wi-Fi Alliance [0]:

> Select brands and logos are offered license free and intended to be used widely throughout the Wi-Fi ecosystem by Wi-Fi Alliance members, non-members, industry partners, media, and analysts to describe products, technology, network deployments, and operating system support.

It sounds to me like anyone could use the trademark "Wi-Fi", since it's listed in the freely available ones.

[0] https://www.wi-fi.org/who-we-are/our-brands


> It sounds to me like anyone could use the trademark "Wi-Fi", since it's listed in the freely available ones.

Yes, to describe Wi-Fi as defined by the Wi-Fi Alliance, not to describe anything the Alliance does NOT consider compliant.


Trademarks are there to protect consumers from dishonestly represented goods. If I buy some display tech marketed as "HDMI-compatible" but unlicensed, and it works with my other HDMI equipment (say, its maker reverse-engineered the HDMI spec), no one has been defrauded.


Call it IDMH, I don't mind, honestly.


Or IWCH ;-)



>> However, AMD obviously needs to license the logo for their GPU boxes...

It's not just that, it's about playing nice so they can have access to the next version of the spec if one comes along.


Reminds me of

>Video Out port: DVI-D signal in 640×480 px, 60 Hz. The port supports a well-known video standard that we can't name due to copyright limitations. The first letter is H, and the last one is I.

https://blog.flipper.net/introducing-video-game-module-power...


Sounds about right; Flipper Devices just released the Video Game Module, which has a very recognisable port that is never referred to by its more common name anywhere.

Of course, they don't do HDMI 2.1 or anything advanced like that, but I guess the reason for the name not appearing anywhere is the same as you're discussing here.


Would be nice if everyone referred to a "non-compliant" HDMI as HDIY.


Then, you can't get HDCP keys and HDCP IP on that device, so trying to do that opens cans of technical and legal worms.

Not nice. Neither worms, nor HDCP forum.


Lack of HDCP is a feature in my eyes.


I'd also prefer my cables to carry a clear signal unless I encrypt them, but content creators feel that it's their duty to protect us from "teh pirates" (I mean, their money from us).

On a serious note, I don't believe that these technologies prevent a determined person from copying anything. However, I'd rather have all the features on board rather than not being able to use them when required.


I can live without the kernel being "HDMI compliant" and coming with an HDMI logo.


The question is why do we use proprietary formats and specifications such as HDMI, DisplayPort, Thunderbolt, Lightning.

Aren't we able to create and use open standards?


> proprietary formats and specifications such as ... DisplayPort

DisplayPort is not proprietary; it is a VESA standard.

> Lightning

If you owned any iPhone between 2012 and the present, you had to use Lightning, without choice. And for some inexplicable reason, Americans love iPhones.


People keep iPhones for a long time. My unfair internal impression is that Android phones still have micro-USB and allow any app to change your lockscreen to porn ads without identifying _which_ app made the change, because when I switched to iOS that was true, and I've only had two iPhones so it can't have been that long. (In reality it's been eight years.)


> allow any app to change your lockscreen to porn ads

I have never heard of such a thing. Was it a common problem for you?



8 years is a long time in tech. Type-C ports are now commonplace in the Android world. The modern Android ecosystem is no longer the chaotic mess we saw during the Android 2.x era -- its permission system is a lot more complete now, and many features are locked down by default unless you explicitly grant applications permission to use them.


People keep all phones for a really long time. I've never had a phone that I didn't use for 5 years, nor one that ever cost more than $350.


> Aren't we able to create and use open standards?

In theory, sure. In practice you'll have to construct a financially sustainable organization that is able to motivate all interested parties to chip in and is also able to certify the implementation and at the same time also doesn't fall victim to internal corruption (e.g. high C-level compensation making it unsustainable). I think there are few-to-no precedents for that in the open source space in general, and even less when it comes to standards body organizations for maintaining a standard at that level of complexity.

In most domains proprietary specifications form the backbone of everything. A lot of governments refer to ISO standards, which by default are not open access.


> In practice you'll have to construct a financially sustainable organization

Here's a list of just the best known ones [0]. There are literally hundreds of thousands of open standards for everything from communications to mechanical engineering, to packaging to chemical formulas....

They make the world go round.

No piece of technology you use today, especially the Internet would function without open standards and standards bodies.

For some reason bits of the digital tech industry, in particular media and entertainments, have a parochial disconnection with the rest of reality and forget that they stand on the shoulders of giants and operate with the assent of everyone else in the world giving them the standards space within which to work.

[0] https://en.wikipedia.org/wiki/List_of_technical_standard_org...


There is next to nothing "open" (if we go with "open" = "open access") about most of these organizations.

As a fun experiment, I went through the first 10 entries of the Wikipedia list. Only one of them[0] produces _open_ standards, which they have available for free download. For the rest of them the "standards" link on their website either directs to a webstore to purchase individual standards or to a membership signup.

I very much recognize that the world we live in is driven by standards. But while those standards drive the world forward, I think it's also important for industries (and governments) to recognize that the way their standards bodies operate is almost fundamentally incompatible with the forces that drive innovation in the software world (which they often proclaim they also want in their industry).

Building a standards body, as you point out, isn't difficult and has been done many times over. What's difficult is building an _open_ standards body, which as of today looks like a mostly unsolved problem (same as open source funding).

[0]: https://en.wikipedia.org/wiki/Accellera


You're right it's scandalous that "open" standards touted for public safety and interoperability are sold for a fortune, excluding any small business, curious individual or inventor.

That's congruent with the problem of parasitic publishers like Elsevier who gouge publicly funded science and hold the world's papers to ransom.

Practically, the remedy is the same; there's nothing in the above list I can't find with a little effort via bit-torrents and Tor hidden services. It's a small inconvenience to do the legwork and then clean any PDF files for potential malware.


There’s plenty of examples of successful open standards out there. The real issue is that, when it comes to video, there’s a lot of money in closed solutions because of “piracy”.

The moment you want to control what people watch and how they watch it, you lose any hope of having an open standard.


> In practice you'll have to construct a financially sustainable organization that is able to motivate all interested parties

Translation by obsolete sandwich-fed LLM:

In practice, zealously litigious organisations will assemble corporate lawyers in a room to compute profits and to define access constraints for consumers.


> In most domains proprietary specifications form the backbone of everything. A lot of governments refer to ISO standards, which by default are not open access.

Standards documents being behind a paywall is not at all the same thing as something being proprietary or needing to be licensed. ISO charges for standards documents to pay their administrative costs, you can implement those standards without paying any extra money. And if you happen to have an alternative way of implementing the standard without reading its document, that is fine too. If you implement JPEG XL by studying its open source reference implementation, that is A-OK.


Something being behind a paywall and copyright being used to prevent its redistribution is one of the textbook definitions of "proprietary".

In almost all cases of standards you can implement those standards without reading the document, from an IP standpoint. From a practical standpoint it is often just not feasible to reverse-engineer everything without the original documentation, or worth it if you can't slap the trademarked name of the technology on it.

It may be possible that AMD could even implement an open source driver stack for HDMI and be legally in the clear. What they fear more is souring the relationship and losing access, so they don't risk it when they have been told not to do that.


Meaning that in practice we would have to fundamentally tweak how capitalism (maybe?) works somehow.

In the meantime, keep paying taxes, rent, subscriptions, and utility bills.

I'm not even sure it's capitalism that needs to get modified? Maybe it's something about how private/public property works that is clearly off the mark and needs updates?

I'd argue that on the internet specifically, open source implementations of the protocols are the backbone of everything, not closed proprietary specs; aren't most internet specifications open?


In the purest of the pure internet, sure (depending on what level of the stack you look at). There you of course have other problems(?) like the fact that most browser-related standards are essentially steered by Google (via its dominance in browser market share). From the parts of the W3C that I've observed, I'd also not characterize them as a functioning standards body (I'm not sure they've published anything meaningful in the last decade).

But in many spaces where you interact with the "real world" you very quickly make contact with proprietary ISO standards (e.g. CAD, architecture). I'd argue that this is one of the big contributing factors to why there isn't more open source penetration in CAD, as central standards like STEP[2] would require contributors to purchase a number of ISO standards.

There are also some spaces where proprietary standards exist (usually when open implementations precede the standardized ones) like SQL[0], but the proprietary nature is ignored, as most people don't need their SQL implementations certified. AMD can't do that as they need to keep a friendly relation with the HDMI Forum for official certification.

There are also some ISO standards (associated with JTC1) that are open access[1], which seems like a decent model. I'm not sure who usually foots the bill for the whole standardization process here though.

[0]: https://www.iso.org/standard/76583.html

[1]: https://standards.iso.org/ittf/PubliclyAvailableStandards/in...

[2]: https://en.wikipedia.org/wiki/ISO_10303


It sounds like you're not aware of the reason. I personally believe that the reason is that there are many entities pushing against open standards, since open standards would make copying easier and DRM harder.


This is correct. That's also why there is an ongoing trend to push users into walled gardens and onto hardware they do not control. Or as Cory Doctorow calls it the "war on general-purpose computing"


It's a battle for the control of the device. The ideal setup for the big corpos is that we do not own our devices, but we rent them, and they can then charge arbitrarily for different uses of them. This was what AT&T did with phones back in the day.


To expand on that, AT&T had a regulated monopoly. You were not allowed to attach non AT&T devices to their network, and they had the only network (and no one else could set one up by law). You did not own a phone, but leased it on a monthly basis.

When the telephone answering machine was invented (not by AT&T) you could not legally attach it.

The plus of this arrangement was that AT&T made devices that would last for decades, all areas of the country got service (regardless of wealth or population density or politics), and costs were not surprising. The minus of this arrangement was that innovation was stifled and some costs were artificially high. It became a competitive disadvantage that was holding our country back from innovation. There is a correlated timeframe between the breakup of AT&T and the massive expansion of the computer industry.


> costs were not surprising

Costs were mind-boggling and surprising. I got in huge trouble when I was in elementary school because I called my best friend who lived 1 mile away. It could cost more to call someone who lived in a different LATA (in my case, the other LATA was one mile away) than it did to call a different time zone. As I recall, a call 1 mile away was billed at 23¢ per minute, and a call to Portland, Oregon, three time zones away, was 12¢ per minute (and this was in 1977 money, when one dollar was worth six hamburgers). Another example: the cost to rent a handset was $3 per month. By 1988 (when $1 was worth four hamburgers), you could buy a touch tone phone for $3 at Target.


Absolutely right - I'd forgotten about that. The fun of regulation driven services.


Limiting choice and freedom is never a good answer to an issue.


An open standard is pretty useless without hardware that uses it. The makers of proprietary formats know how to prevent that.


> Aren't we able to create and use open standards?

You can, but you need monitor/graphics chip companies to use it. These are mostly the members/creators of the HDMI standard, they are also probably the best able to create a standard that others will use:

https://hdmiforum.org/members/


Someone needs to reverse engineer and publish the spec ASAP. This is ridiculous.


So, DisplayPort it is then..


The computer-related companies should have given the middle finger to the media MAFIAA long ago and told them to STFU if they ever wanted their content to be displayed on a computer with PC-based standards and not MPEG.

Once you get a libre OS you can dump the contents of the bus or fake whatever HDMI hardware is out there to get pristine audio and video frames. Also, current Hollywood movies are very subpar compared to what we had in the 90's, so who cares.

My SO has an Amazon Prime account, and yet they want to show adverts in the middle of media she already paid to have displayed without ads, in theory. So you are paying them twice. Thus, I don't consider BitTorrent piracy when you legally paid for a service but the streamers can break the rules anytime.


Sure. Try dumping SkyShowTime shows/movies then (it refuses to run on GNU/Linux).

The average power user won't be able to run SkyShowTime on Linux. The idea is locking everyone onto Windows, OS X, or Linux with Secure Boot and a verifiable boot chain if they want to watch movies or TV shows.


The irony is that customers have a more reliable service with pirated shows nowadays: they always play and the quality doesn't change.


Crackers will just use KVM for virtualisation and dump whatever they want, either directly or over SPICE.


Probably. Though I'm still wondering why Microsoft hasn't implemented some kind of roles alongside Verified Boot.

No-one, in Hollywood's view, should be able to watch protected content on a VM. Plus for me it's choppy. Waiting for better virtio graphics drivers for Windows (bugs exist in Visual Studio with HW accel on and there is no 3D support).


Capturing content like that is something you'd do with GPU passthrough, not virtio drivers.


But Windows knows it's being virtualized. Why doesn't it disable Hi-def output when doing so?


A possible workaround for users is to use an active HDMI adapter, e.g. a USB-C dock or DisplayPort → HDMI converter.


Last I checked those converters do not support HDMI 2.1 features like VRR. If the end output is HDMI 2.0 you might as well skip the middleman.


There is extensive discussion about adapters in the bug report thread, it seems that some adapters do actually support HDMI 2.1 features, but it depends on their chipset and firmware version. I got a CableMatters one that seems to work, to the best of my ability to test without my TV having a diagnostic panel to show info about the signal it’s receiving.

https://gitlab.freedesktop.org/drm/amd/-/issues/1417


The Steam Deck dock supports this, for instance.


Are you implying that DP-HDMI cables have some active components in them that modify the signal?


Yes. There is dual-mode DisplayPort, which works with passive adapters, but I've never actually seen it, and USB Type-C doesn't support it at all.


Yes, some of them do. “DisplayPort++” ports support passive HDMI converters, while “pure” DisplayPort outputs do not.


Ars Technica has to be the most careless tech news outlet, and I stopped reading them long ago because of how much they get factually wrong, thanks to woefully inexperienced writers and attention-grabbing.

Their first sentence is "Any Linux user trying to send the highest-resolution images to a display at the fastest frame rate is out of luck for the foreseeable future, at least when it comes to an HDMI connection", but that's plainly not true. Hardware with closed-source drivers, such as the standard Nvidia ones, does support this, because those drivers don't have this legal limitation. They even end the article with the possibility of a closed-source AMD driver, yet didn't bother asking themselves whether anyone else has already done this.


You're right, but on the other hand, I like that they mentioned DisplayPort as a viable alternative that doesn't suffer from the problem at hand. Not too thrilled about the exact phrasing, describing DP as "the likely best option", which is too opinionated for my taste. But I'd appreciate if other publications did this, so that e.g. every article about Microsoft's subscription model for Office 365 mentioned LibreOffice, every article about Steam users losing access to some of their games mentioned GOG, and so on. Especially when the alternatives are not well-known among one's readers.


Phoronix is the place to be for coverage of this type https://www.phoronix.com/news/HDMI-2.1-OSS-Rejected


I was like: is Google on the HDMI Forum board? No, and neither is Microsoft.

https://hdmiforum.org/about/hdmi-forum-board-directors/

I see people from Apple, Panasonic, Sony, Nvidia, Samsung, etc.

Hardware companies. Maybe you have to buy your way into the club.


FWIU, HDMI leaks signal whereas DisplayPort is designed to reduce emanations, anyway. https://news.ycombinator.com/item?id=36681814#36685387

Almost anti-competitive that the HDMI 2.1 spec people won't allow an open implementation.

That they don't even allow an open implementation should have been a red flag to all of us that HDMI 2.1 has not been subjected to sufficient review.

Have any of you sufficiently reviewed an actual implementation of this spec? Only with black-box testing because it's closed source?


On a somewhat related note, it's surprisingly hard to get DisplayPort out of a USB-C connection. Although DP is natively there, all the cheap adapters tend to be HDMI, even though that requires extra hardware to create the complex HDMI signalling out of the native DP output. So although monitors tend to have DP inputs, and now everything has DP outputs in the form of USB-C, it's actually hard to connect the two.


Maybe I'm missing something, but it looks to me like Amazon even creates their own off-brand USB-C to DisplayPort cable, so it doesn't seem like it's that hard.

https://www.amazon.com/AmazonBasics-Aluminum-USB-C-DisplayPo...

They also have an adapter: https://www.amazon.com/AmazonBasics-Aluminium-DisplayPort-Ad...


There are some options, particularly if you want just DisplayPort. If you want power delivery as well, for example, pretty much all the cheap adapters are HDMI-only. With HDMI there are many more options, and usually from more reputable brands.


I was pleasantly surprised that my new Dell monitor connects with a USB-C cable to my laptop, and then lets me daisy-chain another older Dell monitor off of the first one with a DisplayPort cable. I've got power and 2 monitors, plus a full USB-C hub, with just 1 cable into the laptop. So at least Dell has figured it out.


What is stopping AMD from distributing a closed source binary blob along with selling the access to proprietary source code for $1?


The source code is not a clean-room re-implementation of the HDMI 2.1 spec and as such contains IP owned in one way or another by the HDMI Forum. This would almost certainly preclude them from distributing source code to non-HDMI-Forum members under any license, according to what I could glean from the article.


Ah, thank you, this is interesting. So in a way the original IP is kind of poison which forever taints their source code, diminishing what they can do with it.


> you can't make an open source HDMI 2.1 driver because some people said so

I don't understand. Fuck whatever committee said whatever crap. Open source is open source. Just make the damn driver and give the suits 2 middle fingers.


As a member of the HDMI Forum, AMD would surely break its Forum membership contract by open-sourcing its implementation of a specification restricted to members.

Also, "HDMI" itself is a trademark whose usage is only allowed to its members (like "Wi-Fi"), so even if a non-member did an open-source HDMI implementation against the will of the HDMI Forum, they would likely not be allowed to call it "HDMI" (the way "WLAN" is used by companies without Wi-Fi Alliance membership).


That is a small hurdle. Just come up with a name. HD driver? If that is too close, then something else. At the end of the day, call it bollocks; since you are using hardware detection anyway, who cares.


It would be possible to argue that both of these terms are generic enough that the trademarks have lost their meaning; it wouldn't be the first time.

Wi-Fi is pretty much there already.


Yeah, I think someone other than them would have to write that driver. But it's possible nobody who has the freedom to do so, has the knowledge to do so.


Context: AMD is a company, and those usually don't go around flipping the bird.


And it's a bit unfair to complain about AMD in this context anyway. Not only were they the ones that proposed an open source driver in the first place, but in my experience their open source Linux support is excellent, and as far as flipping the bird goes, compared to NVIDIA they do great, up-to-date work with the Linux kernel devs.

It's the HDMI Forum that's being idiots in this case, because the content mafia is afraid someone might use this to steal content. Anyway, back to torrenting stuff...


It's time to call HDMI legacy, just like D-sub and DVI. DP and/or USB-C + DP alt mode is the way.


I still miss DVI.


I don't miss it, DisplayPort is a really good replacement.


It is. And this is likely to be the solution to this issue. DP is a VESA standard, and its Dual Mode feature supports passive adaptors for, among other things, HDMI displays. I use one on my 20-series RTX card.

A cheap DP-HDMI dongle makes all this go away. As long as VESA doesn't behave the same way, anyway.


That works somewhat well for 1080p, but all the adapters I have seen max out at 30 fps for 4K video. That might be OK for movies, but gamers might not like it.

55" monitors with DisplayPort do exist, but they are only a select few and seem to cost 4 times the price of a 4K TV.

So 4K is still pretty much a luxury that I will just ignore for now.
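
That 30 fps ceiling lines up with the passive dual-mode limits; a rough sketch (assuming a 300 MHz TMDS limit for passive DP++ Type 2 adapters, 600 MHz for HDMI 2.0, and standard CTA 4K timing):

    # Sketch: max 4K refresh rate for a given TMDS clock limit.
    TOTAL_PIXELS_PER_FRAME = 4400 * 2250     # CTA 3840x2160 timing incl. blanking

    limits_mhz = {
        "passive DP++ Type 2 (300 MHz)": 300,
        "active adapter / HDMI 2.0 (600 MHz)": 600,
    }

    for name, mhz in limits_mhz.items():
        max_refresh = int(mhz * 1e6 / TOTAL_PIXELS_PER_FRAME)
        print(f"{name}: up to ~{max_refresh} Hz at 4K")
    # -> ~30 Hz through a passive dongle, ~60 Hz once something actively converts the link.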


> A cheap DP-HDMI dongle makes all this go away. As long as VESA doesn't behave the same way, anyway.

You can even get cables that are DP on the input and HDMI on the output with minimal bulk.


But this feature works by transmitting HDMI signals so you still need to implement the software part in question.


No, it doesn't. HDMI alt mode doesn't exist in the real world.


I think the connector could be better in terms of reversibility, and without that weird locking it has, but it's an incredible technical achievement. And it seems perpetually ahead of HDMI in terms of features and bandwidth.


> And it seems perpetually ahead of HDMI in terms of features and Bandwidth.

Not for the past few years. Although DP 2.0 has been 'released', there are no products actually shipping with it, and in practice DP 1.4 is the latest standard. DP 1.4 can't transmit 3840×2160 at 4:4:4 144 Hz without Display Stream Compression (DSC). HDMI 2.1 can.
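
Rough numbers behind that, as a sketch (active pixels only, blanking ignored, so the real requirement is even higher):

    # Sketch: data rate for 3840x2160 @ 144 Hz, 8 bpc RGB/4:4:4, vs. common link payloads.
    active_gbps = 3840 * 2160 * 144 * 24 / 1e9       # ~28.7 Gbit/s of pixel data alone

    links = {
        "DP 1.4 (HBR3: 4 x 8.1 Gbit/s, 8b/10b)":  4 * 8.1 * 8 / 10,   # ~25.9 Gbit/s payload
        "HDMI 2.1 (FRL: 4 x 12 Gbit/s, 16b/18b)": 4 * 12 * 16 / 18,   # ~42.7 Gbit/s payload
    }

    print(f"4k144 at 8 bpc needs more than {active_gbps:.1f} Gbit/s")
    for name, payload_gbps in links.items():
        verdict = "enough without DSC" if payload_gbps > active_gbps else "needs DSC"
        print(f"{name}: ~{payload_gbps:.1f} Gbit/s -> {verdict}")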


> Although DP 2.0 has been 'released', there are no products actually shipping with it

Here's a GPU with DisplayPort 2.1 https://www.mindfactory.de/product_info.php/16GB-Sapphire-Ra...

Here's a Display with DisplayPort 2.1 and higher Resolution&Refresh Rate than you specified https://www.mindfactory.de/product_info.php/57Zoll--144-78cm...


That monitor was released not three months ago; I was describing the 99.99999% case. DP 2.0 is rare enough that an article[1] was produced this year about its scarcity.

[1]: https://tftcentral.co.uk/articles/when-is-displayport-2-1-go...


Is there a point for a monitor to use DP 2.0 when the only formats it can display are all supported by 1.4?


And there is DP-Alt for USB Type-C, which makes it much nicer than a regular DP connector.


Locking is a feature, not a bug


Especially when your monitor falls off your desk instead of simply disconnecting at the plug.


Easily solved by proper cable management.


Locking in general, yes; the specific implementation of the locks on the DP plug is annoying.


It was all about SCART, which is probably the only port worse than USB for having to rotate the connector several times before getting the correct orientation to insert it


I don't miss all the variants of DVI: DVI-D, DVI-I, DVI-A, single-link, dual-link. But people would just see the cable and think, "ah, I know this, it's a DVI cable".


I've decided to skip 4K for now (at least a decade), switching to 2560x1440 (finally moving on from 1200p).

I normally run it at 120 Hz over DP, but it will work fine over HDMI 1.1 at 60 Hz. My (5+ years old) TV runs at 1080p/120Hz just fine too.


The connector (and port) was too chunky for laptops, and you didn't get sound over DVI, but other than that it worked just fine each and every time. I never had issues upgrading stuff from VGA to DVI in an industrial environment back then.


Apple had a full-size DVI connector on their PowerBook G4 series of laptops.


All the G4 PowerBooks I remember used mini-DVI?

https://en.wikipedia.org/wiki/Mini-DVI

Which is cool, but not quite as cool as micro-DVI, which they used for exactly one MacBook Air generation before mini-DisplayPort arrived on the scene

https://en.wikipedia.org/wiki/Micro-DVI

Edit: you are right, though. I just pulled up pictures of early G4 variants, and there is indeed full-size DVI. I completely forgot that.


I actually like not having sound in the same cable; combining them has generally made my life more complicated. But my use case could be different from others'.

Even DVI had HDCP though, so it too was flawed :-( https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content...


DisplayPort has been good to me



