

Ask HN: Why is HDMI input/picture-in-picture not a thing on PC? - Someone1234

I feel like I have a fairly good technical grasp of both HDMI and of PCs end-to-end. However I cannot for the life of me figure out why HDMI (and similar digital connectors) cannot be trivially read and displayed on a PC (e.g. Windows, OS X, Linux, etc).

Back in the VGA/analogue days it made a lot of sense why a "WinTV"-type card was required: take the analogue signal, convert it to digital, and do a bit of internal processing, since CPUs weren't powerful enough to handle it solo and RAM was limited (even for the resolution).

Now all of those issues are "solved." HDMI is already digital, the cable and connectors are identical on both ends, and really all a PC has to do is have a driver which hooks the HDMI port and "pretends" to be a monitor/digital input, rather than a digital output.

So why, purely in technical terms, can't I plug my cable TV box into my PC via HDMI and get a little picture-in-picture TV feed? Is it literally just the lack of a driver?
======
Pyrodogg
The flip side of your purely technical viewpoint is exactly why the industry
created HDCP. The 'perfect' data-transfer method (with respect to analogue
copying) that you see as a major benefit is a potential disaster that would
enable recording and redistribution of near-perfect copies of copyrighted
sources.

There isn't a compelling technical reason why a PC couldn't have an HDMI
input; the obstacle is entirely legal. The industry would never sanction raw
HDMI signals being read by an unrestricted general-purpose computer. It would
be equivalent to having no DRM at all, letting you record high-definition
copyrighted material.

I know that dumping to disk is different from actually reprocessing the signal
to splice it into an immediate output, but I don't think there's much of a
difference there.

If all you really want is PiP, look for a monitor that supports displaying
multiple inputs simultaneously. This isn't a feature that should require a PC.

I'd love to live in a world where actions such as recording and storing
weren't punished just because distribution is a protected right in some
countries.

~~~
walterbell
Hopefully this changes as the volume of CC-licensed video increases.

------
mtmail
Isn't the issue licence fees because your PC wouldn't be an "authorized
device"?

[https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content_Protection](https://en.wikipedia.org/wiki/High-bandwidth_Digital_Content_Protection)

------
robocat
1\. Expense: an HDMI input costs money for hardware and an HDCP license, so
you won't see it on cheap laptops.

2\. Most people don't want the feature (they might if they could rip HD
content, but HDCP licensing prevents that). Few people otherwise want to use
their laptop as a small screen for another device.

Some expensive gaming laptops have an HDMI input, e.g. the Alienware M18x has
an HDMI v1.3 input (I have seen it on Asus gaming laptops too).

Technically, HDMI input cards are available:
[https://www.google.com/search?q=HDMI+capture](https://www.google.com/search?q=HDMI+capture)

------
nevdka
The HDMI port on the back of your computer is purely an 'output' port. There
is no circuitry there to read an input signal. Adding this to existing chips
would make them more expensive, and most customers would never use this
functionality, so the chip companies don't bother adding it.

So yeah... it's a hardware issue, not a driver issue.

~~~
UnoriginalGuy
Where did you get this information? Looking at the HDMI spec, what you're
saying makes no sense: everything is just a data channel, and there is nothing
specific to input versus output.

In fact, if what you said were true, they would need a crossover within the
cable, which they don't have. So are you saying they null-terminate half of
HDMI's data channels?

Yeah, you need to explain this, it seems utterly at odds with all the
information I can find. And I cannot wrap my head around the design of the
system you're describing (i.e. the logistics of an "input only" or "output
only" digital system).

~~~
wmf
Pins 1-12 (which carry the audio/video data) are unidirectional; some of the
other pins are bidirectional but very low bandwidth.

~~~
UnoriginalGuy
That doesn't remotely address anything I nor the person above said. Nobody is
debating whether HDMI has sinks and sources; it does. What is being asked is
how you'd design the source so that it could not act as a sink on the
electrical lines of an all-digital system.

I just read over the spec again looking for what you might be referencing but
I cannot locate it. Here's a link:

[http://www.microprocessor.org/HDMISpecification13a.pdf](http://www.microprocessor.org/HDMISpecification13a.pdf)

Can you be more specific?

~~~
pdx
There are lots of callouts of source vs. sink, but page 38 will perhaps
convince you. It shows the sink as the input to a differential op-amp, while
the source is a current source.

The same lines are connected to different circuits, depending on if you're a
source or a sink.

EDIT: This looks like an interesting little box, if somebody does need the
ability to read HDMI. You would still need to save it and then re-stream it, I
think, to get it to your monitor.
[https://www.blackmagicdesign.com/products/h264prorecorder](https://www.blackmagicdesign.com/products/h264prorecorder)

~~~
UnoriginalGuy
The diagram on page 39 kind of contradicts that. In fact it indicates they
have exactly the same circuitry.

~~~
pdx
No, it doesn't.

------
wmf
The hardware to scan out from memory to HDMI is different from the hardware to
scan HDMI into memory, and GPUs just didn't bother to implement the latter.
There are cheap HDMI PCIe capture cards that are basically just a DMA engine.
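Once a capture card's DMA engine has deposited frames in memory, splicing them into an immediate output as picture-in-picture is plain array manipulation. A minimal sketch in Python with numpy (frame sizes, function name, and the nearest-neighbour downscale are illustrative choices, not anything a real driver or GPU pipeline mandates):

```python
import numpy as np

def composite_pip(main: np.ndarray, inset: np.ndarray,
                  scale: float = 0.25, margin: int = 16) -> np.ndarray:
    """Paste a downscaled copy of `inset` into the bottom-right corner
    of `main`. Frames are H x W x 3 uint8 arrays, roughly as a capture
    card's DMA engine might deposit them in memory."""
    h, w = main.shape[:2]
    ph, pw = int(h * scale), int(w * scale)
    # Nearest-neighbour downscale via index arithmetic (numpy only).
    ys = np.arange(ph) * inset.shape[0] // ph
    xs = np.arange(pw) * inset.shape[1] // pw
    small = inset[ys][:, xs]
    out = main.copy()
    out[h - margin - ph : h - margin, w - margin - pw : w - margin] = small
    return out

# Example: a grey 1080p "desktop" frame with a red 720p "TV" inset.
desktop = np.full((1080, 1920, 3), 64, dtype=np.uint8)
tv = np.zeros((720, 1280, 3), dtype=np.uint8)
tv[..., 0] = 255  # red channel
frame = composite_pip(desktop, tv)
```

In practice a compositor would do this per frame on the GPU, but the point stands: the hard part is getting the pixels into memory at all, not displaying them.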

