
Ask HN: Why isn't wireless connection to computer monitors ubiquitous? - aloukissas
There might be solutions on the market (I've seen one that transmits a VGA signal over a USB dongle - very low-res for today's screens), but none of them are widespread. Instead, pretty much everyone connects to an external monitor with a cable (and if you're on a Mac, add a dongle or two).

I'm definitely not an EE person, but transmitting UHD data probably stretches the limits of the physical layer (although there are some reasonable assumptions we can make - e.g. very close proximity, clear line of sight, etc.). Still, something as basic as casting a Chrome tab to a TV works OK, and maybe it can be improved.

HN readers more hardware-inclined than me, I'd appreciate your responses.
======
photoGrant
It takes a LOT of bandwidth. Otherwise the quality degrades, and we don't
accept 90% quality.

There are options, but you start to push towards licensed frequencies, and then
your six feet of cable-free freedom costs $30k.
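
To put numbers on "a LOT of bandwidth", here's a back-of-the-envelope sketch (the figures for 4K60 at 24 bits per pixel are standard; the real-world 802.11ac throughput is an assumed optimistic ballpark, not a measurement):

```python
# Raw bandwidth of an uncompressed 4K60 video signal.
width, height = 3840, 2160        # 4K UHD resolution
fps = 60                          # refresh rate
bits_per_pixel = 24               # 8-bit RGB, no chroma subsampling

raw_bps = width * height * fps * bits_per_pixel
print(f"Raw 4K60: {raw_bps / 1e9:.1f} Gbit/s")   # ~11.9 Gbit/s

# Compare against what consumer WiFi realistically delivers.
wifi_ac_real_bps = 0.5e9          # assumed optimistic 802.11ac throughput
ratio = raw_bps / wifi_ac_real_bps
print(f"Compression needed over 802.11ac: ~{ratio:.0f}x")
```

That ~24x gap is why every wireless display scheme either compresses heavily (visible artifacts, latency) or moves to exotic spectrum.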

~~~
aloukissas
But even for non-video use cases? E.g. development or photo editing, where the
fps probably isn't that important.

~~~
aloukissas
Naive question: I can stream 4K to my TV over general-purpose 802.11ac (which
is non-directional and doesn't require line of sight). I'd imagine a
purpose-built protocol/PHY should be able to push even more bits over a
sub-1 ft distance with direct LoS, no?

~~~
morabitom
I think the issue with the TV streaming example is that you'd have no idea
what the lag is: the delay could be two whole seconds, but as long as the
video doesn't skip you wouldn't even know. Moving a mouse around, by contrast,
makes even the smallest amount of delay extremely noticeable.

~~~
aloukissas
Latency is a thing for sure. But we're talking a few inches of distance that
the signal needs to travel, so I don't think it's a significant issue. The
latency when "casting desktop" to TV is pretty tolerable (to the point that I
hadn't noticed it until I read your comment).

~~~
easytiger
Try typing while casting for a visceral demonstration of the latency. Also,
you might not notice it, but that cast is generally heavily compressed video.
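
The distance argument misses where the delay actually lives: propagation over a foot of air is nanoseconds, while the encode/buffer/decode pipeline is tens of milliseconds. A sketch (the pipeline figures are assumed ballpark values for a casting stack, not measurements):

```python
# Signal propagation over a short wireless link vs. codec pipeline latency.
distance_m = 0.3                  # ~1 ft between laptop and monitor
c = 3e8                           # speed of light, m/s
propagation_ns = distance_m / c * 1e9
print(f"Propagation delay: {propagation_ns:.1f} ns")   # ~1 ns

# Assumed typical casting pipeline stages, in milliseconds:
encode_ms, network_buffer_ms, decode_display_ms = 15, 50, 15
total_ms = encode_ms + network_buffer_ms + decode_display_ms
print(f"Pipeline latency: {total_ms} ms")
# The pipeline dominates propagation by roughly 8 orders of magnitude,
# which is why typing feels laggy even over a few inches.
```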

------
PaulHoule
I think the best technology is WiFi-based, such as Chromecast, Miracast, or
AirPlay.

My TV upstairs is connected to an Xbox One that has a Miracast receiver app
installed, and it works well with my Windows laptop. I also have a Miracast
dongle that attaches to any HDMI port; that one lives in my go bag.

Still, if I want to play a console-style game I don't want the laptop on my
lap, so I get a USB-C to HDMI cable, plug it into my A/V receiver, and put
the computer wherever -- the cable is so thin and long I could have it on my
lap on the couch if I wanted.

These technologies will benefit from advances in WiFi. If they have a big
flaw, it is that frequencies and radio resources are shared, so it may well be
that all that WiFi transmission degrades the Bluetooth channel, which makes
the game controller unreliable. Some devices have better radio compatibility
than others, but vendor A's hardware might just have trouble with vendor Q's
in some circumstances, so it is hard to deliver a brand promise the way HDMI
does.

Schemes based on other technologies have had limited success. I think 60 GHz
and UWB are highly dependent on path characteristics: even at close range
(inches), a misalignment, or occlusion by the metal ground screen in the LED
panel or any other metal, can cause the picture to drop out.
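
The frequency dependence can be quantified with the standard free-space path loss formula, FSPL = 20·log10(4πdf/c). A sketch comparing 60 GHz against 5 GHz WiFi at the same short range (and note FSPL doesn't even capture the bigger problem: millimeter waves barely diffract, so any metal in the path blocks them outright):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

d = 0.3  # ~1 ft, the close-range case discussed above
for f in (5e9, 60e9):
    print(f"{f / 1e9:.0f} GHz at {d} m: {fspl_db(d, f):.1f} dB")
# 60 GHz loses ~21.6 dB more than 5 GHz at any given distance
# (20*log10(60/5)), before accounting for blockage and misalignment.
```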

------
sesuximo
1. Why should this be wireless? Wires are a robust solution.

2. You'd need your monitor and your video source (computer, phone, whatever)
to agree on a protocol. I know my Samsung TV can get a signal from a Samsung
phone, but my iPhone needs an adapter app. Maybe one day there will be a
standard, but don't hold your breath.

------
quickthrower2
Because normally the monitor is pretty close to the computer anyway.

All in ones are a solution to this too.

Necessity is the mother of all invention.

