
The Low Latency Live Streaming Landscape in 2019 - mmcclure
https://mux.com/blog/the-low-latency-live-streaming-landscape-in-2019/
======
jdietrich
Interesting example: there's currently a major squabble in the British
gambling industry over the use of drones to provide low-latency streams of
horse racing. Off-track betting is legal here, as is in-play betting - you can
place a bet right until the winning horse crosses the line.

The latency for a typical satellite broadcast is about 10 seconds, so several
gambling syndicates operate their own drones to provide a private low-latency
stream, thereby gaining a huge advantage on in-play betting. The drones are
being flown legally in uncontrolled airspace, so the racetracks are at a
loss as to how to respond.

[https://www.theguardian.com/sport/blog/2019/jan/16/talking-horses-racecourses-vow-to-ban-drone-giving-punters-an-edge-horse-racing-tips](https://www.theguardian.com/sport/blog/2019/jan/16/talking-horses-racecourses-vow-to-ban-drone-giving-punters-an-edge-horse-racing-tips)

[https://www.racingpost.com/news/open-skies-authority-says-nothing-to-stop-drones-being-flown-near-racecourses/364002](https://www.racingpost.com/news/open-skies-authority-says-nothing-to-stop-drones-being-flown-near-racecourses/364002)

~~~
throw_away2
How do they prevent the people at the actual races from sending out
information and skipping the whole drone complication?

~~~
jdietrich
They can't. Having someone in the grandstands with a mobile phone used to be
the norm, but a drone provides more information with less latency. As in
financial trading, there's a huge financial advantage to knowing more than
your rivals or knowing it faster. The time it takes to say "two and five have
fallen" might be the difference between profit and loss on a race.

~~~
fireattack
I knew nothing about horse racing so bear with me:

You can still bet after the race has already started? Because otherwise I
don't quite understand why latency would matter in this case.

~~~
philtar
You can bet until the last second of the race

------
mgamache
Chunked video over Websockets (or chunked transfer) and scaled out WebRTC are
just hacks. Browsers need to support proper low latency video streaming.
There's a reason Skype and Zoom use desktop clients. If you are wondering,
WASM is not the answer; you need proper access to the hardware decoders,
especially on mobile devices.

We had pretty good low latency video a decade ago with Flash but of course
with HTML5 you don't need Flash </sarcasm>

~~~
pier25
Hangouts or Google Meet are notoriously crappy compared to Skype.

Appear.in is also WebRTC-based and seems to work better, but its screen
sharing is not as good as Skype's either.

------
GeneticGenesis
Author here, there's lots of interesting innovations happening in this space.
If anyone knows of anything interesting I've missed, please let me know!

~~~
cagenut
Is there anything more you can talk, write, or link to about "cdn" support for
sub-200ms streaming. Like is there a "varnish for WebRTC" or something like it
(really more of an ircd for video)? Such that someone could build out a live
streaming platform that can be conversationally interactive and yet not
require a 1:1 back to origin or mesh connection setup.

~~~
GeneticGenesis
As far as I'm aware there aren't any public CDNs that support the sub-200ms
approaches which don't involve buying a full solution from that vendor.

* Limelight's solution is only really accessible through their low latency streaming products provided by Red5.

* Fastly and Akamai are both chasing Ultra low latency through chunked CMAF delivery.

On a slightly smaller scale, Cloudflare supports websockets, so you could use
a protocol like the one Wowza are using (WOWZ).
[https://blog.cloudflare.com/cloudflare-now-supports-websockets/](https://blog.cloudflare.com/cloudflare-now-supports-websockets/)

If you're interested in a toolkit to build out a WebRTC-style CDN edge, I
think the best place to start would probably be with GStreamer's new WebRTC
tooling:
[https://opensource.com/article/19/1/gstreamer](https://opensource.com/article/19/1/gstreamer)
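To make the chunked-CMAF idea mentioned above concrete, here's a minimal sketch of the underlying transport trick: HTTP chunked transfer encoding, where the server flushes each fragment the moment the encoder produces it instead of waiting for a full segment. The payloads below are placeholder bytes rather than real CMAF fragments, and all names are made up for the demo.

```python
# Sketch of chunked transfer delivery: each "fragment" is framed as
# <hex length>\r\n<data>\r\n and flushed immediately, so a player can
# start decoding before the segment is complete.
import http.server
import threading
import time

CHUNKS = [b"moof+mdat-1", b"moof+mdat-2", b"moof+mdat-3"]  # stand-ins for CMAF fragments

class ChunkedHandler(http.server.BaseHTTPRequestHandler):
    protocol_version = "HTTP/1.1"  # chunked encoding requires HTTP/1.1

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "video/mp4")
        self.send_header("Transfer-Encoding", "chunked")
        self.send_header("Connection", "close")
        self.end_headers()
        for chunk in CHUNKS:
            self.wfile.write(f"{len(chunk):x}\r\n".encode() + chunk + b"\r\n")
            self.wfile.flush()          # push the fragment out now
            time.sleep(0.05)            # simulate encoder pacing
        self.wfile.write(b"0\r\n\r\n")  # end-of-stream marker

    def log_message(self, *args):       # keep the demo quiet
        pass

def serve_once():
    """Serve a single request on an OS-assigned port in a background thread."""
    server = http.server.HTTPServer(("127.0.0.1", 0), ChunkedHandler)
    threading.Thread(target=server.handle_request, daemon=True).start()
    return server
```

A real LL-HLS/DASH setup does the same thing with fMP4 `moof`/`mdat` fragments, which is why a player can join a segment only a few hundred milliseconds behind the encoder.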

~~~
newman314
IIRC BitGravity had a focus on the streaming video CDN space when I looked a
few years ago. Not clear if they have any special sauce in this area.

------
lowestlatency
Would any of the experts from this thread be able to help out with a fun
project involving low-latency video?

I'm trying to get video from a raspi + webcam attached to a drone car that's
controlled from a web app. Eventually, I'd like to use it to let visitors
from around the world drive around my apartment and maybe play treasure
hunt/escape room kind of games.

In order for internet-folk to operate the drone without frustration I need the
lowest latency video streaming I can get. Currently using an mJPEG stream but
it has no audio and doesn't take advantage of the on-board h.264 encoding of
the webcam (Logitech C920). Car operation and the web app are already done.
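For what it's worth, one low-latency route is to pull the C920's on-board H.264 directly via V4L2 with GStreamer and ship it as RTP, skipping any re-encoding on the Pi. This is a sketch, not a tested recipe: the device path, caps, and port are guesses, so check yours with `v4l2-ctl --list-formats-ext` first.

```shell
# Grab hardware-encoded H.264 straight from the camera (no re-encode on
# the Pi) and send it as RTP over UDP to the viewer's machine.
# /dev/video0, the caps, and port 5000 are assumptions; adjust as needed.
gst-launch-1.0 -v v4l2src device=/dev/video0 \
  ! video/x-h264,width=1280,height=720,framerate=30/1 \
  ! h264parse ! rtph264pay config-interval=1 pt=96 \
  ! udpsink host=VIEWER_IP port=5000 sync=false
```

For delivery into a browser-based web app, the same source can instead feed GStreamer's `webrtcbin` element, the WebRTC tooling mentioned elsewhere in this thread.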

lowlatencystreaming@protonmail.com

~~~
yodon
You probably need to change the problem. Instead of doing real time control of
the steering wheel and speed you probably need to let people click where they
want to go, and then let them watch the car drive there before they make their
next click. Switch to a turn-based game feel and use the driving time and the
decision making time to hide the latency.

------
vanwalj
Not really the subject here, but I don't understand why video CDN providers
don't provide peer to peer solutions yet, such as Streamroot or Peer5

~~~
monocasa
I know the ISPs are against it, to the point that Netflix essentially
threatened to switch to a p2p model if the Comcast peering fiasco wasn't
solved amicably.

[https://arstechnica.com/information-technology/2014/04/netflix-researching-large-scale-peer-to-peer-technology-for-streaming/](https://arstechnica.com/information-technology/2014/04/netflix-researching-large-scale-peer-to-peer-technology-for-streaming/)

~~~
jakecopp
Why are the ISPs against this?

------
devwastaken
UDP over WebSockets would solve this problem pretty quickly, but we're stuck
with WebRTC, which has few open source servers and none that perform as well
as they should. It's effectively proprietary: services like Discord build
their own server software to handle WebRTC at scale, and good luck building
that yourself.

------
jcrawfordor
Last night I was watching the State of the Union via C-SPAN's YouTube stream.
My husband walked into the living room, having just driven back from an
errand, and jokingly recited the next several sentences along with Trump.

He had been listening to the local public radio station in the car, and by
comparing the TV to the stereo's FM receiver I was surprised to find that the
youtube stream was about a full minute behind the radio. I had expected that
YouTube would lag behind traditional media that tend to be using things like
dedicated satellite relay capacity to spread live events to broadcasters, but
I was very surprised at how large the difference was - large enough to
notice in today's world of people commenting on public events on real-time
media like Twitter.

Indeed, watching Twitter I could see responses to parts of the speech that I
hadn't seen yet. It must be really interesting for sporting events, where a
social media post about a play could "spoil" the game.

~~~
puzzle
That's interesting. For the London 2012 Olympic Games, YT had the rights to
live stream in tens of countries in Asia and Africa, basically everywhere
where the Internet rights hadn't been bought by broadcasters. For latency and
reliability, they ran new fiber from 30 Rock (NBC) to a Google POP. Unrelated,
Google Fiber also got a license for antennas in the Iowa datacenter:
[https://www.google.com/about/datacenters/gallery/#/places/1](https://www.google.com/about/datacenters/gallery/#/places/1)

In this case, though, it's not YouTube pulling streams from the source, it's
C-SPAN pushing them. I wonder what kind of setup is in place there.

~~~
jcrawfordor
It would have been interesting to compare the different live streams for
latency since at least a half dozen news organizations were putting SOTU on
YouTube - an experiment for next time.

------
baybal2
ProTIP: multicast

Been working like magic since the dawn of time. One downside: if you are not
an ISP - bad luck...
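For anyone who hasn't touched it, the socket mechanics are tiny; the sketch below sends one datagram to a multicast group and reads it back on a subscribed receiver. The group address and port are arbitrary demo values, and real ISP-scale multicast additionally needs IGMP-aware routing between you and the viewers, which is the "if you are not an ISP, bad luck" part.

```python
# Toy illustration of why multicast suits live video: a single send
# reaches every receiver that has joined the group.
import socket
import struct

MCAST_GROUP = "224.1.1.1"  # arbitrary demo group
MCAST_PORT = 5007          # arbitrary demo port

def make_receiver(group=MCAST_GROUP, port=MCAST_PORT):
    """Socket subscribed to the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", port))
    # Ask the kernel to join the group on the default interface.
    mreq = struct.pack("4sl", socket.inet_aton(group), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    sock.settimeout(2.0)
    return sock

def send_chunk(payload, group=MCAST_GROUP, port=MCAST_PORT):
    """One send; every subscribed receiver on the segment gets a copy."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)   # stay local
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_LOOP, 1)  # see our own send
    sock.sendto(payload, (group, port))
    sock.close()
```

The sender's cost stays constant no matter how many receivers join, which is exactly the property unicast HTTP streaming lacks.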

~~~
GeneticGenesis
Great point!

The BBC actually did some really interesting work last year on using MPEG-DASH
with Multicast with a demo at IBC, the research can be found here:
[https://www.bbc.co.uk/rd/projects/dynamic-adaptive-streaming-ip-multicast-dasm](https://www.bbc.co.uk/rd/projects/dynamic-adaptive-streaming-ip-multicast-dasm)

~~~
bitbang
Nice! I had been wondering if the move to HTTP/3 over UDP would open up
possibilities for media over a multicast HTTP transport. Looks like somebody's
already doing just that.

Only downside is that multicast sucks over wifi when many devices (like in a
corporate env) are all trying to view a multicast stream. The time slice
allotted to multicast is way too small to handle it. I really wish there was a
spec created for wifi simplex connections where a channel could be reserved
for broadcast with one signal to be consumed by many receiver devices.

------
oconnore
For real-time communication, I'm not sure why people find high latency to be
so difficult. You just pretend everybody is taking a moment to think. Virtual
“talking sticks” or radio etiquette aren't hard to add either.

What is difficult is audio quality. I wish voice and video software had a
way to prioritize quality at the expense of latency (i.e. never compromise
on delivering clear voice, but pause input or fast-forward through quiet
segments as needed to stay roughly in sync).
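That trade-off can be modeled with a toy playout buffer: audible chunks are never dropped, and when the buffer gets too deep the receiver catches up by skipping chunks flagged as silence. This is a sketch of the idea only, not how any real client implements it.

```python
# Toy receive buffer for "quality over latency": speech is never dropped;
# when backlog grows, leading silent chunks are fast-forwarded through.
from collections import deque

class PlayoutBuffer:
    def __init__(self, max_depth=2):
        self.queue = deque()
        self.max_depth = max_depth  # chunks of backlog we tolerate

    def push(self, chunk, silent):
        """Enqueue an incoming chunk, tagged by a silence detector."""
        self.queue.append((chunk, silent))

    def next_chunk(self):
        # Fast-forward through silence only while we're behind; audible
        # chunks are always delivered intact.
        while len(self.queue) > self.max_depth and self.queue[0][1]:
            self.queue.popleft()
        return self.queue.popleft()[0] if self.queue else None
```

The effect is that latency spikes are absorbed during pauses in conversation instead of by garbling or dropping someone mid-sentence.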

~~~
Obi_Juan_Kenobi
For what audience?

A ham radio operator is very aware of the technical process going on, the
proper etiquette, etc. It's pretty easy to adapt in that situation.

A random Twitch viewer has likely never thought about broadcast latency
before, and has no real reason to. It's just an unintuitive experience when
chat reacts to things that happened many seconds ago.

Voice chat is probably the worst because the network and software are
notoriously unreliable. Is it latency, is something broken, did they just not
hear me? Are we talking over each other, something that can easily happen in
casual conversation vs. a radio transmission? That mental overhead is
constantly there. Imagine trying to explain to your mother that she should end
all her statements with "over" and confirm whenever she hears something. Are
you really not sure why people find this difficult, or are you just so proud
of your own technical knowledge that you've lost any kind of reasonable
perspective on the issue?

Lower latency is an unambiguous improvement in every regard; of course people
care about it.

~~~
oconnore
How on earth did this turn into an ad hominem about my technical pride?
Because I’m ok with slowing down the pace of a conversation?

~~~
FooHentai
No, but because you chose to open with:

>I’m not sure why people find high latency to be so difficult.

Phrasing it that way makes it likely you will be interpreted as believing
you're effortlessly better than everyone else at the subject at hand.

