With HTTP-based streaming there's no shared wall clock between the server and the client. There's also no guarantee the client will play the stream continuously, or even hit the same edge server for two consecutive segments. Tracking any of that is state an edge server would need to maintain, and even share between nodes, and it would be different for every client and every stream served from that node.
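For a sense of how little the server has to know, here's a rough sketch of the client side of segment fetching (Python, with a made-up edge host and segment naming): every segment is an independent GET that any edge node can answer, and nothing ties one request to the next.

    import urllib.request

    # Hypothetical edge host and HLS-style segment names, purely for illustration.
    BASE = "https://edge.example.com/stream"
    segment_urls = [f"{BASE}/seg{n:05d}.ts" for n in range(3)]

    for url in segment_urls:
        # A plain HTTP GET: no session, nothing for the server to remember,
        # no requirement that the same node serves the next segment.
        with urllib.request.urlopen(url) as resp:
            data = resp.read()
        print(f"{url}: {len(data)} bytes")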

For streaming you actually want the client to keep a buffer past the play head, and if the client can buffer the whole stream, in many cases it makes sense to let it: the client then leaves your infrastructure alone even if it skips around or pauses the content for a long time. The only limits that really make sense are per-connection bandwidth limits and overall connection limits.
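As a toy illustration of that buffering behavior (the numbers are made up), the client just keeps fetching until it's comfortably ahead of the play head, and then stops asking the server for anything:

    import collections

    TARGET_BUFFER_SECONDS = 30   # hypothetical buffer target
    SEGMENT_SECONDS = 4          # hypothetical segment duration

    buffer = collections.deque() # downloaded-but-unplayed segments
    buffered = 0
    next_segment = 0

    def fetch(n):
        # Stand-in for the HTTP GET of segment n.
        return f"segment-{n}"

    # Keep downloading while under the target; once the buffer is full,
    # pauses and seeks within it never touch the server again.
    while buffered < TARGET_BUFFER_SECONDS:
        buffer.append(fetch(next_segment))
        buffered += SEGMENT_SECONDS
        next_segment += 1

    print(f"{buffered}s buffered past the play head")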

The whole point of HTTP-based streaming is to minimize the amount of work required on the server and push more capability to the client. It's meant to allow servers to be dumb and stateless. Any state you add, even if it's negligible per client, ends up being a lot of state in aggregate. If such a system meant edge servers could handle 1% less traffic, server costs go up by roughly 1%. Unless the handful of ad impressions skipped by youtube-dl users comes anywhere close to 1% of ad revenue, it's pointless for Google to bother.
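Back-of-envelope version of that trade-off, with obviously made-up dollar figures just to show the shape of the comparison:

    # All numbers here are hypothetical placeholders, not real Google figures.
    edge_serving_cost = 100_000_000   # annual edge infrastructure cost ($)
    capacity_penalty = 0.01           # pacing enforcement costs 1% of edge capacity
    extra_infra_cost = edge_serving_cost * capacity_penalty

    recovered_ad_revenue = 50_000     # ad revenue recovered from blocked youtube-dl skips ($)

    print(f"extra infrastructure cost: ${extra_infra_cost:,.0f}")
    print(f"recovered ad revenue:      ${recovered_ad_revenue:,.0f}")
    print("worth doing" if recovered_ad_revenue > extra_infra_cost else "not worth doing")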



