
Even Amazon doesn't have it perfected for Live.

Live sports and VOD movies/TV are very different beasts.


I am 2, and I absolutely will get argued with by people who think they know better.

I'm also not going to criticise my peers because they could recognise me and I might want to work with them one day.


Yeah, try dealing with many frontends with mixed HTTP and HTTPS; it's a nightmare and won't always work. Additionally, you want security on content delivery for revenue protection reasons. The way you've massively oversimplified the BSD work suggests that you perhaps didn't understand what they did and why hardware offload is a good thing.

Subtitles are also complicated because you have to deal with different media player frameworks on the 40+ different players you deal with. Getting those players, which you may not own, to recognise multiple subtitle tracks can be a PITA.

Things look simple to a junior developer, but those experienced in building streaming platforms at scale know there are dragons when you get into the implementation. Sometimes developers and architects do overcomplicate things, but smart leaders avoid writing code, so it's an assumption to say things are being made overcomplicated.


> you perhaps didn't understand what they did

I read and understood their entire technical whitepaper. I get the what, I'm just saying that the why might not make as much sense as you might assume.

> 40+ different players you deal with

They own the clients. They wrote the apps themselves. This is Netflix code reading data from Netflix servers. Even if there are third-party clients (wat!?), that doesn't explain why none of Netflix's home-grown clients support more than 5 subtitle languages.

> Getting those players, which you may not own, to recognise multiple sub tracks can be a PITA.

This is a core part of the service, which everyone else has figured out. Apple TV for example has dozens of subtitle languages.[1]

With all due respect: Read what you just wrote. You're saying that an organisation that has the engineering prowess to stream at 200 Gbps per edge box and also handles terabytes of diagnostic log ingestion per hour can't somehow engineer the distribution of 40 KB text files!?

I can't even begin to outline the myriad ways in which these excuses are patent nonsense.

These are children playing with the fun toys, totally ignoring like... 1/3rd of the viewing experience. As far as the users are concerned, there's nothing else of consequence other than the video, audio, and text that they see on the screen.

"Nah, don't worry about the last one, that only affects non-English speakers or the deaf, we only care about DEI for internal hires, not customers."

[1] Just to clarify: I'm asking for there to be an option to select one language at a time from all available languages, not showing multiple languages at once, which is a tiny bit harder. But... apparently not that hard, because I have two different free, open-source video players on my PC that can do this so I can have my spouse get "full" subtitles in a foreign language while I see the "auto" English subtitles pop up in a different colour when appropriate. With Netflix I have to keep toggling between her language and my language every time some foreign non-English thing is said. Netflix is worth $362B, apparently, but hasn't figured out something put together by poor Eastern European hobbyists in their spare time.


See, you're confused because you think that the media player is owned by Netflix.

The browser gives you a certain level of control on computers (although you have to deal with the oddities of Safari), but when you go to smart TVs it's the wild west. Netflix does provide their tested framework to TV vendors, but it's still not easy: media playback often requires hardware acceleration, and the rendering framework isn't standard.

Developing for set-top boxes, multiple generations of phones, and smart TVs comes with all sorts of oddities. You think it's easy because you haven't done it.


Live streams have different buffering logic to video on demand. Customers watching sports will get very upset if there is a long buffer, but for VOD playback you don't care how big the buffer is. Segment sizes are short for live and long for VOD, because you need to adapt faster and keep buffers small for live, while longer segments are better for building up a buffer.
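
As a rough sketch of that difference (the numbers are illustrative assumptions, not any real player's tuning), a player profile might look like:

    # Illustrative values only, not any real service's tuning.
    def playback_profile(is_live: bool) -> dict:
        if is_live:
            # Short segments let the player switch bitrate quickly; a small buffer
            # keeps the viewer close to the live edge.
            return {"segment_seconds": 2, "target_buffer_seconds": 6}
        # VOD: longer segments mean fewer requests and better throughput, and a deep
        # buffer can soak up network dips without the viewer noticing.
        return {"segment_seconds": 6, "target_buffer_seconds": 60}

    print(playback_profile(is_live=True))
    print(playback_profile(is_live=False))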

Sorry, yeah, for some stupid reason I was not thinking about live streams.

Netflix has done massive amounts of work on BSD to improve its network throughput; that's part of how they enable file delivery from their CDN appliances. https://people.freebsd.org/~gallatin/talks/euro2022.pdf
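
The core of that work is keeping file bytes out of userspace on the way to the NIC (sendfile plus kernel TLS and NIC offload). A minimal Python sketch of just the plain sendfile idea, to show the shape (not Netflix's code, and without the TLS/offload parts):

    # Sketch: serve one file over one connection using the kernel's sendfile path,
    # so the payload never gets copied through a userspace buffer.
    # Netflix's FreeBSD work layers kernel TLS and NIC offload on top of this idea.
    import socket

    def serve_file_once(path: str, port: int = 8080) -> None:
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind(("", port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn, open(path, "rb") as f:
            conn.sendfile(f)  # kernel moves file -> socket directly where supported
        srv.close()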

They've also contributed significantly to open source tools for video processing, one of the biggest things that stands out is probably their VMAF tool for quantifying perceptual quality in video. It's probably the best open source tool for measuring video quality out there right now.
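
If you want to try VMAF yourself, the common route is an ffmpeg build with libvmaf; a hedged sketch (assumes such an ffmpeg is on your PATH and the two files match in resolution and framerate):

    # Sketch: score a distorted encode against its reference with ffmpeg's libvmaf filter.
    # Requires an ffmpeg build with libvmaf enabled; the VMAF score appears in ffmpeg's log output.
    import subprocess

    def vmaf(distorted: str, reference: str) -> None:
        subprocess.run(
            ["ffmpeg", "-i", distorted, "-i", reference, "-lavfi", "libvmaf", "-f", "null", "-"],
            check=True,
        )

    # vmaf("encode.mp4", "source.mp4")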

It's also absolutely true that in any streaming service, the orchestration, account management, billing and catalogue components are waaaay more complex than actually delivering video on-demand. To counter one thing you've said: mouse movement... most viewing of premium content isn't done on web or even mobile devices. Most viewing time of paid content is done on a TV, where you're not measuring focus. But that's just a piece of trivia.

As you said, you just don't like them, but they've done a lot for the open source community and that should be understood.


Yeah, I stand corrected. Video being one of the highest-entropy types of data probably means they face state-of-the-art throughput challenges, which are inherently tied to cost and monetization.

That said, free apps like TikTok and YouTube probably face higher throughput, so the user-pays model probably means Netflix is at the state of the art for high-volume, high-quality delivery (both app experience and content) rather than the sheer-volume, low-quality or premium-quality, low-volume ends of the market.

I mean serving millions of customers at 8 bucks per month, which is not quite like serving billions.


That's why you put your services behind a CDN, even if it's not cacheable traffic. Then you can rate limit what's coming to you.

With the cloud, that DDoS can bankrupt you by causing you unconstrained bills instead.
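
The rate-limiting half of that is what the CDN or edge proxy does for you at scale; as a toy illustration of the usual shape (a token bucket per client), not anyone's production code:

    # Toy token bucket, illustrative only; real edges do this per client/IP at massive scale.
    import time

    class TokenBucket:
        def __init__(self, rate_per_second: float, burst: float):
            self.rate = rate_per_second
            self.capacity = burst
            self.tokens = burst
            self.last = time.monotonic()

        def allow(self) -> bool:
            now = time.monotonic()
            self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return True
            return False  # over the limit: drop the request or return 429

    bucket = TokenBucket(rate_per_second=10, burst=20)
    admitted = sum(bucket.allow() for _ in range(100))
    print(f"{admitted} of 100 back-to-back requests admitted")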


Oh definitely. I should've been clearer - I meant: you still can't stop there, and you'll need a third party to take the traffic with either solution.


1) Almost nothing supports HDMI Ethernet; it uses an extra signal path in the cable to provide an Ethernet link between two devices. Both devices have to support the extra signals and one of them has to be able to route, so it's basically someone's bright idea that barely has any support.

2) I've designed a system like this for a TV rental company, although legally a general retail TV manufacturer wouldn't want to do this because it causes reputational damage.

3) There's material cost for the modem and then there's a subscription cost for the connectivity. Either of which would reduce their profitability.

I spent a decade working in consumer electronics, working with all the major brands you know well, many of the white-label companies who make the components, and the ODMs who make the boxes that get a brand stamped on them by whoever is buying them.

Ultimately the TV business is barely profitable; most big brands sell TVs as something of a loss leader so that they can sustain their brand name. You spend each night with a Samsung, LG or Sony remote in hand looking at their product? Then they're winning, in their eyes. Also, because of the relatively high value of a TV, it sustains their overall turnover without actually contributing to profitability. When a manufacturer launches a new TV they get about 8 months to make a profit on it; after that they're probably losing money because of downward pressure from retailers to drop the price. That's driven by consumer demand for cheaper rather than better products, by the way; consumers bear some responsibility for the state of the market.

The smart app systems cost the TV manufacturer money: they have to supply the servers and infrastructure. They may make a small commission if a customer signs up to a streaming service on their device, but otherwise your general use of the smart technology costs them money every day.

Ultimately, most TV manufacturers have zero interest in spying on you. LG's biggest blunders can all be traced back to a lack of care and due diligence in their handling of data. Most of the time the 'mass data collection' is just accidental: someone in the development team thought it would be a good idea to collect data, and some researcher is horrified by how much gets sent back. Sometimes someone gets the idea that viewing data could be used to put ads on the product, but ultimately they're not interested in what you watch; they're interested in grouping you into an advertising bucket so they can suggest you watch another movie with a Hemsworth in it.

I'm not saying that there shouldn't be oversight, or that these companies don't do stupid things for money, but ultimately there's never malice or a desire to spy. Most of the overreach is incidental to the overall goal.

If someone doesn't want to use smart TV tech, then I'd advise them to not connect the TV to the network. There are set-top boxes out there that can do the job easily enough, and some of them might not even spy on you. One thing to remember is that many Android boxes you buy online, especially the "IPTV" ones, are riddled with malware. So don't think that by disabling Samsung and going to Kodi, you're making yourself safer.


> Ultimately, most TV manufacturers have zero interest in spying on you.

Then where did ACR come from, and why do more and more TVs ship with it nowadays?


That's true, although the reality is that capacity in many providers' local infrastructure is much more limited than you'd like to think. Those 1000 users will only get 10G at best, and remember that in both cable and FTTH the spectrum allocation on the local segment is asymmetric.

You have a finite downlink capacity and a finite uplink capacity; users are not just competing for time on the wire, they're competing for spectrum. If everyone was on Ethernet to the home then you'd be right, but FTTH and cable are in physically contended spectrum in the cabinet/cable itself. Proper fibre Ethernet costs more per user than FTTH/cable because each user needs a port on a switch, instead of using TDMA with everyone on the same wire at the other end.
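
Back-of-envelope with the figures from above (illustrative only; real PON splits and take-up vary):

    # Rough arithmetic using the numbers mentioned above; purely illustrative.
    subscribers = 1000        # homes sharing the same local segment
    sold_gbps = 1.0           # headline rate each customer buys
    shared_down_gbps = 10.0   # what the segment can actually carry downstream

    print(f"oversubscription: {subscribers * sold_gbps / shared_down_gbps:.0f}:1")  # 100:1

    # If, say, 5% of those homes are pulling data flat-out at the same moment:
    active = subscribers * 0.05
    print(f"fair share per active user: {shared_down_gbps / active:.2f} Gbps")      # 0.20 Gbps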


There's not really any tension between "you should know their stated rates are bullshit" and "they should accurately describe the service they are willing and able to provide".

One is a stupid way to run a society and the other isn't.


You're never getting 1G 100% of the time; you're getting a target of 1G, and statistically, if you do a speed test, you probably won't be doing it at the same time as everyone else.
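
A quick way to see why that usually holds (the 0.5% activity figure is a made-up assumption for the sketch): with 1000 subscribers on a 10G segment, each independently flat-out half a percent of the time, the segment is actually saturated only around 1% of the time.

    # Illustrative only: assumes each subscriber independently maxes their 1G line
    # 0.5% of the time, and that the 10G segment saturates once 10 of them do so at once.
    from math import comb

    n, p, capacity_users = 1000, 0.005, 10
    p_ok = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(capacity_users + 1))
    print(f"chance the segment is saturated at any instant: {1 - p_ok:.1%}")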

A little knowledge can be dangerous. That's not patronising; it's that accurately describing your network topology to all customers is hard and easy to misunderstand. Some segments of the network will be heavily contended, and others will be underutilised. Being heavily contended might be undesirable, but pragmatically it's going to happen.

Where it breaks down with customers is where the segment is over-contended to the point where they consistently can't meet the product description. But that's not a service description issue, it's an investment problem, and if it's not being dealt with across the board by the provider, then that provider will be crap.

The bigger issue in this case is a lack of the effective competition that drives providers to offer a decent quality of service. Being more transparent won't help with an under-provisioned network if you have no choice. In markets with poor competition, poor service provision and capacity usually follow.


Nvidia's moat is CUDA; it's become so entrenched in development stacks that people have inertia about moving to anything else. I've been in situations where I've wanted to use other GPUs and everyone's said to me "but surely you want Nvidia?!" because it's the de facto standard now. AMD, Intel and others haven't done enough to break the Nvidia API monopoly in the past decade. Intel's done stuff with oneAPI and AMD sponsored that CUDA converter thing, but there hasn't been enough effort to break through even when that's fundamentally the blocker.

If AMD and Intel want to compete, they need to think about the early years of computer software development, when people would say "Oh, we're not going to port our software to the Amiga, PCs are now more popular." There is some portability in games for new GPUs, but it's in scientific and enterprise computing, where CUDA dominates, that they should have broken the stranglehold.


It's really not.

The issue is that Disney and others pay per TB/PB delivered, so if they want to deliver double the bandwidth they'd better make money doing it. Disney+ is barely break-even and a year ago was burning nearly $600m in debt.

Netflix has their own CDN nodes inside ISPs, which helps them keep costs down, but we now have ISPs like DTag demanding money from content platforms for peering bandwidth and lobbying legislators to tax content delivery to pay for their networks.

Add to that, most customers don't care enough about 4K* to actually pay what it costs to make and deliver. There's zero incentive for content platforms to increase spending to deliver high bandwidth streams.

* HDR has way more impact than UHD in customer testing.
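
To put rough numbers on the delivery side (the bitrates and the per-GB price are made-up illustrative assumptions; real CDN pricing and peering deals vary enormously):

    # Back-of-envelope only; bitrate and $/GB figures are assumptions, not real pricing.
    def delivery_cost_usd(bitrate_mbps: float, hours: float, usd_per_gb: float = 0.01) -> float:
        gigabytes = bitrate_mbps / 8 * 3600 * hours / 1000  # Mbps -> MB/s -> GB for the session
        return gigabytes * usd_per_gb

    hd = delivery_cost_usd(bitrate_mbps=5, hours=2)     # ~4.5 GB  -> ~$0.045 per stream
    uhd = delivery_cost_usd(bitrate_mbps=16, hours=2)   # ~14.4 GB -> ~$0.144 per stream
    print(f"2h HD: ${hd:.3f}  2h 4K: ${uhd:.3f}  (multiply by hundreds of millions of streams)")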

