Ask HN: Why don't many video hosting companies adopt PeerTube or P2P?
179 points by arunharidas on March 3, 2022 | 106 comments
I recently found PeerTube, and its technology is very nice. You can deliver video using P2P, torrent-like streaming whenever more than one user is watching. To me it looks like the hosting provider gets huge savings in bandwidth. But I don't see many companies making use of this technology.

I think Bitchute is using p2p to deliver videos, or is it, really?




I have some experience here: I co-founded a large-scale CDN some years ago. The short answer is that the perceived benefits don't match the decrease in quality of service, so the economics don't work. Content distribution is heavily skewed: roughly 1% of content generates 90% of traffic, so P2P may work for very popular content, but the long tail is not easily cached or served. Try to watch or download a torrent with a small number of seeders and you'll get the experience you would be providing for the end user. Internet networking is also tricky, and connectivity in different parts of the world varies widely; South America and Asia have almost none of the peering-hub infrastructure that is available in Europe.


This is more or less exactly why Spotify dropped P2P from its platform back in 2014.

Good summary here:

https://techcrunch.com/2014/04/17/spotify-removes-peer-to-pe...

"This is when the company took advantage of cached songs. Before today’s change, if you streamed a popular song for the first time, the client would download the song from other users, using peer-to-peer. All of this happened in the background, but it greatly contributed to making the overall user experience snappier."

"Yet, now that the company has many servers, using peer-to-peer in addition to direct downloads actually adds a bit of overhead. Moreover, the company has to maintain the peer-to-peer code base, and update it with each new version."


> the company has to maintain the peer-to-peer code base, and update it with each new version.

Really, that's (part of) their excuse? A bit lame, that. :|


What, you don’t think code has maintenance costs associated with it, just because it’s cool?


I might have agreed with you a few years ago but these days I see Spotify's perspective more clearly. Today I see code as a liability. Like the parts in my car it is eventually going to break in some strange way and cause me a headache. The only real solution is to never drive a car, but that's almost impossible if you want to get around. Same with code: it's a vehicle that gets a business to a destination, which is nice, but it can also be quite expensive when things go wrong. Efficient engineering teams balance this inherent risk against the needs of the business.


Every line of code has a cost. The question you have to ask yourself is whether that cost is worth the return on your investment.

No code is less expensive than no code.


So your origin needs to be able to handle the tail and P2P can handle the popular stuff no? It isn't all or nothing. P2P could absolutely be used to peak shave. CDNs also have cold start issues with long tail.

Time to first byte is pretty good, https://webtorrent.io/

To the OP, you should research some other options, are you sure that you are asking the right question?


Streaming video is not like static files from a torrent. The term "popular content" is a bit of a misnomer. Nominally, with streaming, a single uploaded video will be encoded into a number of different versions at different frame sizes and bitrates, packaged into individual stream segments. So while a particular video URL might be popular, the actual video segments served to viewers will depend heavily on the configuration/environment of all the viewing devices.

So a FHD stream for a video could be super popular while a 480p version might not have any viewers. When some 480p viewers load the stream they don't get any benefit from the P2P/cache popularity of the FHD stream.
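To make that concrete, here's a toy sketch of how one upload fans out into independently addressed objects (the ladder, segment length, and naming are made up for illustration, not how any particular CDN keys its cache). P2P/cache popularity accrues per (rendition, segment), not per video:

    // Illustration only: one uploaded video becomes many independently
    // addressed segments, one set per rendition of the ABR ladder.
    interface Rendition { name: string; width: number; height: number; kbps: number; }

    const ladder: Rendition[] = [
      { name: "1080p", width: 1920, height: 1080, kbps: 5000 },
      { name: "480p",  width: 854,  height: 480,  kbps: 1200 },
    ];

    // A 10-minute video cut into 4-second segments -> 150 segments per rendition.
    function segmentKeys(videoId: string, durationSec: number, segSec: number): string[] {
      const keys: string[] = [];
      for (const r of ladder) {
        for (let i = 0; i < Math.ceil(durationSec / segSec); i++) {
          keys.push(`${videoId}/${r.name}/seg_${i}.ts`); // what peers/caches actually trade
        }
      }
      return keys;
    }

    // A 480p viewer never requests any "1080p/..." key, so the popularity of
    // the FHD swarm does nothing for them.
    console.log(segmentKeys("vid123", 600, 4).length); // 300 distinct cacheable objects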

This is compounded by the fact that for video time to first byte doesn't matter very much. A decoder can't do anything with a single byte or even a partial stream segment. It's the time to locally buffer a decodable stream segment and a steady reception of segments as the play head reaches their time stamp.

If a segment gets held up (as happens with P2P) it can't be decoded so you either pause playback or drop frames (if you've got later segments available). You want the media segments to cover at minimum a single GOP (Group of Pictures) which is a relatively large I-frame and the following smaller P and B frames. If you were to tune the video segmentation such that each segment was some tiny time slice it would balloon the bitrate since you need way more I-frames. Super long GOPs don't help either because segment drops end up ruining playback until a new I-frame is received.
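Back-of-the-envelope version of that trade-off (the frame sizes and ratio are assumptions I've made up just to show the shape of it):

    // Toy model: every segment must start with an I-frame, and I-frames are
    // much bigger than P/B-frames. Assume 30 fps and I-frame ~= 10x a P-frame.
    const fps = 30;
    const pFrameKb = 8;   // assumed average P/B-frame size in kilobytes
    const iFrameKb = 80;  // assumed I-frame size (~10x)

    function kbpsForSegmentLength(segSec: number): number {
      const frames = fps * segSec;
      const kb = iFrameKb + (frames - 1) * pFrameKb; // one I-frame per segment
      return (kb * 8) / segSec;                      // kilobits per second
    }

    console.log(kbpsForSegmentLength(0.5)); // ~3072 kbps: tiny segments, bitrate balloons
    console.log(kbpsForSegmentLength(4));   // ~2064 kbps: typical segment length
    console.log(kbpsForSegmentLength(60));  // ~1930 kbps: huge GOPs are cheap, but one
                                            // lost segment ruins playback until the next I-frame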

Streaming video is hard enough with a reliable CDN; adding P2P makes it much harder and orders of magnitude less reliable.


> So your origin needs to be able to handle the tail and P2P can handle the popular stuff no? It isn't all or nothing. P2P could absolutely be used to peak shave. CDNs also have cold start issues with long tail.

No, it'll genuinely be worse, as you can't really have efficient P2P today.

Let's start with the points: I'm not talking about Europe or the US today. P2P vs CDN is a coin flip there, and I'll agree that P2P is as reliable as a CDN. I'll also agree that theoretically P2P and CDN will be a coin flip (performance-wise; I'm leaving monetary considerations out of this discussion).

Those points break down quickly in Latin America, Asia and Africa. I'll talk about Asia, as it is what I'm most familiar with, but the points also apply in Africa and Latin America. Outside of Jio and actual hosters/colos (meaning nearly all residential and mobile connections), IPv6 in Asia is actually more unreliable than you think (it should not be, but that's peering fights for you) and IPv4 is heavily CGNATted, so P2P is no dice. Even if somehow there were a unique IPv4 for every device, connectivity there is very top-down, unlike Europe's mesh-like connectivity.

How bad? It'll be better to route Telkom Indonesia's connections through America than to another hoster located in Indonesia or Singapore (the nearest regional hub), unless you're Akamai, which has agreed to buy colocation space inside Telin just to have good connectivity. And before you comment, OVH in Singapore has the precise issues we're discussing (https://lowendtalk.com/discussion/172659/ovh-routing-issues-...). Thus, it is miles better to use your time to build good CDNs with good connections (or spend money on Akamai) than to bother with the issues you'll face with P2P. Moreover, you can concentrate your money on buying a better data link (either by renting/IRU a dedicated wavelength or even fiber pairs) rather than relying on a best-effort service.

Funnily, even BitTorrent downloads in practice seem to be concentrated on a few seedboxes - even for purely legal downloads like GIMP.

Postscript: technically P2P and CDN together would be the best for reliability, but you're spending the time to do both (instead of dealing with a single thing) and dealing with the possible fallout (like Windows Update before Microsoft decided to limit P2P to local networks due to bandwidth and privacy concerns). Speaking of bandwidth concerns, I promised myself to limit monetary concerns just to point out that it's not good in actual deployment, but I forgot data caps (not just a problem in Africa but in the US too)! It'll be a disaster when P2P is not voluntary, because you just wasted their precious money!


So, your origin needs to handle the parts of the world with bad internet infrastructure, but can use P2P in Europe and the US, then?


Besides data caps on residential broadband, a non-trivial number of clients will be on mobile and have effectively zero ability to reseed content. To enable streaming from mobile devices requires a lot of server infrastructure to support hole poking and reflection.

Even with residential broadband without worries of data caps, there's still issues of firewalls and terrible upload speeds. Hole poking in residential firewalls like on mobile requires third party infrastructure. Most end users are not readily able to forward ports on their home routers. It's also not uncommon to see 10:1 downstream to upstream ratios. There's plenty of connections that couldn't manage to reliably upload a single FHD stream to a single client. It would take a hundred seeders with Comcastic upload speeds of 10Mbps to equal a single cheap VPS on a 1Gbps connection.
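Roughly, treating the uplink numbers above as assumptions:

    // How many residential seeders does it take to match one cheap VPS?
    // (Best case: every peer's uplink fully usable, nobody behind CGNAT.)
    const vpsUplinkMbps = 1000;   // 1 Gbps VPS
    const homeUplinkMbps = 10;    // "Comcastic" residential upload

    console.log(vpsUplinkMbps / homeUplinkMbps);  // 100 seeders to replace one VPS

    // And a single 25 Mbps 4K stream already needs a few such uplinks per viewer:
    console.log(Math.ceil(25 / homeUplinkMbps));  // 3 seeders per concurrent viewer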


US is out because data caps.

You can in Europe, but is it worth the dev time? It doesn't matter when monetary costs are second to principles, but it would be wrong to think that most companies are spending their time fancying P2P.


> Try to watch/download a torrent file with small number of seeders and you'll get the experience you will be providing for the end user.

I always wondered why we don't combine the ways the media is served into a hybrid: use P2P (e.g. BitTorrent) but also seed it yourself (as the hoster). Just run a seedbox of whatever capacity you want/can afford and let the audience help you by handling whatever share of the load they can. Wouldn't this save you some bandwidth while keeping the same service quality?
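As I understand it, this is roughly what BitTorrent "web seeds" (BEP 19) do: the torrent carries plain HTTP(S) URLs for the hoster's copy, and clients fall back to them whenever the swarm is thin. A rough sketch with WebTorrent, assuming its urlList option (the magnet link and origin URL are placeholders):

    // Rough sketch, assuming WebTorrent's `urlList` (BEP 19 web seed) option.
    // The hoster's HTTP server always has the file; peers just offload it.
    import WebTorrent from "webtorrent";

    const client = new WebTorrent();

    client.add("magnet:?xt=urn:btih:EXAMPLE_INFOHASH", {
      // Hypothetical origin URL: served even if zero peers are online.
      urlList: ["https://videos.example.com/my-video.mp4"],
    }, (torrent) => {
      torrent.on("done", () => console.log("downloaded via peers and/or web seed"));
    });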


> let the audience help you by handling a share of the load

So why should we help these for-profit corporations? We're supposed to just donate our bandwidth to them? Are we getting a discount in exchange or are they just gonna pocket their bandwidth cost savings at our expense?

Seeding internet archive data is one thing. Letting corporations invisibly take advantage of resources we pay for is a completely different matter.


Pass some of the savings on to the customer via lower prices and it's a win-win.

Alternatively, enabling P2P gives faster downloads. That’s rarely noticeable with streaming but people often want to download stuff for offline viewing.


> Pass some of the savings in to the customer via lower prices

Does that ever happen? They'll probably keep prices exactly the same.


If they have lower costs, some companies will exist that otherwise couldn't. Then you get options that wouldn't exist and more competition drives down prices.


It happens constantly in IT, look at CDN prices per GB over time for example.

It’s only the extreme prevalence of monopolies and free services that confuses the issue.


Archive.org seems to do something like that with its downloads via torrent: it appears as a web-seed for the torrents, and I usually snatch the files from those seeds faster than DHT seeds usually show up (which kinda undermines the idea, though).

Frankly, I never saw any other seeds on Archive's torrents—but one of those I snatched has a ratio of 3 now, so presumably it does work as intended.


Roku can probably provide this service, given they control the hardware and can provide an SDK for the apps to do it.


This is what peertube does. That's why you can watch videos normally despite most not being popular.


That sounds like a perfect match for internet content, which often comes in viral, temporary peaks. Ideally, in a P2P world, creators would care to keep at least one node running to serve the long tail.


I wonder if an asymmetric scheme would work best, even though it may be unbalanced? For example, for any one popular video you watch, while you watch it, you also have to download and seed a small part of an unpopular video, even if you don't see it?

Maybe bandwidth and storage caps would work better. You seed X MB, even if you don't need the full X yourself.


I also have experience building a CDN, though not for video, and this sounds right.

I wonder if this might be improved by an appointment system, to stream popular titles at pre-planned times so that people could benefit from others watching at the same time.

It would be like watching a broadcast television program during prime time.

I guess this has been tried before, but I've also never heard of it.


The clients that do plan in advance are more sensitive about quality of service than about price. There are QoS SLAs contractually required, and if they can't be met, clients will leave.


P2P with fallback to centralized would solve a lot of it.

Unfortunately P2P projects always try to be replacements for centralized rather than supplements, and almost none of them aside from BitTorrent are what I'd call great.


The other part is that CDNs result in less total network utilization. They limit most bulky traffic to the last mile.


It's not always so straightforward. You decrease your traffic costs, but you'll incur capex (servers, switches, connectivity) or operating costs (renting servers). Additionally, outside the EU, network contracts can be a hassle to sign because of market consolidation or outright oligopoly, so you'd need to commit capital to get bandwidth. It's an interesting optimisation problem, and sometimes you can route your traffic differently based on SNMP data, price per Mbps and route performance. The higher you pay, the better connectivity you get. But at off-peak times you can route traffic via a cheaper provider.


I don't know about other countries, but in China, almost all major video hosting companies (including both PGC and UGC contents) are using P2P technologies to save bandwidth.

Live streaming P2P is the easy one. Live streaming content is very skewed: the top 1% of streams can cover the majority of the bandwidth, and the top room can easily have tens of thousands of users, so P2P helps a lot.

Video-on-demand is harder. But now there are also many "seed boxes" on the market: basically a custom home router with a big disk. Users buy it and put it in their home as a regular wireless router, but in the background it automatically connects to the server, caches videos and serves them to other peers. The user may get some bonus from it (mostly digital points). Essentially, these companies are buying users' home internet as CDN edge nodes.

Either way, P2P is used to save some bandwidth (cost), but the performance will almost always be worse. There is always a traditional CDN as a fallback.

Some possible reasons why it's more popular in China: 1. there are lots of people here; 2. users don't care much about their privacy

(I worked in this area at one of the largest video hosting companies in China)


I'll add 3. connections tend to be symmetric, and even guaranteed for fixed connections (well, technically mobile internet is asymmetric by design, but it doesn't matter much because the uplink is still reasonably fast), and 4. government-mandated peering; before 2017 it would have been simply impossible because Telecom and Unicom (not sure about Mobile) didn't have good peering with each other back then.


Where I live in the US we have ADSL and to make peer-to-peer viable at all I have to put an upload cap on, otherwise the uplink gets saturated and downloads start having problems because the ACK packets get delayed or lost.

It drives me absolutely up the wall that some torrent networks insist that you have a 1-1 ratio because it is close to impossible for me and there is no point because often the network has so many seeds…. The fact that I can complete a transfer uploading just 5% of what I download proves my upload isn’t needed.


> some torrent networks insist that you have a 1-1 ratio

Yeah, it sucks. Ratio economies disproportionately favor users with good links. At the same time, it ignores the truly valuable user contribution which is decentralized redundant storage. Trackers should be trying to maximize the amount of copies available in the swarm, not bandwidth which literally doesn't matter past a certain point. What good is a torrent with zero seeds?


You should look into enabling fq_codel/cake or similar algorithms on your router if you have bufferbloat issues. You can also rent a cheap seedbox with a symmetrical connection if you are having issues seeding. Also in my experience most of the best trackers have a bonus point system that rewards long term seeding over ratio. BTN for example is completely ratioless.


Funnily, all Comcast-provided routers (look who sponsors bufferbloat.net :P) and the majority of Zyxel routers implement CoDel already. It's just that even with CoDel, it still sucks.


That's fascinating. How popular are seed boxes? And does the consumer get them from their internet provider directly?


There are several companies doing this business specifically: users buy seed boxes from them, and they sell the bandwidth to us video streaming companies. They are just like CDN providers, but instead of buying bandwidth and servers from a data center, they "buy" it from home users. I don't think ISPs like this, because it affects the business on their data center side.

IIRC the total usable bandwidth is about 10Tbps


> Why aren't we see many video hosting companies adopt peertube or P2P?

Case Study: Bit Torrent [0]

BitTorrent is a brilliant idea - it allows everyone who has part(s) of a file to contribute to the pool of availability so that no single central server/mirror is overwhelmed

And it has its place (eg in file sharing)

But for streaming? Not so much

Say you're getting the "next chunk" (whatever 'chunksize' is in this context) from me, and I go offline (it's the end of my day, need to reboot for updates, any of myriad reasons). Where does the next bit of the video come from in a way that is seamless to the viewer?

That is the fundamental problem of shared/p2p streaming protocols - every time the host of the current/next blob o' data goes offline, you need to waste time finding a replacement

Even if the replacement can be found "quickly", how do you ensure they don't go offline in the middle of streaming? How do you ensure "enough" copies of every chunk are distributed that it, effectively, 'doesn't matter' how many go offline [at once], it will still stream?

--------------

[0] https://en.wikipedia.org/wiki/BitTorrent


You get around that with long buffers. Like...several minutes long, rather than several seconds.

You still have a good chance of buffer underruns when you start a video, but that would likely get mitigated by the fact that the data for the start of the video would likely end up highly duplicated on the network.

The bigger issue is seeking. Jumping to the middle of a video could take many seconds before it played. The UX would be awful.


>You get around that with long buffers. Like...several minutes long, rather than several seconds.

That sounds horrible - now you've got to have something buffering [potentially] 100s of MBs (depending on quality) over lousy residential upload speeds?


Netflix in 4K is up to 25 Mbps, which would put a 5 minute buffer at about 1 GB.

That's perfectly reasonable on any desktop/laptop, and likely fine on any decent phone or tablet.

Still doesn't solve the instant seeking problem, of course.
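For anyone who wants to poke at the numbers (stream rate and buffer length as stated above):

    // 25 Mbps (megabits/s) buffered for 5 minutes:
    const mbps = 25;
    const bufferSec = 5 * 60;
    const gigabytes = (mbps * bufferSec) / 8 / 1000;  // bits -> megabytes -> GB (decimal)
    console.log(gigabytes); // ~0.94 GB, i.e. about 1 GB of RAM/disk per active stream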


It would take at least three seeds at Comcastic upload speeds (~10Mbps) to handle a single client at 25Mbps with a small buffer. You'd need a number of seeders proportional to the ratio of the stream rate to the per-seeder upload rate. Not every seeder will have every bit of content, or even every bit of the top 1% of content for a large corpus. Because a large portion of clients will be leechers unable to reseed (mobile, shitty residential routers, etc.), the network would need a huge number of seeders even for modestly popular content.


> everytime the host of the current/next blob o' data goes offline, you need to waste time finding a replacement

There is an algorithm in place where the rarest piece is downloaded first. A priority list of older peers is also implemented. Some protocols also request the same chunk from multiple peers (Kademlia).
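A toy version of rarest-first, for the curious (this is the textbook algorithm, not any specific client's code):

    // Textbook rarest-first piece selection (not any particular client's code).
    // availability[i] = how many connected peers have piece i.
    function pickNextPiece(availability: number[], have: boolean[]): number | null {
      let best: number | null = null;
      for (let i = 0; i < availability.length; i++) {
        if (have[i] || availability[i] === 0) continue;   // already have it / nobody has it
        if (best === null || availability[i] < availability[best]) best = i;
      }
      return best; // fetch the piece held by the fewest peers first
    }

    // For streaming you'd restrict this to a sliding window just ahead of the
    // play head, and (as the parent says) request critical pieces from several
    // peers at once so one peer dropping out doesn't stall playback.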


>Even if the replacement can be found "quickly", how do you ensure they don't go offline in the middle of streaming? How do you ensure "enough" copies of every chunk are distributed that it, effectively, 'doesn't matter' how many go offline [at once], it will still stream?

You don't need to _ensure_ all that stuff, in the strictest sense of the word. Network hiccups happen, nodes go down. As long as the frequency of interruptions scales inversely with the number of nodes in the network, once you're on a big enough network everything will work smoothly.


Wouldn't the "next chunk" problem be mitigated greatly by just maintaining larger buffer sizes? I know most client devices (such as Smart TVs and HDMI Dongles) don't have the memory capacity for it, but a 10 second long buffer would effectively mask a lot of those issues.

That being said, most residential internet packages have low upload speeds, which is an issue


A 1Mbps stream (high quality 480p or potato quality 720p) would need a 10Mbps burst to fill a 10 second buffer or wait some time before playback. It would need somewhere north of 1Mbps to keep the buffer full accounting for dropped packets, jitter, and seeds dropping out. If the stream drops below 1Mbps for longer than ten seconds then playback stops.

It all makes for a shitty user experience.


If I recall correctly, the famous 1% rule (not that one, but rather [1]) was the topic of a recent article (which I cannot now find but am hoping will randomly come into my memory at 2 AM and/or someone will correct me on) where the author gave some pretty convincing evidence that this kind of behaviour (the tendency for _most_ people on the internet, 90%, to only consume content) also extends to open source ecosystems. I believe Matrix was given as one example.

Essentially no one wants to host and run the infrastructure for these things, only the "diehards" do. This is akin to how only the "diehards" produce content on Wikipedia, or YouTube. Remember, this is relative to the volume of people who only consume that content.

It's the same in P2P e.g. Bittorrent. The count of seeders (those who have the content and are uploading it to peers who need to download it) is almost always lower than leechers (those who are only downloading).

This is exacerbated with Peertube because video files are big, require a lot of bandwidth, and someone has to pay for that. If there is a set of diehards who love running Matrix or IRC (think other smaller networks, not just Libera) or what have you, then how many of that already small set can meaningfully afford to run a Peertube instance capable of serving a momentous volume of video traffic?

I wish it wasn't so but it is.

[1]: https://en.wikipedia.org/wiki/1%25_rule_(Internet_culture)


>It's the same in P2P e.g. Bittorrent. The count of seeders (those who have the content and are uploading it to peers who need to download it) is almost always lower than leechers (those who are only downloading).

That's (at least partially) solved with incentives. Plenty of private torrent sites require you to keep a good ratio of download/upload to use them. As a kid I was definitely not a die-hard but still seeded to have access.


Yes, but that model only really works for piracy sites, and only ones with closed access[0], where you can maintain an internal ledger of who seeded what. On an actual streaming service where you have to pay for your TV shows or movies, the cost of video distribution will be dwarfed by the cost to license or produce that video. If you actually tried this on a premium video site you'd be charging $15/mo and then giving most people back a few pennies for their bandwidth.

Free-of-charge video might seem like a better fit, but that market was totally cornered by Google and YouTube; who basically have as much bandwidth as they could ever want. Yes, you can get around that with PeerTube, but there are so many other entirely unrelated advantages YouTube has over everyone else that this doesn't really confer much of a benefit.

[0] Interestingly, the FTP topsite scene would stick their nose up at even private BitTorrent trackers as not closed enough.


Yeah I remember those. One I used to use had a donation scheme where if you couldn't seed or didn't want to you could donate 10 Euros and 100 GiB of seed traffic would be applied to your account (just a number in a database not actual seeding) to keep that ratio up.

These people were _the_ tracker for the content in question so they were seeding a lot of content themselves hence the donations were a proxy.


>For me it looks like the hosting provider gets a huge savings in terms of bandwidth.

The constraint on wide adoption is literally the quantity of "peers" in both Peertube and peer-2-peer.

The underlying human incentives are not there for most people to host a peer node. This limitation applies to all types of digital domains including videos, or files (IPFS or bittorrent), or crypto (Bitcoin/Ethereum peer node).

Let's follow the trail of incentives for one Peertube node:

- see list of Peertube instances: https://joinpeertube.org/instances#instances-list

- I pick the 2nd one on the list: https://the.jokertv.eu/about/instance

- in that instance's "About" page, it says: How we will pay for keeping our instance running -- personal funds. If you feel like donating, put up your own instance instead and host some creators you find interesting.

Companies can't build a Youtube/TikTok competitor based on examples like that. The same economic incentives limit the quantity of IPFS peers, which means businesses can't use it to replace Cloudflare CDN or AWS S3 buckets.

It should be understandable why most people aren't willing to spend personal funds on hosting home nodes so businesses can freeload off of p2p and save money on bandwidth.


That list of instances is randomly ordered, it isn't like biggest first or anything.

I don't really see how the fact that there is someone who hosts videos using personal funds proves that it isn't possible for a professional company to host videos using the same technology. It proves Peertube does work at small scale, and says "not much" about how it would handle large scale with a real budget.


The human incentives are certainly there, it’s the copyright and IP industry that does not want this to come to pass. There was a time when you could find obscure things on torrents with seeders, but that was shut down by the IP rent seeking industry.


I think you're overestimating how strong that incentive is. There's a lot that I'd change about the current copyright model but it's not just rent-seeking keeping P2P from mainstream use: most people do not have high uplink capacity and congestion is a real issue. Companies like Netflix have built a consistent user experience where video plays quickly and smoothly, and the price is low enough that most people aren't jumping for alternatives — I would bet that if you surveyed the average person the biggest copyright reform they'd want is blocking exclusivity so they could use a single service for everything.

From a business perspective, a big problem is trust and safety. Torrents were also notorious for having mislabeled content or malware, and users are not going to use a system which serves arbitrary content from their IPs, so you have challenges keeping the experience good while building a large enough peer network to provide a competitive experience.


> I would bet that if you surveyed the average person the biggest copyright reform they’d want is blocking exclusivity so they could use a single service for everything.

This. While there was definitely piracy before the internet, most didn't bother because it played fine on VHS/DVD. As Gabe Newell stated, the problem of piracy is a service problem.


And also lack of upload capacity for 90% of people due to the lack of symmetric fiber availability.


Isn’t that a result of the push to cable-tvize the internet? Symmetric high bandwidth internet became commonplace during the height of the P2P downloading era where I live. Because the market was there.


I presume it is just a consequence of the enormous costs to build out a fiber network that would yield uncompetitive prices compared to the internet people are willing to accept from their coaxial cable internet providers.

A combination of people not demanding it enough to pay sufficiently to get neighborhoods wired with fiber, and coaxial cable companies having a disincentive to allocate more of their wire capacity to internet bandwidth, since they want to avoid becoming a dumb pipe.

The only way out is taxpayer subsidized build out of fiber operated like a utility, such as the system in Chattanooga Tennessee.


Having previously looked at peertube, the problem is it's hard to make money out of it -- because of the p2p technology, it's hard to have control of your videos, put adverts in, track users, etc.

While peertube saves one big pile of money (the streaming costs), it makes it significantly harder to make any money. Videos are expensive to make, so generally creators want a wide audience, and ways to monetise that audience.


If creators moved to PeerTube they probably could still make money with sponsored content and donations like they already do on YouTube.

Of course, because of network effects and YouTube ad revenue there is little incentive to do so except for idealists and people who get banned on YouTube.


If you can't track your users/views/etc, it's hard to sell sponsors on the idea that your video is worth sponsoring.


Peertube is the opposite of centralized video hosting; it's a federation of nodes.

A video hosting company needs to hold all the videos hostage so they can make money somehow, like via ads. Plus the ability to control content: the videos themselves (e.g. be able to censor or delete them), and other content like user accounts, comments and whatnot.

This is like asking, why doesn't Reddit just operate as a Usenet gateway with a Web UI?


> Peertube is the opposite of centralized video hosting; it's a federation of nodes.

Like a CDN? :)


While you're correct that in this way a CDN is similar to peer-federated systems, this is also a low-effort trite response that misses about 70% of the differences between the two.

The main difference being, of course, that CDNs provide a delivery and storage service which you pay for, meaning the content remains yours, the circumstances under which it is viewed are under your control, and you're simply paying for them to deliver it for you. With peer-based systems, those two key points are no longer true.


The short answer is that geographical resource balancing isn't federation, even though both involve multiple hosts.


Geographical distribution is irrelevant if there is only a single gateway to the content.

Then the benefits of distribution are largely only realized by the owner.


> Why aren't we see many video hosting companies adopt peertube or P2P?

For the same reason we don't see many video hosting companies, period: YouTube. It eats the entire space for reasons that are entirely unrelated to the pros and cons of p2p. For hypothetical competitors that means solving their biggest issue – gaining users over YouTube – can also not be done by going p2p (but it might of course still be part of their hypothetical tech stack for other reasons).


It’s strange that Youtube is so entrenched that even Russia could not muster its own alternative.


Yep, this is a stronger factor than delivery mechanism questions.


A simple problem with adopting P2P streaming of videos is more logistical than technical. Often a YouTube video or Twitch stream has a sponsor who is paying money for it, and those videos need some form of analytics or telemetry to show just how many people are watching. If I host the video and you host the video and Bob hosts the video, how am I going to get the view details so that I can calculate the CPM and get paid by my sponsor?

The other logistical problem is that these instances just aren't where the viewers are. YouTube is a giant behemoth because it built a place where people can upload their own videos and viewers have a centralized place to watch them. It became so much simpler to tell someone to search for a channel on YT than to give them a web address, have them install a new client (if necessary), and then pray that there are enough seeders so the viewer can watch. And to do that for all twenty (more or less) channels I subscribe to on YT? No thanks.

One more logistical problem is more on the technical end of things, but it's still a problem. Let's say that we do make a good go of P2P streaming. By its own nature, you need to store videos on the device to watch. My FireTV Stick doesn't have that much disk space, and the processors on many streaming devices aren't exactly powerful, so running a node in the background isn't in the cards for these devices. One could argue that we could just have a small NAS on our network that could do all the heavy lifting and send the videos to the streaming device, but have you ever told someone who isn't technically minded that they need to do that to watch your cooking videos? I work in tech sales, and it's a nightmare to tell some people that they need to install software to set up their printers. I'm not going to tell them to set up an additional device for their streaming.
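For the analytics point specifically, delivery and measurement don't have to travel the same path; something like the sketch below could report views centrally even when segments arrive from peers (the endpoint and payload are invented for illustration). It doesn't fix the trust problem, though, since self-reported beacons are easy to spoof or block, which is part of why sponsors prefer platforms that control the whole pipeline.

    // Sketch only: segments may come from peers, but playback events still go
    // to a small central endpoint the creator/sponsor trusts. Endpoint invented.
    async function reportView(videoId: string, secondsWatched: number): Promise<void> {
      await fetch("https://analytics.example.com/view", {   // hypothetical endpoint
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify({ videoId, secondsWatched, ts: Date.now() }),
      });
    }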

My two cents.


I am working on a point to point media tool right now and it’s super challenging to get right. There is no server. There are just peers.

Client/server is very easy, because there is a centralized common point of access that manages everything. In a serverless model you have to connect via identity that has no fixed address.


Why is a mixed model not used? Where the server exists and serves both the meta-data and acts as a seeder with support from other users? This seems to be the best of both worlds?


Because that requires creation of new original technology. Most developers struggle to deploy REST using a framework that does all the work for them.


It doesn't provide enough guarantees to give a good experience.

Basically it requires enough people with enough bandwidth to stream data to you. Not only that but those peers need to ideally be near you so that they can react quick enough to overcome packet loss.

So you have a choice as a provider: maintain a fast autoscaling peer list (expensive), or force long pre-caching to make up for uncertain peer reliability (bad experience and/or expensive).

P2P is not a good platform for providing realtime high-bandwidth services unless there is a strong incentive to keep your peer running for as long as possible. With video, you're going to close it as soon as you've finished watching. For an hour-long movie, that's fine; for a 30-second clip, it's terrible.


I have thought a bit about this myself (although others here seem to know a lot more) so I'll give my 2 cents:

- It already works without P2P: YouTube has worked pretty well for 15+ years as client-server. If you have Google scale, it wouldn't make sense not to use it.

- Installing software is harder onboarding than opening a website (I think you'd need more than a browser for this?)

- Security implications of P2P?

- The biggest one for me: ignoring internet technology concerns, it's an extremely hard distributed systems problem, with Byzantine concerns, nodes coming and going at any time, and the question of where to store unpopular videos.

I hadn't even thought about the telemetry concerns (how do you sell ads efficiently?).


I've been putting some stuff on PeerTube to see how well it holds up. (Utterly boring Metaverse client content downloading performance test.[1]) I use it mostly for videos linked from forums, so there's a separation between finding and streaming. So far, no problems. The question is how long such content will stay up.

No ads. That's the big advantage over YouTube now. And it beats paying Vimeo to host obscure technical videos.

[1] https://video.hardlimit.com/w/qBGD9LF8Ua3T7gLPCkE6vw


A vast portion of these comment threads are full of people blatantly ignoring two things:

1) The US and vast swaths of the world have metered internet connections. The benefits of P2P for your bottom line will be short lived as customers abandon you for bloating their data usage without their permission.

2) Those lower costs will never be passed down to consumers by companies. I don’t know how or why you seem to think this time will be different.

I’m inclined to feel there’s some Web3 Not-Quite-Astro-Turfing going on here. The refusal to see either of those terribly obvious points feels super fake.


This has been tried. The first incarnation of BBC iPlayer was P2P, and I think there may still be a desktop version that allows you to save shows offline for 30 days that works via P2P.

The problem is that in most of the world, consumer internet connections are highly asymmetric, and people have terrible uplinks. Thus there often won't be enough bandwidth available for streaming, and furthermore those providing the bandwidth will find their general browsing experience degraded due to uplink saturation.


> The problem is that in most of the world, consumer internet connections are highly asymmetric, and people have terrible uplinks

I am wondering how did we end up with this situation. Is there a TL;DR?


Just guessing, but maybe it's the natural consequence of asymmetric last-mile internet technologies (ADSL/cable) where you _must_ allocate separate bands for up/down. This is even true for LTE, I think.


This, and most users are just that, users, and not producers, so they value/pay for high download bandwidth and mostly don't care about upload. Upload bandwidth costs money, and not many will pay for it, so we get what we pay for.

This is changing over time as the aggregate upstream load from mostly passive users nevertheless becomes greater than a video upload stream, e.g. because people are video calling and gaming, etc., so presumably we'll reach a tipping point where enough people have the excess uplink to support a growing set of always-cached content. I think this is the long game for distributed filesystems, e.g. IPFS.


TL;DR: It was the only way to get broadband downlink speeds over copper telephone lines.


Because companies don't need to and scale eats everything. Unfortunately, we're in such a winner-take-all environment that anyone hoping to compete with YouTube is starting at a huge loss.

If Peertube takes off (and I hope it does) it would likely be on the back of something that's expressly anti-status-quo. (Which doesn't mean it couldn't be a company per se, but I feel like would have to wear "We are the anti-Youtube" on its sleeve.)


Don't forget about TikTok


Excellent point. I'm old and I forget that Tiktok is massively reworking what "video" means to the world in general.


I haven't been able to watch a single video on PeerTube without massive buffering issues. That's why (for me).


The average streaming client device cannot store enough video to meaningfully participate in peer to peer.

The average internet connection (at least in the US) doesn't have enough upload bandwidth modulo level of service to meaningfully participate in streaming peer to peer.

Watching video on a mobile phone is probably the median streaming use case.


VUDU used to do this with their hardware box - you’d buy a movie to stream and would download it from them + peers that also had it already, and then you’d be seeding it as well.

I couldn’t tell from a quick Google if the modern VUDU service still does anything P2P, but the service is still around running on all sorts of platforms.


Not an expert on networking infrastructure. But I believe this is due to scalability bottlenecks, like Bandwidth.

P2P infrastructure is usually run on hardware designed for personal use.

I think to see real adoption of P2P technology, one has to make it work better and cheaper than the current existing cloud infrastructure.


Aside from the technical discussions there is a legal thing: If peers upload the stream they might be liable for copyright issues. Downloading often has different legal impact from uploading and judging legality of stream content is hard.


My internet is a gigabit down and like 30mbit (or maybe 15mb?) up. I really don’t want that upload bandwidth being used by some p2p thing unless it is a torrent seed I specifically allow.

I would never want some random hypothetical YouTube / peertube / Spotify users hogging my precious upload bandwidth. I need that tiny slice of upload for backups, conference calls, and gaming.

My friend, however, lived in an area that offered fully symmetrical gigabit fiber internet. Dirt cheap too ($60/mo). We’d use his Plex server all the time to stream HD or even 4K content right from a PC in his office. It was awesome!


It has nothing to do with the technical problems (bandwidth, etc). It has everything to do with the legal problems of video hosting and the legal liabilities it sets up. This problem of p2p video streaming has been successfully technically solved a dozen different times. Whenever these solutions become popular enough they are attacked legally and decline.


There are some non-trivial technical problems:

1. You need to have a complete copy of the collection _somewhere_ but P2P only benefits you if most requests can be served from copies close to the viewer. That's a problem if you have a large collection with a long tail (which is true in most cases).

2. P2P requires a fair amount of over-provisioning because most users don't have high uplink capacity, especially if you're avoiding the congestion which will cause people to stop using the service.

3. P2P nodes are unreliable & have limited capacity, so you have to maintain a large list and update it frequently. For a popular movie, that's probably going to work out okay since most people will leave the player open for a couple of hours, but for shorter or less popular content that's likely to mean a lot of traffic going back to the seed servers, or ending up slower going through that one guy in Moldova hosting some obscure video.

Consider how the open-source community has no legal considerations but most downloads of Linux ISOs, packages, etc. use HTTP from CDNs or mirrors because it's faster and more reliable. This problem is not impossible to solve but there's more to it than copyright and other legal considerations.


6 years ago, I used to work at a P2P video CDN company, which has since been bought by Level 3 (now Lumen), and is now part of their product offering[1].

Using p2p to offset bandwidth cost is a really cool idea, but it doesn't come without difficulties:

- WebRTC doesn't work everywhere: for this kind of thing you really don't want to use a TURN server, and only work with true P2P. This means you can't use it for users behind a symmetric NAT.

- `libwebrtc` (Google's implementation, used by Chrome and derivatives and also by Firefox) performs very poorly when there's a big number of open connections (I don't remember why, but you couldn't expect to maintain more than a dozen connections on a laptop before having CPU load issues and dropped frames. This is probably an implementation issue, but Chrome's team was uninterested in investigating it). This means you can only be connected to a small pool of peers at any given time.

- Probably related to the previous point, it drains a lot of battery on mobile devices.

- Adaptive Bit Rate makes things complex, since the user will switch tracks at random points, meaning they will need to be grouped with a different pool of peers (since you cannot maintain a big group of peers from different tracks at all times).

- It doesn't work that well on VoD: for new videos gathering many people at the same time it works really well, but for the long tail of old videos you're often the only one watching at any given time. Unless you're YouTube scale, indeed.

- It works better for live streaming, since everyone is indeed watching the same thing, but to maximize P2P efficiency you have to add some latency (to have a bigger video buffer to work with), and this isn't acceptable in every situation (sports event broadcasters don't like that at all, for instance).

- To work well (especially regarding ABR and live streaming) your system needs to be quite tightly integrated with the video player; polyfilling XHR/fetch with your own library isn't good enough (our competitors were doing so, and their product was less efficient for that reason). And surprisingly enough, there are (or at least there were) a ton of custom video players: many companies forked dash.js or hls.js and customized it, sometimes quite heavily. (See the rough sketch at the end of this comment.)

- there's a serious privacy issue: the peers you're connected to know what video you're watching right now, and can identify you thanks to your IP address. Maybe this isn't too big of a deal when watching mainstream stuff, but for things like porn it can be a bit touchy…

[1]: https://www.lumen.com/en-us/edge-computing/mesh-delivery.htm...
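For what it's worth, the "tight player integration" point usually ends up looking something like this: hook the player's segment loader, try the swarm first, fall back to HTTP. A rough sketch against hls.js's pluggable fragment loader; the fetchFromPeers function is hypothetical and the callback shapes are from memory, so treat it as pseudo-real rather than a drop-in implementation:

    // Rough sketch: P2P-first fragment loading with HTTP fallback via hls.js's
    // fLoader hook. fetchFromPeers() is a hypothetical swarm lookup.
    import Hls from "hls.js";

    declare function fetchFromPeers(url: string, timeoutMs: number): Promise<ArrayBuffer | null>;

    class P2PFragmentLoader extends (Hls.DefaultConfig.loader as any) {
      load(context: any, config: any, callbacks: any): void {
        fetchFromPeers(context.url, 1500).then((data) => {
          if (data) {
            // A peer had the segment in time: hand it to the player directly.
            callbacks.onSuccess({ url: context.url, data }, {}, context, null);
          } else {
            // Nobody had it (long tail, ABR switch, peers left): plain HTTP.
            super.load(context, config, callbacks);
          }
        });
      }
    }

    const hls = new Hls({ fLoader: P2PFragmentLoader as any });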


hello :)


Bitchute doesn't use P2P, but it's a HUGE part of their marketing. Their target demographic is technically illiterate so don't know any different, which really actually is a huge problem in general.

How many people fall for the latest buzzwords (like 'web3') without having the first clue what it is? Even investors aren't immune.


Here's a short version:

Joost, backed by the founders of Kazaa & Skype, tried this starting in 2007. It failed, YouTube and other streaming approaches won.

Some of that was content, but some of that was that bandwidth became cheap enough.

https://en.wikipedia.org/wiki/Joost


Asymmetric network connections are common and I imagine ISP networks are also not designed to process large amounts of peer to peer data. So besides the other issues, if this would ever become popular I'd guess the networks would struggle and ISPs would probably raise their prices.


True. But the increased use of video conferencing from home should add pressure to improve this situation.


Makes sense, and it definitely puts some pressure on ISPs for outgoing bandwidth. However, the bandwidth of a [put-your-favourite-videoconferencing-tool-here] upstream video feed is rarely as high as an HQ video streamed from a video streaming service...


Yes, until the entire family is in a video conference at the same time.


Maybe for companies it's impossible because PeerTube and the others have no DRM systems.

But public organizations such as the BBC (in the UK), RTVE (in Spain), France Télévisions and others should use this kind of technology, because it is good for open data, saves public money, and has other benefits.


Plenty of good answers in this thread, but the primary reason is that bandwidth is really cheap. There's no reason to risk losing control of the most crucial part of your product offering.


I would say P2P always carries that stigma about copyright stuff, but at the same time there are most certainly business opportunities in that field.


People are always inventing new ways to create a decentralized internet, while the answer really lies in just increasing upload speeds.


There's a whole host of reasons why you won't want to ever use P2P if you can avoid it, but there's one in particular I'd like to spotlight: privacy.

P2P is a privacy nightmare, by design. You are asking everyone who wants to watch your video to also host it, which means that everyone watching the video gets to know the IP address of everyone else who watches that video.

Back in the early days of casual online piracy, music companies were happy to be able to sue service operators like Napster and get them shut down. However, when P2P services evolved to distributed-everything, it made it a lot harder to do that[0] since the only thing the service operators did was provide software to connect to their particular swarm protocol.

So they just joined the swarm, downloaded their own music, and then sued anyone who sent it back to them.

Now, imagine all of the copyright claim and takedown fraud that happens on YouTube, except instead of censoring one creator's video, they start suing the people who watched the video. Yeah, no thanks. Centralized video services have many problems, but legal liability on individual users for using the service as intended is not one of them.

Bonus points: given recent GDPR rulings on data exports[1] I would almost surely argue that any P2P swarm violates GDPR, because it turns every viewer of the video into a GDPR data controller, and any US peer in the swarm would constitute a GDPR data export into a privacy-hostile country.

[0] Grokster was sued on an "inducement" theory of liability that SCOTUS pinched from patent law, but that relies on the conduct of how the service operators and software providers advertised themselves to users.

[1] It is illegal to use Google Fonts on an EU website because of the US CLOUD Act and the fact that any subresource provider gets your IP address when you visit a website they service.


Honestly I’d be using it but I still don’t know the GDPR situation with p2p






Search: