The problem with P2P is that you have to make what you have public. That’s fine for a game update where it’s all the same, but for movies there are huge privacy implications.
* It's slower to deliver video since you're reliant on upload connections of end users
* It's less reliable (same reason as above)
* You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first)
* You cannot send multiple different video qualities in the same stream and have the client dynamically pick the right bitrate for their connection (this is what current streaming services do and why you often see Netflix / Prime video / etc switch quality mid-stream without having to restart that stream. I can write more on this if people are interested)
* It's harder to debug network problems if you are experiencing issues with video quality (been in enough stressful emergency meetings with network guys over the years - I can't imagine how much worse it might have been if we couldn't do a full end to end trace of the delivery)
* Time to start playing a new stream is longer (which end users might care about)
* It couldn't support live services where the video data is being generated near real time
The current methods for video delivery are actually really good; BitTorrent would be a major step backwards. However, for delivering other kinds of assets - such as patches to computer games - protocols conceptually similar to BitTorrent are used.
Doesn't really matter, it is fast enough. If it isn't, you back it up from a datacenter.
> It's less reliable (same reason as above)
Shouldn't be any real difference in reliability.
> You cannot send multiple different video qualities in the same stream and have the client dynamically pick the right bitrate for their connection (this is what current streaming services do and why you often see Netflix / Prime video / etc switch quality mid-stream without having to restart that stream. I can write more on this if people are interested)
Sure you can. You only have to have a small map between play time and file offset for the different streams, the client will then pick whatever it wants.
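A minimal sketch of the index the parent describes, with hypothetical chunk sizes: each quality level keeps a sorted list of (play time, byte offset) pairs into its own file, and the client looks up the same play time in whichever stream's map it currently wants.

```python
import bisect

# Hypothetical index: for each quality level, a sorted list of
# (play_time_seconds, byte_offset) pairs into that quality's file.
index = {
    "480p":  [(0, 0), (15, 1_200_000), (30, 2_500_000)],
    "1080p": [(0, 0), (15, 5_000_000), (30, 10_300_000)],
}

def offset_for(quality, play_time):
    """Find the byte offset of the chunk covering play_time."""
    times = [t for t, _ in index[quality]]
    i = bisect.bisect_right(times, play_time) - 1
    return index[quality][i][1]

# A client switching quality mid-stream just looks up the same
# play time in a different stream's map.
print(offset_for("1080p", 20))  # → 5000000
```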
> You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first)
The client can decide this. There are torrent clients that do this already. Buffer 2 minutes, if some chunk is missing when 20 seconds remains pull it from a datacenter instead.
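The fallback policy above can be sketched in a few lines (all thresholds hypothetical): fill the buffer from peers, but if a chunk is still missing when playback gets within a deadline of needing it, stop waiting on the swarm and pull it from the datacenter.

```python
BUFFER_SECONDS = 120     # target buffer, as in the example above
FALLBACK_DEADLINE = 20   # seconds of headroom left before we stop waiting on peers

def source_for_chunk(chunk_start, play_head, have_chunk):
    """Decide where the next chunk should come from."""
    if have_chunk:
        return "buffer"
    seconds_until_needed = chunk_start - play_head
    if seconds_until_needed <= FALLBACK_DEADLINE:
        return "cdn"    # too close to the deadline: pull from datacenter
    return "peers"      # still time to wait for the swarm

print(source_for_chunk(chunk_start=30, play_head=15, have_chunk=False))  # → cdn
```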
> Time to start playing a new stream is longer (which end users might care about)
No, you start streaming from the datacenter.
> It couldn't support live services where the video data is being generated near real time
Not sure, but if a minute is acceptable delay (depends on what is being broadcasted) it should be feasible. A live webcam for a tourist resort should be fine, a sports event, maybe not.
Spotify even used to work this way. When a user clicked play (or seeked to a different part of the song) the first (if I remember correctly) 15 seconds was fetched from their CDN. After that it used its own torrent-like system to continue and pre-fetch the next song.
Sorry but no, it really isn't. You'd have to rely so much on your own data centre that you'd lose the benefit of BitTorrent. Plus many services use a commercial CDN, none of which currently support BitTorrent. So you'd end up having to build your own infrastructure there, which would be a great deal more expensive.
> Shouldn't be any real difference in reliability.
Home connections might drop off due to power cuts, router failure or any of the other numerous conditions that datacenters battle against. Home connections might get throttled by ISPs.
Where I work we offer 5 "9s" of reliability on some services, good luck asking consumer broadband to offer the same ;)
> The client can decide this. There are torrent clients that do this already. Buffer 2 minutes, if some chunk is missing when 20 seconds remains pull it from a datacenter instead.
2 minutes is an excessively long buffer compared to how long most RTMP segments are (often 15 seconds or below).
> No, you start streaming from the datacenter.
So then why bother with bittorrent at all?
> Not sure, but if a minute is acceptable delay (depends on what is being broadcasted) it should be feasible. A live webcam for a tourist resort should be fine, a sports event, maybe not.
A minute isn't even close to an acceptable delay. Imagine if you're watching a sports match and Twitter is ahead of your video feed; you wouldn't be grateful for the spoilers. This is a real-world problem with video streaming and why they talk about getting the latency down to 5 seconds or less.
> Spotify even used to work this way. When a user clicked play (or seeked to a different part of the song) the first (if I remember correctly) 15 seconds was fetched from their CDN. After that it used its own torrent-like system to continue and pre-fetch the next song.
Spotify is audio only. People love to compare video streaming to audio streaming but they don't realise that HD video is an order of magnitude more complex to stream - in terms of bandwidth, syncing, dropped frames, etc.
It's one of those problems that seems easy to solve from a superficial level but when you start getting to the broadcaster level it's actually a great deal more complex than even Plex and other self-hosted streaming services would lead you to believe. (Disclaimer, I've worked at the broadcaster level)
I'm not saying there won't be problems. A major problem is the asymmetric nature of many consumer connections. Not only is the upload often a fraction of the download (that is the easy part), but the download speed can be greatly sacrificed if the upload is utilized. Add to that the fact that home users might want to use the connection for other stuff.
The nightmare and confusion surrounding "I can't game because someone is watching torrent-tube" will be real and add to that issues with ISPs that have a fixed quota or people on mobile (or tethering a mobile connection to laptop). Netflix et al would not like to deal with that kind of FUD spreading around.
And all this is before considering local ISP bottlenecks as it isn't what the network was designed for. A vastly superior option is to put a proxy on the ISP network itself (but yes, hard to do for small players).
Those alone are probably a magnitude worse than the technical issues you speak of. Even Spotify got much flak for it, and no, I don't think anyone on earth thinks the bandwidth requirements of audio and HD video are similar.
Like I said, it's ever so easy being an armchair critic but try deploying this stuff at scale with SLAs covering 5 "9s" and paying customers then tell me you've got all the kinks in bittorrent solved ;)
The end will only be downloaded if the client requests it, which won't be unless the user is watching the credits...
But as you're clearly an expert in video delivery, you should build a torrent-based video delivery platform. You could make a killing. Or alternatively you might discover these opinions you were dismissing might have had a point, and that video delivery is actually a lot harder than you first assumed.
This is where RTMP streams really shine - they offer the performance (at the client level) and flexibility (quick swapping between bitrates within 15 seconds of chunked video) to maximize the video quality per client.
As for why Netflix doesn't look so great on your connection, I can only speculate, but the reasons might come down to congestion on your local Netflix cache server or your ISP itself rather than RTMP streams being inferior to BitTorrent.
I can understand what you're talking about, but I think it's important to appreciate that as an end user I don't really care about what tech words I can tell myself about why things look bad. I care that one thing looks bad and the other does not.
My point is quite simply that RTMP is already a better protocol for video delivery than BT.
Yes, Popcorn Time exists and in some situations it is comparable to Netflix, but not in all situations. In fact not even in most situations.
BT wouldn't allow you to deliver to other providers such as YouTube or Twitch (this is something I was working on last week with RTMP streams). BT wouldn't allow you to dynamically inject ads separately from the video feed (this means you'd need to encode your adverts into the video file, so you couldn't charge different rates for different days or viewing times - which is a deal breaker for most broadcasters). BT wouldn't work for live feeds (so it would be useless for sporting events - which generate a large chunk of broadcasting revenue). BT is noisier than RTMP at the ISP level (ie it actually costs ISPs more bandwidth, not less - and given how much many ISPs already complain about Netflix et al, the last thing you want is to expect those ISPs to give Netflix free bandwidth in the form of user seeding).
As you can see the point I was making right from the start isn't that you cannot stream video via BT but rather that there are already better protocols for video delivery than BitTorrent. Hence why professional video delivery platforms don't use BT.
I suspect the issue many commenters here have is that they assume traditional video delivery is still a classic straight stream of data like the old days, like people are familiar with when downloading via HTTP or FTP. But that's simply not how RTMP works. Modern video feeds are actually chunked just like BT, except that they're fed from a CDN rather than peered from end users.
Never stated I was an expert in torrents or broadcasting. But even someone who has barely used BitTorrent can counter most of your arguments. As I've mentioned, there are tons of issues (and as mentioned elsewhere in the thread, privacy alone would make it a non-starter for many). I don't believe in it for large scale, but not because of the technical issues you presented.
Apologies if I came off too harsh, been a rough couple of days.
I honestly think privacy is the least of the problems. A bigger issue is just getting content owners on board to begin with
Edit: sorry to read you've had a rough few days. Hope things improve soon
(Someone already mentioned this, but seriously - this has been done - it works really well - as a polished commercial product it would probably work even better: https://en.wikipedia.org/wiki/Popcorn_Time)
The fact that you've acknowledged that it's not as polished as a commercial product should be a big hint that perhaps commercial products choose RTMP over BT for a valid reason. Perhaps even for the reasons you initially dismissed as nonsense. Perhaps you might need to read up a little more on how professional video deployment actually works before assuming you understand it better than all those engineers who do this shit for a living. Just because something conceptually works doesn't mean it's any good when dealing with the expectations of paying customers who might want SLAs of 99.999% uptime, low-latency live services (less than 10 seconds), near-zero buffering times, etc.
The Dunning–Kruger effect is overwhelming in this thread but trust me when I say video engineers are not stupid people and if their lives could be made easier by using BT then we would definitely be seeing commercial uses of BT for video deployment.
> You'd have to rely so much on your own data centre
Not sure if we are talking 15-second gifs or full-length films here. The thread started with Netflix as an example and that's the context I've assumed. The startup cost on a full-length film or even a series episode is barely a fraction in that context, so I really don't get your comment here.
But I would go even further and expect constant backing from your datacenter to help with distribution, depending on how incentivized people are to keep the client alive after they've seen something. I would also not - both for network performance and out of common decency and courtesy - try to squeeze everything out of my customers' internet connections.
> Home connections might drop off due to power cuts, router failure or any of the other numerous conditions that datacenters battle against. Home connections might get throttled by ISPs.
None of that applies here; a huge point of BitTorrent is to be resilient against exactly that, so it doesn't make any sense to bring it up. A datacenter is centralized, so it needs to deal with it. Thousands of clients spread throughout the world will likely have even better reliability than a datacenter. All in all, no reliability lost. Throttling by ISPs is a real issue, and I highlighted it earlier.
> 2 minutes is an excessively long buffer compared to how long most RTMP segments are (often 15 seconds or below).
I just made that number up, but yes, a torrent solution will need a larger buffer. That really isn't a problem in this context. What is a much bigger problem is the storage needed for seeding, depending on how you want to solve that and how reliant you want to be on your clients (again, your point doesn't make sense - why complain about the buffer when the cache is orders of magnitude worse?).
> So then why bother with bittorrent at all?
I wouldn't, and I never said I would. But the protocol would work for it; consumer ISPs and other factors might not, for such a large-scale operation. And the privacy implications are also bad: it would leak information to your competitors as well. Bandwidth costs clearly aren't that big of a deal either, so the potential gain isn't worth it (talking Netflix scale here). Also, proxies at ISPs are a superior solution.
> A minute isn't even close to acceptable delay[...]
You can not possibly have read what I wrote.
> A bigger issue is just getting content owners on board to begin with
Content owners are notoriously irrational so I wouldn't argue against that. But torrents could still be encrypted.
Basically the protocol consists of multiple streams at various bitrates. These streams are chunked into blocks of n seconds - often around 15 seconds or less. The client will hot-swap between streams if a chunk takes too long or downloads too quickly. This ensures that the video quality dynamically scales to the best bitrate your bandwidth can handle, and it does so quickly and with minimal buffering. It works for video-on-demand services, but it also works for live feeds, and the whole thing can be served over multicast or UDP for further savings at the DC or across the wire.
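The hot-swap logic described above can be sketched as a simple throughput rule (the bitrate ladder and safety margin here are hypothetical): after each chunk, measure how fast it arrived and pick the highest rendition that throughput can sustain.

```python
# Hypothetical bitrate ladder (kbps) and safety margin for an
# adaptive-bitrate client like the one described above.
BITRATES_KBPS = [800, 1500, 3000, 6000]
SAFETY = 0.8  # only commit to 80% of measured throughput

def next_bitrate(last_chunk_bits, download_seconds):
    """Pick the rendition for the next chunk from the last chunk's speed."""
    throughput_kbps = last_chunk_bits / download_seconds / 1000
    usable = throughput_kbps * SAFETY
    candidates = [b for b in BITRATES_KBPS if b <= usable]
    return candidates[-1] if candidates else BITRATES_KBPS[0]

# A ~45 Mbit chunk downloaded in 9 s → 5000 kbps measured,
# 4000 kbps usable → pick the 3000 kbps rendition.
print(next_bitrate(45_000_000, 9))  # → 3000
```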
BitTorrent is nowhere near optimised enough to compete on that level. The 2-minute buffering you guessed at would easily look terrible in side-by-side comparisons with RTMP deliveries. Not to mention a whole boatload of hardware encoders (rack-mountable gear used by broadcasters) are built for spitting out RTMP streams, yet none (at least that I'm aware of) exist for BT. So you'd have to build your own gear as well as a content delivery network. That alone would cost you far more than any savings P2P deployment might earn you.
If you have 2 minutes of buffer but realize you can't keep it up you can still switch to a lower bitrate, that will allow you to get any missing chunks and maintain the buffer going forward. So you can still adapt (from the networks point of view much quicker than the 2 minutes), but not nearly as fast and the end result (quality change) might lag quite a bit.
The way you discussed this came off as incredibly arrogant, so you shouldn't hold your hands up all of a sudden and claim you're being misrepresented—I personally expected you were a bittorrent developer disgruntled over lack of adoption for non-technical reasons or similar based on the vitriol and self-assuredness.
There is a way to point out that some of those things should not theoretically be a problem without acting like they have no practical merit whatsoever and that anyone who believes so is an idiot. When you cite audio streaming you especially make yourself look like a Dunning-Kruger case, as delivering adaptive HD video involves many complex issues that are completely non-existent or trivial for audio. It's hard enough with full control over your own PoPs, and layering on P2P only makes it more complex. Content rights and privacy aside, you could never rely on this as primary delivery for commercial content, so in effect you'd be layering a bittorrent system on top of a traditional CDN setup as a cost-saving measure, but you'd have to be damn sure QoS didn't degrade in the process, which is also non-trivial. Just because it's theoretically possible doesn't mean it's a good idea.
I felt the original message had the same tone and thought to answer with the same. I shouldn't have. And I went further. And I just couldn't stop. I shouldn't be commenting when I'm this frustrated (unrelated to this thread).
Again, I'm sorry.
I recognize the other commenter's tactics as the same "throw a bunch of dubious claims at the wall, force your opponent to spend way more time refuting each one than it took you to make them, and as soon as they look like they're climbing out move the goalposts again" thing that I see in political arguments with half-drunk friends and strangers on political subreddits :P
I'm sure I've seen a great webcomic that describes this somewhere but I can't find it...
The problem we have here is that a bunch of hackers have seen it can be done with BT (which nobody is saying it can't be) and then assumed that BT is suitable for someone like Netflix. If it were that clear cut then we would already be using it. But the fact remains that BT doesn't actually stack up that well against the already established technologies we use for video streaming.
I'm not saying this for the sake of an "ideological/political argument", this is a well researched point I've discovered doing my professional day job.
It's ever so easy to hack something together that works for a bunch of non-paying customers who just want to pirate something. It's a whole other ball game to build a production service with advertisers (where applicable) and other partners and sell that to paying customers.
What you're doing is comparing a kit car to a bus and saying they're the same because they both transport people. Then assuming the kit car can transport 40 people at a time for 12 hours a day, every day, because you've made the previous logical assumption of equivalence.
So what would Netflix gain in doing so? They'd need to build new clients, convince content owners that P2P is secure, convince ISPs to seed content, pay for new infrastructure, etc. And the final product would be more expensive and _at best_ equivalent to what they already have - though realistically it will be worse on all but the top end viewing platforms (ie laptops and desktops). And for added bonus this infrastructure "upgrade" then removes the possibility of Netflix running live services and dynamic ad injection, etc.
So it's more expensive, less future proof, less performant on low end internet connections and wouldn't work on any of the existing smart TVs or streaming HDMI sticks.
Like I've said in other comments, BT is fine for hackers and pirates - for people who want to run something for free, willing to run it on a laptop or even willing to be inconvenienced a little for the sake of running something different. But it simply couldn't compete once you start scaling it out as a commercial platform with all of the customer expectations from non-techies and the unseen requirements / complications that happen with professional video delivery platforms behind the scenes. Plus for all of this talk about how good BT could be, people seem to miss the key point I made right from the start: that existing delivery platforms actually aren't that bad.
I actually have no qualms with the security side of P2P in terms of protecting copyrighted content. But content owners are notoriously slow to adapt to change and hard to please. Getting them on side might prove an even tougher problem than engineering BT to compete with RTMP at commercial scale.
Also things like ad injection are not features for the end user.
RTMP is a proprietary protocol developed by Adobe, so it's a good fit when you want to screw your customers with DRM and other stuff.
Similar point for ad injection. You might consider it anti-user; however, in many cases ads are what pays the bills, and thus in traditional "over the air" broadcasting it's actually more important to avoid downtime during ad spots than during a programme airing, because the cost of compensation is greater.
Ultimately in professional video delivery you'd have more than one type of customer; obviously you have the consumers watching the content but there is also the people paying for ad slots. Plus in many cases you wouldn't be the content owner so you'd have a third customer who is the person paying for your services to deliver the video.
So the features you might consider to be anti-user (and to an extent I don't disagree with you) are still there for paying customers. Or to put things a different way, video engineers would prefer not to add the complication of DRM and ad injection into their infrastructure if they didn't have to.
As for RTMP being Adobe-owned, there are other video delivery protocols around, but RTMP tends to be the preferred platform for B to C (broadcaster to consumer). Or at least that is the case in the places I've worked. Eg it wouldn't surprise me if Apple / iTunes did their own thing.
Actually S3 can deliver blobs through bittorrent: https://docs.aws.amazon.com/AmazonS3/latest/dev/S3Torrent.ht...
The 5GB limit would mean you couldn't use it for movies but TV shows should be fine
> It's slower to deliver video since you're reliant on upload connections of end users.
This has been scaling really well for small unfunded projects that need to deliver big binaries, such as open source distributions.
If you have large corporate or state level actors trying to censor the content, then this becomes the more reliable method.
> You have no guarantee that chunks will arrive in the right order to actually stream data (ie the end of the file might download first).
This is not fixed by the protocol, but by clients optimizing for total download speed. I think some clients allow you to download more sequentially.
> You cannot send multiple different video qualities...
Video torrents often come in 720p, 1K and sometimes even 4K versions. So you could pick in advance. But that's not dynamic mid-stream like you mention.
Since you offered to write more: Is there such a thing as a progressive codec for audio/video? I vaguely remember video chat codecs doing something like this, where you get progressive degradation if you lose packets.
> It's harder to debug network problems...
I think this is the main reason Skype and Spotify moved away from the P2P tech that they used initially. I hope this becomes less of an issue when projects like LibP2P and IPFS mature and become as reliable and common as TCP/IP and TLS.
> Time to start playing a new stream is longer...
You can solve many of the issues mentioned above by having high quality sponsored seed servers. This is basically a hybrid approach. The seeds would guarantee good services and the P2P part would reduce load on the seed servers where possible.
Not torrent, but there's a community of P2P television streams, quite popular for big sports matches. And from what I hear, great quality. This is a completely different P2P model, more like a multicast stream. It can't do the sort of on-demand streaming that Netflix would require.
I just can't ever see it being practical for video deployment. At least not in its current guise (thus whatever P2P model might succeed in the future wouldn't be authentically bittorrent).
If anything, with AWS and other "clouds" offering services for video delivery (MediaLive, CloudFront supporting RTMP, etc), I can see bittorrent becoming ever more irrelevant for video distribution. At least that's the trend I've been observing recently. But tech can move in unpredictable directions, so never say never.
Yes, torrents have some downsides - however, they can be worked through at some cost to quality. I might see occasional delays, for example, while I wait for an in-sequence chunk to arrive.
However for some content/use cases, perhaps that will be acceptable.
Right now the economics favor a dedicated and reliable channel from the source - e.g. Netflix.
Perhaps one day another use case will appear where the massive cost benefits of piggybacking off other people's resources comes into its own.
But I guess "never say never?"
In terms of video streaming services, there are other ways to reduce bandwidth costs without resorting to P2P. Multicast is the prime candidate: that's where you send one IP packet which targets multiple different endpoints (rather than unicast, which is the typical point-to-point method of IP comms).
There is a simple way to fix that. You wouldn't want to force everyone to upload anyway (some are on metered connections etc.), so what you would do is offer a nominal discount for uploading, but if you choose it then you don't just host what you watch, you host whatever needs more hosting capacity. Then random third parties can't tell what you watch because what you host is random.
This reduces the efficiency somewhat, but it's easy to minimize that: Don't give someone a hundred movies to host, give them one movie and have them host it a hundred times (over the course of a year).
You're also not considering the power law distribution. If nineteen customers have 1Mbps uploads and one has symmetric gigabit, the average is >50Mbps. And the discount could be per-byte -- set it at half whatever the cost of central servers would be.
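The power-law claim above checks out arithmetically: nineteen 1 Mbps uploaders plus one symmetric-gigabit uploader average to just over 50 Mbps.

```python
# Quick check of the example above: 19 users at 1 Mbps upload
# plus 1 user at 1000 Mbps.
uploads_mbps = [1] * 19 + [1000]
average = sum(uploads_mbps) / len(uploads_mbps)
print(average)  # → 50.95
```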
Put on your product hat: as Netflix, you want to control every aspect of the experience to ensure its absolutely seamless. When you outsource distribution to a huge set of third parties that becomes much harder to achieve.
Connections that are known to be slow can seed just the ends of files with faster customers seeding the entire file.
I imagine the maths on this has been done and the cost of bandwidth is much lower than the cost of engineering the solution though.
It's entertaining to expand the idea beyond bandwidth...what about CPU/gpu time on my powerful consumer desktop sitting idle/off all day? Or storage space? The computer would need to be on to seed... Would people pay to also use the idle compute capacity?
Security is a problem, but it could be used to segment customers and providers. (Big corporations bidding for AWS capacity, tiny personal sites hosted on a CDN of cheap consumer hardware).
Tagline it "Uber for the cloud"!
Not really, because you have to pay for the electricity cost of leaving your computer on 24/7. Assuming a modest 50 W power consumption for a PC, that works out to 36 kWh per month, or $3.60 at 10 cents/kWh. That alone wipes out the $1/month you're saving. Also, that's a third of the monthly plan cost - how much of a discount can Netflix really offer here?
As for hard disk space, I'll grant it's "free" space (the consumer wasn't going to use it anyway), but running consumer drives 24/7 with frequent seeks can't be good for them.
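The electricity arithmetic above, spelled out (using a 30-day month):

```python
# 50 W machine running 24/7 for a 30-day month at $0.10/kWh.
watts = 50
hours = 24 * 30              # 720 hours per month
kwh = watts * hours / 1000   # 36 kWh
cost = kwh * 0.10
print(kwh, round(cost, 2))   # → 36.0 3.6
```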
Yes, it depends on country. But I also suspect the total upload bandwidth for typical users in typical locations simply isn't there to make the idea work at scale.
(all numbers are hypothetical)
I'm actually slightly surprised that nobody at Netflix has already done that. Netflix Steambox would be a thing people would pay money for, and then you don't have Google/Apple/Amazon/Comcast/etc. diverting your customers to their video services because your customers are using their devices.
Also notice that you don't need 100% of users to upload. Having 10% of users with a 5:1 ratio still cuts your bandwidth costs in half.
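The arithmetic behind that claim, sketched with normalized units: if 10% of users re-upload five times what they download, peers cover half of total demand and the CDN only serves the rest.

```python
# 100 users, each downloading 1 normalized unit of video.
users = 100
demand_per_user = 1.0
seeders = users * 0.10                        # 10% of users seed
peer_supply = seeders * 5 * demand_per_user   # each at a 5:1 upload ratio
total_demand = users * demand_per_user
cdn_share = (total_demand - peer_supply) / total_demand
print(cdn_share)  # → 0.5
```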
The world would be a lot better place if we started becoming comfortable with sharing our real thoughts/preferences openly, and it would simplify the transition to truly public-spirited internet platforms based on Ethereum, etc.
I can understand the hesitation. It's warranted, but there is progress, such as this very thread's responses.
>> According to an interview in 2012 with Chuck Rossi, a build engineer at Facebook, Facebook compiles into a 1.5 GB binary blob which is then distributed to the servers using a custom BitTorrent-based release system.
Above part is taken directly from Wikipedia article on Facebook - "Technical Aspects" section.
But you're probably right in that this is a relevant problem, as IP addresses are considered PII (in Europe at least), so leaking them could be problematic. That said, as Netflix already has a contract with each of its subscribers it shouldn't be too difficult to modify it to accommodate that kind of data sharing.
I imagine the real problem is ensuring streaming quality in such an infrastructure, as it's very hard to predict when a node will go down (e.g. user turns off laptop) or massively degrade in bandwidth (e.g. user starts downloading other content), and there's not much that customers hate more than interruptions when streaming movies (as you can probably attest firsthand). So given all these constraints it might be cheaper just using a CDN. Also, the bandwidth costs per customer probably make up a negligible share of overall costs (licensing is probably the biggest part) so it might not be worth optimizing this at all. I'd estimate that when I used Netflix I streamed maybe 20-30 movies / episodes per month (sometimes more), which would total at about 50-100 GB max (?) given the right encoding. For a company like Netflix that distributes probably petabytes of data a single GB should cost significantly less than 1 cent, so my bandwidth cost to Netflix would be between 0.5-1 $ per month (I even think it's much less probably).
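The back-of-the-envelope estimate above can be spelled out with the same hypothetical figures: roughly 25 titles a month at about 3 GB each, costing at most a cent per GB at Netflix's scale.

```python
# Hypothetical per-subscriber bandwidth cost estimate, as in the text.
titles_per_month = 25
gb_per_title = 3
cost_per_gb = 0.01           # upper bound: one cent per GB

monthly_gb = titles_per_month * gb_per_title
monthly_cost = monthly_gb * cost_per_gb
print(monthly_gb, round(monthly_cost, 2))  # → 75 0.75
```

Even at the pessimistic end this lands under a dollar a month, which supports the point that bandwidth is a small share of the per-subscriber cost.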
Also, having a large and single-purpose Tor deployment controlled by a single provider would mean there would be no privacy from that provider and anyone able to send them subpoenas. I think this weakens the privacy provided to the point where the decision to use Tor at all becomes questionable.
Is there something I've missed? Can you help me?
It was removed in 2014 in favor of central distribution.
People were very upset about use of bandwidth. This would have been around 2007 when most Internet accounts in the UK had download limits.
P2P is an interesting approach to large scale media distribution, but aside from the bandwidth issues, I imagine they were under immense pressure from copyright holders to steer clear of it.
BitTorrent is abysmal for me. It quickly saturates my uplink bandwidth, destroying downlink performance. Or I cap it, or it discovers a cap, and my upload rate is slow enough that I'm labelled a leech in the BitTorrent network and get a slow download rate.
World of Warcraft's updater was absolute trash for me. I always got far better transfer rates turning off P2P and sucking up the congestion on the direct download option.
BitTorrent simply doesn't work for most of the world's commercially available home links.
If there were some kind of micropayments infrastructure for seeders/leechers/content owners, I would happily seed arbitrary-but-legal torrents with my 1000/1000 Mbps connection in exchange for credits towards media, or cash to cover the connection... while I imagine you'd pay to download using BitTorrent, with some tiny fraction going to me (proportional to resource consumption), but never seed to others.
Many companies back then tried to start up based on a business model of doing this via end users as you suggest. It was mostly sunk by users' pretty reasonable opinion that they didn't want to donate their upload bandwidth on their metered plans.
Was actually a torrent.
Wouldn’t that be amusing. How would we know any difference?
I recently went to a small theater to watch a movie. It didn't start. After 10 minutes I went to talk to a manager. They had forgotten to download the licensed movie for the day and that download would take a couple hours. I had to get free passes to come back another time.
edit: A normal DCP (digital cinema package) is around 100GB, bigger if it's 3D/4K/Atmos. Some of the biggest packages can be 500-600GB with all versions (2D, 3D, dubbed, hard of hearing, Atmos, Auro, D-BOX, and so on). So not really feasible to download that via the Internet in a couple of hours.
A perfect use for it would be to distribute VNR’s (Video News Releases) and other audio and video from public relations firms and large companies.
The big obstacle is introducing any new protocol into a large bureaucracy.
It took a company I work with eight weeks to get their artist an SFTP program for his computer so he could send material to the printing companies; and another month to convince IT to open the ports in the local network.
By then he’d just sent a bunch of DVDs by messenger.
Can you please elaborate a little bit? My business understanding of VNRs is that they are not in demand per se, and are pushed/pitched out-of-band to (hopefully) receptive editors and news directors. Is this true and what does P2P add to this?
For example, if you're a consumer products magazine, you will probably want whatever VNR Procter & Gamble has for its latest dessert topping/floor wax. It would be massively simpler for P&G to provide a torrent link on its website that interested publications could just click.
Or, let's say you're a construction company that just built an awesome new bridge. A VNR with drone footage of the construction process on your website that could be downloaded via torrent with a click would be helpful for the TV stations in your market.
I sometimes work with real estate developers. At this time, they're all sending video, huge photos, and massive PDF books to print, TV, and web publishers via Dropbox. Some even use Flickr for press photos. Torrenting those would seem to be a better method.
Further question: I too work with real estate developers. You're suggesting they publish a link to assets where the browser depends on a torrent application to complete the download, no? Or, alternatively, did things progress while I wasn't looking, and can torrents now be completed in-browser?
I think the larger picture here is: if you're Netflix, you're happy to centralize, put distribution points on ISP networks, etc., to provide a great user experience. How many people would be annoyed by Netflix dragging down their internet connection to make the service a couple of bucks cheaper?
In the US I presume, where ISPs are monopolies and probably had to be in on this to make it work.
> How many people would be annoyed about Netflix dragging down their internet connection
Why would it drag down anyone's internet connection? It's not hard to make it behave well and self-throttle, when internet connection is used for something else.
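A naive self-throttle on the upload path is just a token bucket (sketch below; real clients actually go further with µTP/LEDBAT-style delay-based backoff, which yields to interactive traffic automatically — the rate and burst numbers here are invented for illustration):

```python
import time

class TokenBucket:
    """Toy token-bucket limiter: permit at most `rate` bytes/sec of
    upload (plus a small burst), leaving the rest of the link free."""

    def __init__(self, rate_bytes_per_s, burst_bytes):
        self.rate = rate_bytes_per_s
        self.capacity = burst_bytes
        self.tokens = burst_bytes
        self.last = time.monotonic()

    def try_send(self, nbytes):
        """Return True if `nbytes` may be sent now, consuming tokens."""
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at the burst size.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if nbytes <= self.tokens:
            self.tokens -= nbytes
            return True
        return False
```

A peer would hold back pieces whenever `try_send` returns False, so uploads never crowd out the user's own traffic.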
Works today in Chrome, Firefox, Edge, etc.
In Brave--Brendan Eich's new browser--you can even drop a magnet link directly into the URL bar! Uses WebTorrent under the hood.
If there was a financial incentive every browser would have been capable of downloading files off BT as seamlessly as off HTTP.
I don't even care whether the video starts immediately. I'm perfectly happy to wait ~10 minutes to get a decent buffer of chunks at the beginning of the file.
Encrypted or not, BitTorrent traffic still looks like BitTorrent traffic.
The BitTorrent protocol has a peculiar connection pattern.
Or from a WebRTC video call / Ring call.
Or a person downloading a few files from 10 to 20 different websites.
I don't see how my computer connecting to 10 IPs is characteristic of BitTorrent rather than of performing encrypted communication of another type with said IPs.
I'm no expert, but I can imagine how it might be possible to distinguish BitTorrent traffic from Xbox 360, voice calls, or simultaneous downloads from 20 different websites, using flow analysis and some other data points, to a fairly high degree of certainty, in some cases.
In support of my amateur assessment I present the following Wikipedia entry on the subject:
"Some ISPs are now using more sophisticated measures (e.g. pattern/timing analysis or categorizing ports based on side-channel data) to detect BitTorrent traffic. This means that even encrypted BitTorrent traffic can be throttled. However, with ISPs that continue to use simpler, less costly methods to identify and throttle BitTorrent, the current solution remains effective.
Analysis of the BitTorrent protocol encryption (a.k.a. MSE) has shown that statistical measurements of packet sizes and packet directions of the first 100 packets in a TCP session can be used to identify the obfuscated protocol with over 96% accuracy.
The Sandvine application uses a different approach to disrupt BitTorrent traffic ..."
I guess like everything, it's an arms race; and a sufficiently determined network monitor probably has the average BitTorrent user blocked. Might not be worth the effort though.
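For illustration, the kind of packet-size/direction heuristic the quoted study describes might look something like this. To be clear: the features and thresholds below are invented for the sketch, not taken from the actual published classifier.

```python
def looks_like_mse(packets, n=100):
    """Toy heuristic over the first `n` packets of a TCP session.
    `packets` is a list of (size_bytes, direction) tuples, where
    direction is +1 for client->server and -1 for server->client.
    Thresholds are made up for illustration only."""
    sample = packets[:n]
    if not sample:
        return False
    mean_size = sum(size for size, _ in sample) / len(sample)
    up_ratio = sum(1 for _, d in sample if d > 0) / len(sample)
    # The MSE handshake exchanges padded, random-looking messages in
    # both directions, unlike e.g. HTTPS, where the server dominates
    # the early bytes with its certificate chain.
    return 0.35 < up_ratio < 0.65 and mean_size < 600
```

A real classifier would be trained on labelled flows rather than hand-tuned cutoffs, but the shape of the idea — a handful of cheap features over the first packets — is the same.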
Because you're maybe connecting to 100 players max, with relatively low bandwidth use.
>Or from a webrtc video call / ring  call.
Again, relatively low bandwidth.
>Or a person downloading a few files from 10 to 20 different websites.
Those websites run on port 80 or 443, whereas torrents use random ports > 1024, so that's a dead giveaway. Plus, most people (even power users) don't have 10 to 20 parallel downloads from multiple sites. Even if it really is someone downloading from 10 to 20 sites, that probably puts them in the 99.99th percentile of bandwidth use, and they probably should be throttled anyway.
That's a really strange sentiment to me. I'm paying for 100 Mb/s, not for "100 Mb/s, in certain specific circumstances over specific protocols". Ones and zeros; the rest of it is _my_ concern, not my ISP's.
Of course, those were business lines and all those numbers were actually in the fine print of the contract. As far as I've been able to determine, if your residential ISP is only oversubscribing at a 10:1 ratio, you're pretty lucky (I've seen some reports from industry consulting firms that suggest 50:1 is more common), and the chances are they're not guaranteeing a minimum speed they can be held to.
With most operators, limited-bandwidth users are oversubscribed and unlimited-bandwidth users are linked to dedicated channels. The oversubscription ratio is around 20:1 for DSL and 100:1 for mobile here in Turkey.
Edit: I’m getting downvoted, so I went and looked it up. It’s not “thousands”, but it’s close to $1000/mo, from the first provider I checked: https://imgur.com/gallery/9ZdlqXt
On the reliability side, you can easily include an HTTP web seed, though as long as the company offers the torrent on their own fileserver, there isn't much benefit in doing that.
Windows 10 P2P Distribution for Updates: https://lifehacker.com/windows-10-uses-your-bandwidth-to-dis...
Blizzard/HTTP Web Seed: https://en.wikipedia.org/wiki/BitTorrent
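For reference, a web seed (BEP 19) is just an extra `url-list` key in the torrent metainfo, so clients can fall back to plain HTTP when few peers are around. A minimal sketch with a toy bencoder — the tracker and CDN URLs are hypothetical, and the `info` dict is stubbed out:

```python
def bencode(obj):
    """Minimal bencoder (sketch) for torrent metainfo dictionaries."""
    if isinstance(obj, int):
        return b"i%de" % obj
    if isinstance(obj, bytes):
        return b"%d:%s" % (len(obj), obj)
    if isinstance(obj, str):
        return bencode(obj.encode())
    if isinstance(obj, list):
        return b"l" + b"".join(bencode(x) for x in obj) + b"e"
    if isinstance(obj, dict):
        # Bencoded dict keys must appear in sorted order.
        return b"d" + b"".join(
            bencode(k) + bencode(v) for k, v in sorted(obj.items())
        ) + b"e"
    raise TypeError("cannot bencode %r" % (obj,))

meta = {
    "announce": "http://tracker.example.com/announce",  # hypothetical tracker
    "url-list": ["https://cdn.example.com/file.iso"],   # BEP 19 web seed
    "info": {"name": "file.iso", "length": 1024,
             "piece length": 16384, "pieces": b""},      # stubbed info dict
}
torrent_bytes = bencode(meta)
```

A real `.torrent` would of course carry the actual SHA-1 piece hashes in `pieces`; the point is only that the web-seed fallback is one extra key, not a different protocol.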
Is it a net gain (for the bandwidth of the whole internet)? I believe the major advantage is to transfer the cost of distribution away from the original host and distribute it among those downloading it. Does it reduce the bandwidth used appreciably, or just shuffle it around?
On the other hand, big networks can be more efficient by caching things near the edge so that the total “bit miles” (I.e. the total over all packets sent of distance traveled * bits of data) are decreased.
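A toy calculation of that metric, with assumed distances and audience size, shows why edge caching wins so decisively:

```python
def bit_miles(transfers):
    """Total over all transfers of distance traveled * bits sent
    (the illustrative "bit miles" metric from the comment above)."""
    return sum(miles * bits for miles, bits in transfers)

video_bits = 8_000_000_000   # a 1 GB file (assumed)
viewers = 1000               # assumed audience size
origin_distance = 2000       # miles from origin to each viewer (assumed)
edge_distance = 50           # miles from an ISP edge cache (assumed)

# Every viewer pulls straight from the distant origin:
no_cache = bit_miles([(origin_distance, video_bits)] * viewers)

# One cache fill from the origin, then local delivery to everyone:
with_cache = bit_miles([(origin_distance, video_bits)]) + \
             bit_miles([(edge_distance, video_bits)] * viewers)
```

With these made-up numbers the cached total is well under 3% of the uncached one; almost all the long-haul traffic collapses into the single cache fill.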
The protocol has quite a bit of overhead, so things can be worse among a small number of clients. It’s also worse than having caching servers on each network.
From a business standpoint it does not matter much, but from a pure network perspective it’s likely worse.
Ignoring that, bit torrent is likely to do the bulk transfer over different links than a download from a central server. This seems likely to result in an overall higher number of bytes downloaded, but it's hard to model. It can also lead to congestion in parts of the network that were not normally congested.
This was not an issue with the last mile connection itself, but a middle mile peering problem. From the testing I have done, on certain carriers that play games like this (Deutsche Telekom in Germany, CenturyLink, Verizon and others in the US, and most Asian/South American ISPs), torrenting updates to users is significantly faster than an HTTPS download from Amazon, Google Cloud and the like. Once you get a handful of seeders internal to the network, the other downloaders in that network start to get the torrent significantly faster than they would over traditional HTTPS.
If the content isn't extremely time sensitive (eg: live sports games, video calls) and is of a noteworthy size, using P2P tech to speed up downloads is not a bad idea.
E.g. with my ISP, the bottleneck isn't in the last mile (that's all GPON fiber) but rather in the national network and their transit to the internet. If more transfer was made to local-to-local, it would reduce the amount of traffic that needs to go cross-country or out on the internet.
Unless BitTorrent understands the topology of a network, it will never be more efficient than a structure that takes advantage of the topology.
E.g. Netflix makes servers that can be distributed to the ISP's and can serve content to those consumers directly from the ISP itself, and not over the ISP's outgoing link.
E.g.#2 Multicast clients register to receive multicast streams and the routers know how to propagate multicast.
I can make a very low latency connection that only delivers at most one packet every 10 seconds. It responds in 5 milliseconds, say, but won't let you have another packet for 10 seconds.
Or I can have a high latency connection that delivers a huge amount of bandwidth. It might take 10 seconds to start the firehose, but once it starts, it's a firehose. This is ironically the problem that the internet has with "Bufferbloat", and high bandwidth radio connections in space.
A topological solution would be counting the number of hops. There are internet protocols that do this, it's just that I don't think that BT takes advantage of it.
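One such trick is inferring hop count from the TTL on received packets, exploiting the fact that most stacks start from one of a few well-known initial values. A sketch (the initial-TTL list is a common heuristic, not a guarantee):

```python
def hops_from_ttl(observed_ttl):
    """Estimate how many routers a packet crossed, assuming the sender
    used one of the common initial TTLs (64 for Linux/macOS, 128 for
    Windows, 255 for some network gear). Returns None if out of range."""
    for initial in (64, 128, 255):
        if observed_ttl <= initial:
            return initial - observed_ttl
    return None
```

A topology-aware client could then prefer peers a few hops away over ones across an ocean, even before measuring throughput.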
But none of this is necessary, fast clients simply send more data and dominate download bandwidth. Doesn't matter why they are fast, whether they are simply very close or just on a path with more capacity. Everything is naturally efficient.
BitTorrent is not an effective means of serving data from a 4G device. In fact, if anything, bandwidth on 4G arguably costs more than bandwidth from your fiber ISP.
We use services like YouTube and Instagram to push our data to others. And the efficiency from the phone's perspective is the same as multicast, because the phone sends the data exactly once over the upstream link.
You could say, "Well, duh, nobody should use BitTorrent over 4G because of bandwidth issues." And that's exactly the point. If BitTorrent were so efficient, we'd be using it on 4G instead of sending data to YouTube, etc.
You can't have it both ways. In other words, you can't say, "BitTorrent is the most efficient protocol man ever created, but only if you don't use it on 4G."
On the other hand, multicast with fountain codes (e.g. RaptorQ) sure seem to be a super efficient approach that might work everywhere.
That's fine. So you connect to that peer and get one packet every ten seconds. You also connect to the other 1000 peers with the lowest latency, some of which are likely to be faster and in any event will be fast in aggregate, and none of which require traversing intercontinental fiber.
Distributing servers to every ISP doesn't seem very efficient. It may work in the US, where you can count the number of ISPs on one hand, but where I live there are 100+.
Has multicast ever worked?
Actually, that's the way most CDNs work. Instead of having 40 hops to your customer, you may only need 10, because the first thirty hops are from you to your server.
Fewer congestion issues, higher uptime, lower latency, and faster bandwidth, usually.
> Has multicast ever worked?
In my experience with military networks, yes. Especially when one multicast packet can be received by 20 listeners without repeating the packet 20 times on the physical layer (especially radio links).
Sending data via bittorrent would saturate links you wouldn't want saturated.
(I don't know if it does this, but it's been around long enough that I'd be surprised if it doesn't).
The most efficient network distribution is one where the data only travels each network link once for all destinations. This is the way multicast works, e.g.
(And this all makes me feel old.)
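To make the "each link only once" point concrete, here's a toy model counting per-link packet copies on a distribution tree, unicast vs multicast. The topology (one router fanning out to three receivers) is invented for the example:

```python
def unicast_copies(parent, receivers, root="src"):
    """Unicast: the source sends a separate copy to every receiver,
    so each link on a receiver's path carries one copy per receiver."""
    total = 0
    for r in receivers:
        node = r
        while node != root:
            node = parent[node]  # walk up the parent-pointer tree
            total += 1
    return total

def multicast_copies(parent, receivers, root="src"):
    """Multicast: every link on the distribution tree carries exactly
    one copy, no matter how many receivers sit below it."""
    links = set()
    for r in receivers:
        node = r
        while node != root:
            links.add((parent[node], node))
            node = parent[node]
    return len(links)

# Hypothetical topology: src -> router -> {r1, r2, r3}
parent = {"router": "src", "r1": "router", "r2": "router", "r3": "router"}
```

With three receivers behind one router, unicast puts three copies on the src-router link (6 link-transmissions total), while multicast uses each of the four links once. The gap grows with fan-out, which is exactly why it matters on shared media like radio links.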
What do you mean by “uploading”? You seed torrents, you don’t upload them (where?).
Another Matrix-related torrent from yesteryear.
This is my favorite bit :)
You can still have the torrent link in the description if people want to download it, you will reach a wider audience and you might actually make some money off it to fund future fan projects.
You just can't/shouldn't monetize it, because that would motivate the owners of the original film to stop you.
That is incredibly naive.
Content ID has been picking up things as innocuous as applause at the end of an entirely novel recording as being copyright infringement.
Given the number of Star Trek fan films on YouTube (there are thousands, literally), spurious takedowns would be noticeable, so I don't think I'm naive here. As long as all your sound and images are original or from a Creative Commons source, a takedown should be unlikely. (That doesn't mean it can't happen, though.)
There's also examples like this, about a Star Wars fan film, where Warner claimed that silence infringed on their music copyright! https://www.wired.com/story/the-star-wars-video-that-baffled...
Edit: since people are continuing to comment about this, I've added the word "surviving" above.
Windows XP is older than torrents, but in my mind it feels much newer for some reason. I wonder why.
I alternated between using Limewire and Bearshare back in the day. Vague memories but somehow their UI stuck with me.
Makes me even more happy we have Spotify nowadays :)