Hacker News
Webtorrent – BitTorrent over WebRTC (github.com/feross)
362 points by rvikmanis on Nov 21, 2015 | 108 comments

I've been using https://instant.io/ a bit to share files with people. You don't have to care about captchas or size thresholds because you aren't storing anything on the server, except for the tracker file. The more people that download and stay on the page, the faster it will be for everyone else too.

Of course, it's not perfect. It appears to try and load the entire file to memory to seed, which makes sense, but that means you can't transfer a 1GB file without using 1GB of RAM...

I use http://www.put.io for this and highly recommend it. If someone has already downloaded the torrent you are trying to get, you get it instantly, so anything even vaguely popular is available right away (after the first person downloads the file, the rest are just pulling directly from S3). They also have great customer service, iOS and Chromecast apps, etc.

Someone has a swank Mac menubar app for Putio that's a browser helper; any magnet or regular torrent file gets added to your put.io queue simply by clicking the link within your browser.

Oh I didn't know that, sounds like this:


thanks, I'll try it out.

That'd be it!

Check my client http://diegorbaquero.com/bTorrent/

The "help me seed" feature is not working atm, but the idea is that you don't have to stay seeding it.

It has to do that because it needs to generate the hash of the file before it can be seeded! :)

In theory it could be written to store each chunk as a separate value inside of IndexedDB, because when seeding you should only need one chunk at a time...

If anyone feels like testing it out, I was actually just playing with webtorrent a couple days ago: https://stark-springs-9580.herokuapp.com/

It's very basic, and doesn't have proper error messaging. Would recommend only attempting to stream mp4 (h264) if you're giving it a shot.

Do you have this code on GitHub? Would love to take a look!

I don't, unfortunately. The webtorrent docs are very straightforward though. And if you check their npm, they provide a link to cdn hosted source!

Note: sorry for any grammar issues, on mobile now

this reminds me of http://ipfs.io/ although IPFS doesn't run natively in the browser (it's a go app) and has much wider objectives (ie. the whole web site is loaded from the swarm, not just specific media objects, and it is trackerless, essentially).

IPFS doesn't run natively in the browser yet


i stand correct. this will be awesome. :)

Could a website use this to host/download torrents in the background unknown to the user?

Yep that's basically what WebRTC data channels do. And the NYT was abusing it to get your "real" IP to fingerprint users better.
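For context on the leak being described: the ICE candidates gathered by RTCPeerConnection contain the machine's addresses in plain text, so a page can read them without any prompt. A sketch of just the parsing step, using a made-up candidate string in the standard SDP candidate format:

```javascript
// Extract the connection address from an ICE candidate line.
// The field order is: foundation, component, transport, priority,
// connection-address, port, "typ", candidate-type, ...
// In a browser these strings arrive via RTCPeerConnection's
// onicecandidate events; here we just parse a hard-coded example.
function candidateAddress(candidateLine) {
  const fields = candidateLine.split(' ');
  return { address: fields[4], port: Number(fields[5]), type: fields[7] };
}

const example =
  'candidate:842163049 1 udp 2113937151 192.168.1.42 54321 typ host generation 0';
console.log(candidateAddress(example));
// { address: '192.168.1.42', port: 54321, type: 'host' }
```

A "host" candidate like this exposes a local interface address even when the rest of the page's traffic goes through a proxy, which is what made the fingerprinting trick possible.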

WebRTC should always be gated behind permissions. Most uses are video/audio, which already require a prompt, so no problem. Other websites shouldn't have this capability, so annoying users is no problem. But that's not very palatable, as it means acknowledging that WebRTC data doesn't really belong on the wider web.

Honestly, what are the real good uses for it outside of media? Games? WebTorrent?

I'm rather annoyed that such a feature was deployed, giving any webpage the ability to override my proxy settings or otherwise change the normal browser networking behaviour.

In the good old days of browser companies making browsers for the users (Opera up to 12.x), you were able to configure EVERYTHING per web address:


localStorage, separate widget localStorage, separate userjs storage, Web SQL - you could limit/disable/define everything with fine granularity (disabled, quota, popup if quota exceeded), globally and per domain.


Can you imagine something as crazy as letting the user decide which sites are allowed to run JavaScript? Use localStorage? Leave permanent cookies? Use plugins? Or something as weird as defining a custom user-agent string per website? All possible in old Opera. No other browser will let you do that now, AFAIK.

Today? Today you get Firefox forcing Pocket, Google serving a voice-control binary blob, and all around people coming up with things that can't be fine-tuned at the user level, like WebRTC. We are slowly moving (wtf, WebAssembly) to a point where we will lose any control over what's flowing down the wire from the server and running on our computers.

The Vivaldi browser might be for you then https://vivaldi.com/

No. Last time I checked they didn't have 'preferences per domain' functionality :( Not to mention 100 tabs open in Opera 12.x takes ~1.4GB of RAM, while Blink uses up to a couple of hundred megabytes per tab. And then there are the little things, like proper tab configuration (minimize on click, disabling the close button) and custom mouse gestures.

Don't get me wrong, I still have it installed along with the newest Opera, but they are my 'something doesn't work in 12.x' browsers.

Thank you.

>Honestly, what are the real good uses for it outside of media? Games? WebTorrent?

You could use it to implement p2p collaborative editing in a productivity app, which could actually give you privacy and security benefits that aren't available when a central server is involved.

No, I don't think this is correct. WebRTC requires you to trust the server (webpage) - there's no way to verify a peer. Think: from an end user perspective, what's the difference between an editor using websockets vs one using data channels? Nothing; they appear totally identical.

So sure, your data might go direct, offering theoretical security. Similar to how Stripe and Braintree don't improve security against a malicious server (just change the form/JS and it's game over), but help against accidental problems. But this isn't truly providing the end user any guarantees. It's good marketing, though. Like Cryptocat.

Keeping features out because of privacy concerns is misguided, since it's a lost cause. Look into browser fingerprinting. There are just so many vectors for tracking Web browsing behavior... literally hundreds. The only way to truly browse privately is behind a VPN or TOR from inside a virtual machine with no non-volatile storage so that everything is wiped between each VM execution.

Security is a legitimate concern, but the browser is already completely exposed since it downloads and runs arbitrary code all day long.

I do have some objection to WebRTC but it's more architectural. WebRTC is overly complex, overly monolithic, and bloated. It tries to do too many things with one standard.

Except Tor no longer works, as WebRTC ignores your connection settings (in Firefox anyway, I think). Up until now the networking for a browser has been straightforward; now it's got a whole extra model that's unneeded by default, yet enabled anyway.

Also, as per EFF's Panopticlick, fingerprinting isn't nearly a lost cause. And even then it's thrown off by changing UAs due to updating versions.

You can control WebRTC's network behavior, including forcing it to go through a proxy, by using this Chrome extension:


It's published by the Chrome WebRTC team (of which I am a member) for just that purpose: to let you control that behavior.

That's neat, but oughtn't the browser be respecting the proxy settings by default? If tags like <img> added an "ignoreproxy" attribute, folks would be rightfully upset, wouldn't they? So why should data channels be any different? (Video/audio is fine as the user is notified first.)

Don't use Tor this way; browser fingerprinting makes it useless regardless of WebRTC. Use a Tor bundle in a VM.

While I agree with the advice, it's not useless. If you run a normal browser with no plugins via Tor, your IP won't leak except via exploits. WebRTC leaks by design, then handwaves browser fingerprinting as a shield.

Panopticlick is thrown off course but that's just because they don't actually try to track you persistently. Any party that does want to track you will use many more channels than just the UA and will happily re-acquire you if all you just did was upgrade your browser.

If you:

- upgrade your browser

- switch to a new IP

- wipe all your cookies

- change your browsing habits dramatically

(don't visit the same 10 websites over and over again)

Then maybe you could avoid re-acquisition.

Panopticlick does use several methods. But, by far, the biggest thing is the UA (and most likely measured incorrectly as I explained). I'd bet using a popular OS/browser probably only leaks like 4 bits' worth. The next highest thing is resolution, but only because I tried it on a phone with unusual settings (Huawei Mate 2 with scaling).

IP address is a big one, but if browsers respected your explicit proxy settings instead of ignoring them for WebRTC, then changing it would be easy. History, supercookies, and other stuff is taken care of by private mode, or, at worst, wiping out all browser info (private mode doesn't clear HSTS).

My point is that all is not lost, that supercookies are not a given. Thus saying WebRTC gets a free pass because things are already broken is simply wrong and a misleading argument to push data channels in where they don't belong.
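The "bits" figure used in this sub-thread is surprisal: an attribute shared by 1 in 2^n users carries n bits of identifying information, and independent attributes add their bits. A quick sketch (the frequencies are invented for illustration):

```javascript
// Surprisal of a fingerprint attribute:
// bits = -log2(fraction of users who share the same value).
function bits(fractionSharing) {
  return -Math.log2(fractionSharing);
}

// An attribute shared by 1 in 16 users leaks 4 bits.
console.log(bits(1 / 16)); // 4

// Independent attributes add: 1-in-16 UA plus 1-in-256 resolution = 12 bits,
// enough to single someone out of a pool of 2^12 = 4096 users.
console.log(bits(1 / 16) + bits(1 / 256)); // 12
```

This is why a common OS/browser combo (few bits) plus a few unusual attributes (many bits each) can still add up to a near-unique fingerprint.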

Surely one should use the Tor browser bundle these days?

> Keeping features out because of privacy concerns is misguided, since it's a lost cause. Look into browser fingerprinting. There are just so many vectors for tracking Web browsing behavior... literally hundreds.

This doesn't mean we should make an existing problem worse.

That horse has left the barn and is half-way to Alpha Centauri. The right way to achieve private browsing is with isolated VMs with randomized browser settings and characteristics, and you can do that with or without WebRTC.

> The only way to truly browse privately is behind a VPN or TOR from inside a virtual machine with no non-volatile storage so that everything is wiped between each VM execution.

And even then you might want to consider building a new image each time with a random OS/browser combo. Be sure to install some random toolbars and extensions and enable random configuration options too, or else you might just end up leaving handprints instead.

What kind of an RTC API would you like to see?

Separation of concerns, and individually useful APIs that DWTADIW and out of which a more diverse set of things can be constructed. WebRTC is a monolithic all-or-nothing blob of capabilities tied too closely to specific use cases like video chat.

A UDP API for sockets (with user approval required and some restrictions), then perhaps ICE, or maybe that should be in JS. Then an audio/video sampling and compression API.

That's a pretty bad risk then. In some countries downloading a torrent or hosting one can get you booted off the net or in even deeper trouble.

Downloading isn't a concern, as websites can already force you to download a file (<img src=...>). The new risk is P2P connections that bypass HTTP settings, and perhaps uploading. (You could already upload by making a POST, so it's only in combination with the P2P stuff that this is novel.)

If WebRTC prompted or forced use of the HTTP connection settings in the browser, I'd have no issue with it.

That's true, but this combination allows for first downloading and then serving up illegal content to many consumers, and the penalties for that are much harsher. A POST would go to one place only (normally speaking) and would not suddenly allow your browser to become a server, making you a point of distribution.

So instead of "Someone else did it on my WiFi", people will now say "I didn't transfer those files, a webpage did it without my knowledge". Sounds like a much more solid excuse actually.

Websites can already link to illegal content without user consent. Is this really a new threat?

Linking is one thing, downloading, then hosting it is a completely different kettle of fish.

I guess the only hope is that this will end up with so many innocent unsuspecting people serving up illegal content that the laws will have to change?

I've used it for sensor fusion projects, using multiple smartphones as sensing devices all connected back to a main PC.

in firefox

set true

There is a gratipay account for WebTorrent if you want to support them: https://gratipay.com/webtorrent/

Looking forward to a WebTorrent-based Popcorn-Time app, itself hosted over WebTorrent...

WebTorrent's creator doesn't want it to be related to piracy. On the other hand I shared Inside Out to my whole Uni with WebTorrent. It worked.

http://popcorninyourbrowser.net used to be a thing using this. As you can see, it's down, but it's certainly a possibility

It says right there that it used the Coinado API, not WebTorrent. Coinado does the download on their servers and streams it over HTTP.

Ahh, I didn't know what Coinado was. I know I saw a demo that did use WebTorrent (or something similar).

Quite a nice file sharing implementation done here: https://file.pizza/

sort of reminds me of that torrenttornado javascript extension by one of my favorite Mozilla add-on developers


I see one issue here. Who is going to seed? Someone keeping a web page open after they got the download is even less likely than someone keeping their torrent client running after they are done downloading.

I'm working on a client that helps you seed the file for a while: http://diegorbaquero.com/bTorrent/ It's still in the alpha stage.

Feross did a great job with this. Full stop.

Please excuse my ignorance. Is Node required for this to work?

Nope! Just plain JS.

I have another question. Is it possible to write this in Python?

You can use a native webrtc library in Python to implement the webtorrent protocol. Half the goal of the project is that desktop torrent clients can eventually support it through webrtc helper libraries so they can act as hybrid clients that can seed to both browser and native participants.

Server down

Perhaps the link should be updated...

Good idea. Url changed from https://webtorrent.io/, which stopped responding.

"a browser-based WebTorrent client or "web peer" can only connect to other clients that support WebTorrent/WebRTC"

Until every torrent client supports it, it's not as useful as it could potentially be.

Latest archive.org crawl: https://web.archive.org/web/20150922121717/https://webtorren...

Google cache (appears broken): http://webcache.googleusercontent.com/search?q=cache:https:/...

Edit: It appears that the OP's link was changed. Original was https://webtorrent.io/

That was fast

Why would I ever want to do this in my browser? I cannot think of a larger waste of resources except for pdf.js.

Please don't post comments that are dismissive of new work. Acerbic swipes poison the atmosphere, and we want HN to be a place where discussion of new work can thrive.

It's of course fine to ask what a project is for or how it addresses a specific concern, criticize substantively, and so on, but there should always be a baseline of respect for someone who tried something and put it out there.

It isn't new. I've seen this thing at least twice before. I seem to remember last time it even said that it couldn't interact with normal torrent clients.

> It isn't new.

OK, but that doesn't change the important point.

Your comments make it clear that you know quite a bit in this area. Instead of venting annoyance, why not teach readers some of what you know? If you did so without snark you could add a lot of value to a thread like this. People are obviously interested in the topic or they wouldn't have upvoted the story to #1.

I can't teach anyone anything. I have neither the knowledge, skill, temperament, nor desire. People will continue to implement things in a browser, poorly, and there is not a damn thing I can do about it.

> or they wouldn't have upvoted the story to #1

The cynic in me says that happened because this is "X in Y lines of javascript, in the browser!". The one thing that is guaranteed to catch the attention of people here is reinventing the wheel--poorly.

All you need to do to teach readers is to post comments with real information in them. If you can't or won't do that, you should at least spare everyone the comments without real information in them.

Yes, because it requires a BEP (http://www.bittorrent.org/beps/bep_0001.html) to be added as an "official" extension to BitTorrent. Eventually it will happen. I'd hope that by posting it again to HN, someone might add it to their list of things to do in their Infinite Free Time.

Adding a BEP will not fix webtorrent; it will still be incompatible with µTP/TCP transport and, even more importantly, the DHT. It's shoving the responsibility of being compatible onto everyone else. Browser environments are inherently crippled: they do not provide access to such simple things as TCP/UDP sockets and thus can never be truly interoperable with protocols that run on them.

It smells a bit like embrace-extend-exterminate to me, even if it's not intentional, it currently is a natural consequence of "claim to do X in a browser" without access to the underlying primitives necessary to do X in a standards-compliant manner.

In my opinion browsers need to provide thinner abstractions over posix apis (with the appropriate sandboxing and opt-in where necessary).

>It smells a bit like embrace-extend-exterminate to me, even if it's not intentional, it currently is a natural consequence of "claim to do X in a browser" without access to the underlying primitives necessary to do X in a standards-compliant manner.

Sure, so don't support it. If no one supports the protocol or the app, then it will die. That's an acceptable outcome to some. I'm not saying anyone has to support it, I don't even have any vested interest in it. It's just an interesting project and I've spent far too long today on HackerNews defending it when I could be working on my own projects.

To me, it looks more like "Adding Bittorrent into Firefox is pretty much impossible, where can we start instead?"

>In my opinion browsers need to provide thinner abstractions over posix apis (with the appropriate sandboxing and opt-in where necessary).

NaCl & Emscripten should get you some of the way there.

> To me, it looks more like "Adding Bittorrent into Firefox is pretty much impossible, where can we start instead?"

Maybe with better browser-native interaction? A house starts with its foundation.

And calling it a torrent client while knowing that it isn't interoperable still leaves a bitter taste with me.

> NaCl & Emscripten should get you some of the way there.

they don't provide sockets or filesystem access as I understand it.

I already have many choices of client, all of which can correctly interact with each other. A new client comes along and can't do that, and suddenly every other client is "broken". Please, this client is broken and needs fixing.

>Please, this client is broken and needs fixing

Good thing it's open source then.

They interact with each other because of BEPs. This client is a prototype and if it's successful you'll have a whole new source of peers.

And before PeX was a BEP, before Vuze had a Mainline DHT plugin, it didn't interact with some clients either. Hell, even DHT [1] had to go through the same BEP process. Do you expect all software to start out complete?

1: http://www.bittorrent.org/beps/bep_0005.html
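As a concrete taste of the BEP linked above: Mainline DHT responses pack peers in "compact" form, 6 bytes per peer (a 4-byte IPv4 address followed by a 2-byte big-endian port). A minimal decoder sketch (the packed bytes below are made up):

```javascript
// Decode BEP 5-style "compact" peer info: 6 bytes per peer,
// 4 bytes of IPv4 address followed by a 2-byte big-endian port.
function decodeCompactPeers(buf) {
  const peers = [];
  for (let i = 0; i + 6 <= buf.length; i += 6) {
    const ip = `${buf[i]}.${buf[i + 1]}.${buf[i + 2]}.${buf[i + 3]}`;
    const port = buf.readUInt16BE(i + 4);
    peers.push({ ip, port });
  }
  return peers;
}

// Two peers packed back-to-back: 192.168.1.2:6881 and 10.0.0.5:8080.
const packed = Buffer.from([192, 168, 1, 2, 0x1a, 0xe1, 10, 0, 0, 5, 0x1f, 0x90]);
console.log(decodeCompactPeers(packed));
// [ { ip: '192.168.1.2', port: 6881 }, { ip: '10.0.0.5', port: 8080 } ]
```

The format's reliance on raw IP:port pairs is also part of why a browser client can't join the DHT: there is no UDP socket API with which to dial those addresses.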

By that logic though you can say anything is BitTorrent if it gets a BEP. IPX/SPX networking only? No problem, just needs a BEP! ;) You must admit it's certainly different to start off with a completely incompatible client, vs just a new feature. (I do wish WebTorrent well though.)

Do you have any insight into the BT dev community as to whether WebTorrent is likely to be picked up in other clients?

Well yeah, someone could propose IPX/SPX but it'd be unlikely to get picked up.

I have no special insight but I think this is a really cool idea. What I do know is if they don't put up a BEP then it won't go anywhere.

> Do you expect all software to start out complete?

No. I would expect a new torrent client to start with tracker announces and then peer communication.

WebTorrent isn't trying to embrace and extend BitTorrent. It is impossible for code running in a browser to speak any of the existing BitTorrent protocols, so as a last resort they are running over WebRTC.

PDF.js allows for an inline/controlled display of a PDF, where you can control the interaction in a browser. Granted, this is of limited use, and there are other solutions.

That said, the browser will not return to only acting as a display conduit with no interactivity despite what some people would like to see...

If you want a better platform than the browser for server-driven interactive applications not requiring you to install extra applications, then build one. To date, there hasn't been a better, more open option.

The stronger argument for pdf.js is that you get to leverage the Javascript sandbox instead of having to sandbox a native code binary (or worse, support a plugin interface and hand over the keys to the system).

Except they, naturally, found that all this sandboxing was a tad too restrictive, and ended up adding Apple-esque exceptions for their own code.

And in doing so, a critical vulnerability:


They are not "Apple-esque exceptions for their own code". All JS code running as browser extensions has those permissions.

I think I know where you're coming from, given the amount of JavaScript-heavy software being popularised.

However I think you would have more success in A) gaining an understanding of what someone is trying to achieve / what problem they might be trying to solve and B) sharing your likely very valid point of view / experience with others if you change the way you communicate your reply.

The way you worded your reply could be hurtful to the person that came up with the idea and spent their time to try and create something from it.

I'm sure your intentions are good, probably a mix of wanting to educate people, share your experiences and show heartfelt frustration with certain types of software trends or implementations.

I think it would be more constructive and less potentially hurtful if you focused on talking about how you could see issues with using X to do Y and perhaps offer a suggested alternative path to look at if they're interested.

Tldr; If someone is demoing their ideas or product - friendly constructive criticism and feedback is very important and valuable. Keep comments positive and your words will have more of an impact.

I don't really care that much about my tone, especially in this case. My constructive criticism is not all that constructive: stop. As I said elsewhere, it is not a new idea, but it is still a bad idea and a broken implementation.

As for software trends: all hope is lost.

> As for software trends: all hope is lost.

Sad but true. The rate at which software quality is decreasing is increasing all the time. e.g. even though I haven't upgraded my phone's OS for over a year (I have tried, but the newer versions are always broken in some important way!), Google pushes out updates to Play Services that I can't block that cause crashes, dead batteries, broken apps and features, etc.

This is why, more and more, I gravitate toward not only FOSS software, but stuff like Emacs that Just Keeps Working. It seems like hardly any software developers care anymore about making things that actually work. Imagine if modern cars broke as often as most computer software does. (And as cars come to depend more on complex software, they are going to get less reliable--just as they were getting to be very reliable.)

Modern JavaScript engines can compile the code to an intermediate form (or even to machine language), which may allow performance roughly equal to compiled languages like C.

While JavaScript is probably slower and uses many more resources, that's not always the case.

I would also argue that BitTorrent doesn't take up many resources, at least when only seeding a few files; I guess you would have to seed thousands of files for performance to matter.

I know what javascript engines tout in their feature lists. I also know how javascript heavy websites/webapps behave. They don't inspire great confidence.

The IRC-like chat on Social Savannah will run up your browser memory until you run out and crash. pdf.js (in Firefox), as mentioned, starts sucking down cycles and bytes with just the Intel reference manuals open.

It's a proof of concept.

popcorntime on iOS!


Browsers are some of the most complex and [therefore] vulnerable applications. Implementing more functionality in them is really bad for security.

Reinventing the wheel.

Rather than restricting what browsers can send and making kludgy workarounds that waste resources, why don't we just allow browsers to send whatever they want?

Sadly, that weaponises the browser into a tool for portscanning and exploiting the user's local network.

And you can't gate it behind a permissions dialog like Geolocation or something? Please.

The WebRTC guys were really, really, against this. They wanted P2P data channels to always be available, saying that a permissions dialog would confuse people and be too pervasive.

But when I asked for a concrete example that doesn't involve audio or video, I literally got "suppose you are using the web UI on a refrigerator, you might want a data channel to go direct to the fridge instead of to the webserver". At least BitTorrent is a more reasonable example.

Furthermore, data channels break the security model of a browser just using HTTP as configured, as WebRTC bypasses your proxy settings without notice.

WebRTC data channels should NOT be enabled by default and should cause a dialog, as shitty as that is. My approach is to disable it and stick my head in the sand and hope it'll go away, but people seem irrationally excited about it. Basically WebRTC right now is shittier than NetMeeting (1996) but "cool".

Browsers and the web were never http-only.

I'm assuming you mean FTP and other protocol links? AFAIK, all those respect the proxy settings and fall into the same basic security mental model.

I was just addressing (nitpicking?) your characterisation of the browser security model. Some protocols do use the HTTP proxy settings, but eg mail and news protocols don't. Granted those have been phased out of many browsers, but traditionally the browser security model was never "everything goes through HTTP proxies".

Off the top of my head I can't think why WebRTC wouldn't work with normal SOCKS proxies though, like NNTP and mail protocols did.

Thank you for succinctly explaining most of the reasons I dislike WebRTC.

Now if someone could explain why I'm getting downvoted to hell...

10 points gone so far. I've never experienced this. Are people really that uncomfortable with opinions? Perhaps I've slain someone's sacred cow :)

Probably because adding full networking to browsers is an old topic and probably never gonna happen due to the abuse concerns mentioned.

Plus permissions dialogs sorta don't work, especially for something like networking. "This page wants to use the Internet: Allow?" would just confuse the hell out of users. The alternative, acknowledge that WebRTC data should be a really limited use case, doesn't appeal to the authors/implementors.

> sacred cow

Yes: javascript and browsers. But it's just imaginary internet points; they don't really matter that much.

