CacheP2P – Distributed caching platform (cachep2p.com)
169 points by feross 187 days ago | 57 comments

This project, and lots of others, would really benefit from a standardized crypto JavaScript API in browsers, so there's an opportunity to use better hashing algorithms than are realistic to implement in pure JavaScript.

I'm not saying it's realistic to engineer SHA-1 collisions to serve up malicious content on a platform like this, but it's getting closer every year.

The platform my company is working on is similar to CacheP2P. We wrote our own custom PKI implementation on top of the WebCrypto API that doesn't rely on SHA-1 for hashing.
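To make the idea concrete, here is a minimal sketch (not the commenter's actual PKI, which isn't public) of signing and verifying a message with WebCrypto using ECDSA over P-256 and SHA-256 — no SHA-1 anywhere. The function name is illustrative:

```javascript
// Hypothetical sketch: a WebCrypto sign/verify round trip that avoids SHA-1.
async function signedRoundTrip(text) {
  const keys = await crypto.subtle.generateKey(
    { name: 'ECDSA', namedCurve: 'P-256' },
    false,                      // private key is not extractable
    ['sign', 'verify']
  );
  const data = new TextEncoder().encode(text);
  const signature = await crypto.subtle.sign(
    { name: 'ECDSA', hash: 'SHA-256' }, keys.privateKey, data
  );
  // Returns true only if the signature checks out against the public key.
  return crypto.subtle.verify(
    { name: 'ECDSA', hash: 'SHA-256' }, keys.publicKey, signature, data
  );
}
```

A real PKI also needs key distribution and certificate-style trust decisions, which this sketch deliberately leaves out.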

I go into more detail about our platform in this comment: https://news.ycombinator.com/item?id=12758084

For fun a few months ago, I implemented BLAKE in JS. It's a state-of-the-art hash function built on djb's ChaCha design.

It was cool seeing how hashing works under the hood. I used Uint32Array for speed. It's still nowhere near native speed, but fast enough for many applications.
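For a flavor of what that looks like, here's a tiny sketch — not BLAKE itself, just the 32-bit tricks a JS hash implementation leans on. JS bitwise operators work on signed 32-bit integers, so `>>> 0` coerces results back to unsigned:

```javascript
// 32-bit rotate right; >>> 0 keeps the result unsigned.
function rotr32(x, n) {
  return ((x >>> n) | (x << (32 - n))) >>> 0;
}

// A simplified mixing step in the spirit of BLAKE's G function,
// operating in place on a Uint32Array state with a message word m.
// Uint32Array assignment wraps additions mod 2^32 for us.
function mix(state, a, b, m) {
  state[a] = (state[a] + state[b] + m) >>> 0;
  state[b] = rotr32(state[b] ^ state[a], 16);
}
```

The real compression function applies many such steps per message block; the typed array avoids boxing every intermediate word.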


In particular, SubtleCrypto [0] seems to be what you're looking for. It's in the latest versions of FF/Chrome/Edge/Safari (and according to MDN, shipped with Edge last year and has been supported by Chrome/FF/Safari for roughly 2 years* [1]).

I can't find any support for Opera / IE, unfortunately, although Opera's been based on Chromium since 2013, so... presumably it's been supported for years, as well?

* Cross-referencing against browser release dates, that is.

[0] https://www.w3.org/TR/WebCryptoAPI/#dfn-SubtleCrypto

[1] https://developer.mozilla.org/en-US/docs/Web/API/SubtleCrypt...
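For anyone who hasn't used it, a minimal sketch of the SubtleCrypto digest API: feature-detect it, then hash a string with SHA-256 and return hex.

```javascript
// Hash a string with WebCrypto's SubtleCrypto and return lowercase hex.
async function sha256Hex(text) {
  if (!globalThis.crypto || !globalThis.crypto.subtle) {
    throw new Error('SubtleCrypto is not available in this environment');
  }
  const data = new TextEncoder().encode(text);
  const digest = await crypto.subtle.digest('SHA-256', data);
  return [...new Uint8Array(digest)]
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
}
```

Note that browsers only expose `crypto.subtle` in secure contexts (HTTPS or localhost), which is one more reason support can look spotty in the wild.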

Isn't that what Akamai used to do, until they realised people aren't happy having their bandwidth used this way without their consent?

I believe it was PeerCDN that developed something like this, until Yahoo acquired them.

It might have more luck if people got paid for the bandwidth that's consumed.

Couldn't someone create a large number of Sybil nodes in this network to see what pages people are loading? Browser sessions could be inferred from that.

Also, how much latency does this add due to P2P connection set up time?

Hot-cache reloading the page, the P2P set up time is less than three seconds.

And yes, you could see which resources people are downloading. It isn't a replacement for a server, just a great way to avoid the Slashdot effect. Definitely not privacy-focused, but I could see some smaller websites using it.

So you have to load the page from the server in order to load the javascript so you can load the same page from the client pool?

Is this supposed to be more of a bootstrap process? Load the first page from the server and then you can use the URL hash to request additional pages from the connected clients?

I think you load the (hopefully small) HTML page from the server, but then you load images, videos, and other large assets from peers.
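Under that model, the bootstrap page would need to verify what peers hand it before injecting anything. A hypothetical sketch (the function and its parameters are illustrative, not CacheP2P's actual API) of gating peer-delivered bytes on their expected BitTorrent-style SHA-1:

```javascript
// Hypothetical: accept content from a peer only if its SHA-1 matches
// the hash the server-delivered bootstrap page told us to expect.
async function acceptFromPeer(bytes, expectedSha1Hex) {
  const digest = await crypto.subtle.digest('SHA-1', bytes);
  const hex = [...new Uint8Array(digest)]
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
  return hex === expectedSha1Hex;  // reject anything a peer tampered with
}
```

Since the expected hashes ride in on the first page load from the real server, peers can only serve you bytes the publisher already committed to — which is also why the SHA-1 concerns upthread matter.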

I'm getting this error on your API page:

  cachep2p.min.js:9 Uncaught Error: Cannot add duplicate torrent 3cd9bdf4916f422aa77cdd1b952c14202e0eafca
I'm also getting these on FF:

  ICE failed, see about:webrtc for more details(unknown) 

  TypeError: asm.js type error: expecting argument type declaration for 'e' of the form 'arg = arg|0' or 'arg = +arg' or 'arg = fround(arg)'

Same thing in chrome, and the performance of the page is pretty horrible. It's frozen the whole browser a couple of times forcing me to close the window.

Same here. Chrome Version 53.0.2785.143

The whole browser becomes almost unresponsive.

Interestingly, I have no issues on mobile Chrome.

Same here.

I'm not getting any of these errors and the page is very fast.

Errors like these are better reported on GitHub than here, so we can fix them ASAP: https://github.com/guerrerocarlos/cachep2p/issues

No problem for me with Chromium on Linux. Page is fast.

I'm getting the first error on Chrome; the page loads, but I can see that error in the console.

My Chrome 53.0.2785.143 froze within 5 seconds of opening cachep2p.com.

This has made me rethink a lot of things. I think video on the web should probably use the webtorrent approach primarily. Thanks for posting... gonna go work on some interesting approaches now.

I don't think BitTorrent is particularly suited for small-file caching, and that's how a lot of video is served these days (think about how adaptive-bitrate streaming works: a video is split into thousands of small files, each containing a few seconds of video at a certain bitrate).

The number of files is not an issue; the only potential issue is the total size of the content (and not in the way you'd think: the bigger a torrent, the better). It is perfectly doable to create a single torrent for the full video with all known bitrates, and have the client progressively load data as needed, switching between bitrates on the fly.

I believe that bittorrent works by splitting a single file into thousands of small files, so this may, in fact, be ideal?

OP's point, I think, was that you couldn't switch rates mid-stream.

BitTorrent doesn't split content into files; it splits it into pieces and chunks that are invisible to the user, so that's not something you can work with at the file level.
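That piece structure is actually what makes mid-stream seeking and bitrate switching workable: pieces are fixed-size slices of the torrent's whole payload, independent of file boundaries, so a client can prioritize exactly the byte range it needs next. A small sketch (names are illustrative) of mapping a byte range to piece indices:

```javascript
// Which pieces cover the byte range [startByte, endByte) of the payload?
// endByte is exclusive; pieceLength is the torrent's fixed piece size.
function pieceRange(startByte, endByte, pieceLength) {
  return {
    first: Math.floor(startByte / pieceLength),
    last: Math.floor((endByte - 1) / pieceLength),
  };
}
```

With 16 KiB pieces, bytes 0–32767 span pieces 0 and 1; a streaming client fetches those first rather than downloading in rarest-first order.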

Seems nice, like a hybrid between ZeroNet and WebTorrent, which it probably is given that the author is the wonderful Feross Aboukhadijeh, the creator of WebTorrent.

I like it, but doesn't this inherently expose all currently connected users' IP addresses to all other users?

In other words, if I connect to the webpage and just monitor the swarm, can I log all IP addresses that accessed that page?

Theoretically, yes, though it would depend on the size of the swarm and how many peers you connect to. Tor doesn't support WebRTC either.

Great work. Looking through the documentation, I realized that this uses a modified version of WebTorrent in order to associate torrents with URLs rather than a page's content. I wonder whether that would still be necessary if BEP 46 were implemented in WebTorrent (https://github.com/feross/webtorrent/issues/886)?

Would it be possible to make a completely serverless webpage? Or a website that has a server to bootstrap, but all assets are 100% distributed?

I remember something like this appearing here: http://ephemeralp2p.durazo.us/


Can I use qbittorrent/deluge to seed webpages?

libtorrent doesn't support WebRTC yet; Vuze and the WebTorrent desktop client do.

I just wrote a simple WordPress plugin which uses http://cachep2p.com's new CacheP2P technology: http://keplabs.me/cachep2p.zip. For now the plugin simply inserts the necessary files in the footer of your WordPress website (I'll update it to automate the rest of the functionality so no manual work is required). Cheers!

makes me think of https://ipfs.io/

I wonder how this could be set up to work on error pages; could this keep pages working even if the server returned a 503? Such a concept would be pretty useful for a media company, like the one I work for, which mostly serves static content.

It seems it already does: "Now any browser can act as a server to other browsers, so content can now be delivered even if the main server is completely down, just by getting it from the users who already received it."

Certainly; it could be set up on error pages, so that the content gets retrieved from users who already received it instead.

I'd recommend embedding everything (all the HTML and JS) in the error page, so that it works standalone.

Sounds cool, but what are the use-cases? What problems is this tech trying to solve?

I have the same issue — can't think of a single one. Most of the ones I could come up with are solved better with a CDN or other cloud services.

Well, should all DNS servers suddenly die it might be useful, but besides that...

But, yeah, still kind of cool.

CDNs and cloud services cost money to setup and maintain.

This scales asset availability with popularity, automatically, for free, with minimal setup.

You summed it up brilliantly. I took the liberty of quoting you on the landing page, hope you don't mind.

Happy to help. Thanks for working on this awesome project!

Slashdot/Reddit hug of death? Infinitely scalable content delivery on the backs of your site visitors' upload bandwidth.

Would this help against DDoS attacks on a web site?

Not if the content is dynamic.

Saving some bandwidth costs?

You mean transfer this cost to site visitors?

Same as previous sites that demo web P2P: the site crashes my browser. I tried opening the API page; it rendered fine, but then the entire browser froze. P2P pages are a long way from production-ready.

Is this like IPFS, without the need for an off-browser IPFS daemon?

> The way internet works currently is not scalable

News to me.

The way content delivery via HTTP works isn't very scalable. The fact that companies like Akamai and Cloudflare exist is a testament to this.

lol all i read is "distributed xss attacks"
