This project, and many others, would really benefit from a standardized crypto JavaScript API in browsers; it would open the door to better hashing algorithms than are realistic to implement in pure JavaScript.
I'm not saying it's realistic to engineer SHA-1 collisions to serve up malicious content on a platform like this, but it gets closer every year.
The platform my company is working on is similar to CacheP2P. We wrote our own custom PKI implementation on top of the WebCrypto API that doesn't rely on SHA-1 for hashing.
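For anyone curious, here's a minimal sketch of what signing on top of WebCrypto can look like. This illustrates the general approach only, not our actual implementation; the `demo` function and message are made up:

```javascript
// Sketch: sign and verify a message with ECDSA P-256 + SHA-256 via WebCrypto.
// (Illustrative only; a real PKI also needs key storage, identity binding, etc.)
async function demo() {
  const keys = await crypto.subtle.generateKey(
    { name: 'ECDSA', namedCurve: 'P-256' },
    true,                 // extractable, so the public key can be exported/shared
    ['sign', 'verify']
  );
  const data = new TextEncoder().encode('hello, peer');
  const sig = await crypto.subtle.sign(
    { name: 'ECDSA', hash: 'SHA-256' }, keys.privateKey, data
  );
  // A peer holding the public key can check the signature:
  return crypto.subtle.verify(
    { name: 'ECDSA', hash: 'SHA-256' }, keys.publicKey, sig, data
  );
}
```

Note the hash used for signing is SHA-256, so nothing in the chain depends on SHA-1.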
For fun, a few months ago I implemented BLAKE in JS. It's a state-of-the-art hash function whose core is based on djb's ChaCha stream cipher.
It was cool seeing how hashing works under the hood. I used Uint32Array for speed; it's still nowhere near native speed, but fast enough for many applications.
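The Uint32Array trick matters because JS numbers are doubles; the inner loop of a BLAKE-style hash needs genuine 32-bit wrapping arithmetic. A sketch of the idea (illustrative, not the actual BLAKE round code):

```javascript
// 32-bit left rotation and wrapping add: the core ops of ChaCha/BLAKE rounds.
function rotl32(x, n) {
  return ((x << n) | (x >>> (32 - n))) >>> 0;  // >>> 0 forces unsigned 32-bit
}

function add32(a, b) {
  return (a + b) >>> 0;                        // wrap instead of losing precision
}

// Keeping the hash state in a Uint32Array stores values as real 32-bit ints,
// which modern JS engines optimize well.
const state = new Uint32Array(4);
state[0] = 0xdeadbeef;
state[1] = add32(state[0], 0x11);
state[2] = rotl32(state[0], 8);   // 0xadbeefde
```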
In particular, SubtleCrypto [0] seems to be what you're looking for. It's in the latest versions of FF/Chrome/Edge/Safari (and according to MDN, shipped with Edge last year and has been supported by Chrome/FF/Safari for roughly 2 years* [1]).
I can't find any support for Opera / IE, unfortunately, although Opera's been based on Chromium since 2013, so presumably it's been supported there for years as well?
* Cross-referencing against browser release dates, that is.
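For a quick taste, hashing with SubtleCrypto looks something like this (a minimal sketch; `sha256Hex` is just a name I made up):

```javascript
// Sketch: SHA-256 a string with SubtleCrypto and return the hex digest.
async function sha256Hex(text) {
  const data = new TextEncoder().encode(text);
  const digest = await crypto.subtle.digest('SHA-256', data);  // ArrayBuffer
  return Array.from(new Uint8Array(digest))
    .map(b => b.toString(16).padStart(2, '0'))
    .join('');
}

// sha256Hex('hello') resolves to
// '2cf24dba5fb0a30e26e83b2ac5b9e29e1b161e5c1fa7425e73043362938b9824'
```

Everything is promise-based, which fits nicely with doing the hashing off the critical path.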
Hot-cache reloading the page, the P2P setup time is under three seconds.
And yes, you can see which resources people are downloading. It isn't a replacement for a server, just a good way to avoid the Slashdot effect. Definitely not privacy-focused, but I could see some smaller websites using it.
So you have to load the page from the server in order to load the JavaScript, so you can then load the same page from the client pool?
Is this meant as a bootstrap process? Load the first page from the server, then use the URL hash to request additional pages from the connected clients?
ICE failed, see about:webrtc for more details(unknown)
TypeError: asm.js type error: expecting argument type declaration for 'e' of the form 'arg = arg|0' or 'arg = +arg' or 'arg = fround(arg)'
Same thing in Chrome, and the page's performance is pretty terrible. It has frozen the whole browser a couple of times, forcing me to close the window.
This has made me rethink a lot of things. I think video on the web should probably use the WebTorrent approach primarily. Thanks for posting... gonna go work on some interesting approaches now.
I don't think BitTorrent is particularly suited to small-file caching, and that's how a lot of video is served these days (think about how adaptive-bitrate streaming works: a video is split into thousands of small files, each containing a few seconds of video at a certain bitrate).
The number of files is not an issue; the only potential issue is the total size of the content (and not in the way you'd think: the bigger a torrent, the better). It is perfectly doable to create a single torrent for the full video with all known bitrates and have the client progressively load data as needed, switching from bitrate to bitrate on the fly.
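As a sketch of the client side, assuming one file per bitrate inside the single torrent (the WebTorrent-style calls in the trailing comments are illustrative, not a spec), the switching logic itself is just bandwidth-based selection:

```javascript
// Pick the highest bitrate the measured bandwidth can sustain, with headroom.
// `bitrates` are in bits/sec; `measuredBps` is the client's current estimate.
function pickBitrate(bitrates, measuredBps, safety = 0.8) {
  const affordable = bitrates.filter(b => b <= measuredBps * safety);
  // Fall back to the lowest rendition if even that exceeds the estimate.
  return affordable.length ? Math.max(...affordable) : Math.min(...bitrates);
}

// Hypothetical use with a multi-bitrate torrent (one file per rendition):
//   const rate = pickBitrate(rates, bps);
//   const file = torrent.files.find(f => f.name.includes(`_${rate}`));
//   file.createReadStream({ start, end });  // fetch just the next few seconds
```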
Seems nice: like a hybrid between ZeroNet and WebTorrent, which it probably is, given that the author is the wonderful Feross Aboukhadijeh, creator of WebTorrent.
Great work. Looking through the documentation, I realized that this uses a modified version of WebTorrent in order to associate torrents with URLs rather than with a page's content. I wonder whether that would still be necessary if BEP 46 were implemented in WebTorrent (https://github.com/feross/webtorrent/issues/886)?
I just wrote a simple WordPress plugin that uses http://cachep2p.com's new CacheP2P technology --> http://keplabs.me/cachep2p.zip | For now the plugin simply inserts the necessary files in the footer of your WordPress website (I will update it to automate the rest of the functionality so no manual work is required)... Cheers!
I wonder how this could be set up to work on error pages; could it keep pages working even if the server returned a 503? Such a concept would be pretty useful for a media company, like the one I work for, which mostly serves static content.
It seems it already does: "Now any browser can act as a server to other browsers, so content can now be delivered even if the main server is completely down, just by getting it from the users who already received it."
Same as previous sites demoing web P2P: the site crashes my browser. I tried opening the API page; it rendered fine, but then the entire browser froze. P2P pages are still a long way from production-ready.