* https://wormhole.app/ (my recent fave, by creator of WebTorrent, holds for 24h, https://instant.io by same)
* https://file.pizza/ (p2p, nothing stored)
* https://webwormhole.io/ (same, but has a cli)
* https://www.sharedrop.io/ (same, does qr codes)
* https://justbeamit.com/ (same, expires in 10 minutes)
* https://send.vis.ee (hosted version of this code)
* https://send.tresorit.com/ (not p2p, 5 GB limit, encrypted)
I track these tools here: https://href.cool/Web/Participate/
- Intel integrated graphics: 60% reduction
- AMD Radeon: 40% reduction
- Apple M1: 10% reduction
What was the change? We removed "opacity: 85%" from the <canvas> element. We were using opacity to slightly darken the animation but now we just darken the texture image directly.
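In other words, roughly this change (a sketch; the bare `canvas` selector is my assumption about how the rule was written):

```css
/* Before: the browser composites the whole canvas at partial opacity,
   an extra full-screen blend pass on the GPU every frame */
canvas { opacity: 85%; }

/* After: fully opaque canvas; the darkening is baked into the
   texture image itself, so no per-frame blend is needed */
canvas { opacity: 1; }
```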
GPU utilization when window is full screen, 4K resolution:
- Radeon: 70% -> 15%
- M1: 65% -> 29%
- Radeon: 60% -> 8%
- M1: 55% -> 23%
- 15% Radeon + 75% Intel -> 8% Radeon + 8% Intel
- M1: 75% -> 35%
So we reduce the frame-rate when the wormhole is not warping (render every 3rd frame). We also lower the resolution and scale it up.
I really don't get the reasoning. It looks kind of cool, but it makes it super unusable for a bunch of use cases from very old computers to interactive sessions on raspberry pis to constrained VMs. And those are exactly the kind of places where I want friendly easy tools to copy files across for quick system admin or to get logs back out! Doesn't seem like a good tradeoff.
rclone serve http ./dir/or.file --addr :9000
Files can be grabbed with curl/wget etc., or a browser

What OS, browser, and connection type/speed did you use?
I don't mind for the upload to be slow, I just let it run, but getting to the page should not be.
I mean, this galaxy.jpg file is a third of your webpack JS bundle size!
It has no added value for me. I get the cool factor, I know it's pretty, but still.
I remember trying many of those services, and I decided to use this one because I could send large files without any problem (I was trying to move sqlite dbs that were several GBs, and it seemed to stream the file instead of trying to store it in RAM first). Now I see wormhole.app allows up to 10GB; I don't remember this one having any limit.
WebRTC services seem to have problems getting up to speed, but for streaming files between devices they seem the best solution in terms of friction.
The symptom seems to be that the SCTP data rate drops with increasing latency (which used to be a problem with very old TCP implementations too, but all modern ones handle high-latency networks much better).
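The back-of-the-envelope reason a fixed window hurts at high latency: a window-limited transport can move at most one window of data per round trip. A sketch (the 1 MiB window and 200 ms RTT are illustrative assumptions, not SCTP's actual defaults):

```shell
# throughput <= window / RTT for a window-limited transport
window=$((1024 * 1024))                       # assumed 1 MiB flow-control window
rtt_ms=200                                    # assumed 200 ms round trip
echo "$(( window * 1000 / rtt_ms )) bytes/s"  # ~5 MiB/s ceiling at this RTT
```

Doubling the RTT halves that ceiling, which matches the "data rate drops with increasing latency" symptom; modern TCP avoids this by growing its windows.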
I have gotten feedback about the performance of Pion's SCTP implementation as well. It is a hard problem. The number of people who care about SCTP performance + WebRTC and are able to work on it is very small.
If anyone is interested in more about this, it is a fun issue to read.
libwebrtc is also planning to stop using its current SCTP implementation soon. That would mean all browsers (Chrome, Safari, Firefox) will be on something new. Could be good, no idea yet. The ongoing work is here
Curious to see how the new implementation will play out for the browsers!
I'm currently trying to make it interoperable with https://share.ipfs.io/#/ which resembles the functionality of the posted tool.
How frequently do you validate that they are still functional?
I tried File Pizza several months ago, and neither I nor the recipient could get it to work.
Feel free to ask any questions.
Want to try it out? I've a public instance at: https://send.vis.ee/
Other instances: https://github.com/timvisee/send-instances/
A docker-compose template: https://github.com/timvisee/send-docker-compose
One thing I was wondering is if/how expired files are cleaned up. I uploaded a large file, set it to expire after 5 minutes, and although I can't download it anymore I see that it's still in the files directory on my server.
I glanced through the code, but I didn't see any mechanism for periodically purging expired files or anything like that. Is there something that I missed, or should I just set up a cron job or something to delete all files in that directory older than a week?
You're right. Expired files that don't reach their download limit are kept on the server. Due to implementation details there is no 'nice' way to do this from Send itself. If you're using S3 as storage you can configure a lifecycle policy; if using raw disk storage you can set up a cron job.
See an example here: https://github.com/timvisee/send-docker-compose/blob/master/...
All uploaded files have a prefixed number which defines the lifetime in days (e.g.: `7-abcdefg` for 7 days expiry). So you can be a little smarter with cleaning up.
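A cron-able sketch of that smarter cleanup (the function name, directory path, and file layout are my own assumptions, not part of Send):

```shell
# cleanup_expired DIR: delete uploads whose age exceeds the lifetime
# encoded in the filename prefix, e.g. "7-abcdefg" expires after 7 days.
cleanup_expired() {
  dir="$1"
  for f in "$dir"/*-*; do
    [ -e "$f" ] || continue                      # no matches -> literal glob
    days=${f##*/}; days=${days%%-*}              # "7-abcdefg" -> "7"
    case $days in ''|*[!0-9]*) continue ;; esac  # skip unexpected names
    # -mtime +N matches files last modified more than N*24h ago
    find "$f" -maxdepth 0 -mtime +"$days" -exec rm -f -- {} \;
  done
}

# e.g. from a daily cron entry: cleanup_expired /srv/send/uploads
```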
I should describe this clearly in documentation.
I'd like to understand the reasoning behind this. Thanks.
Someone asked this before, here is my answer (bottom quote): https://github.com/timvisee/send-docker-compose/issues/3#iss...
It really depends on who is hosting it.
Send itself doesn't really log anything except for errors. A reverse proxy in front of it might be used for an access log, which is the default with the docker-compose template for it. Files are always encrypted on the client, and neither the files nor their keys are ever seen by the server.
If you're wondering for the instance I've linked: it runs on Digital Ocean. I have an access log (IP per request, for 24h), I can list encrypted blobs and their metadata (creation time, size), and that's pretty much it.
I run a home server just for internal use and it might be nice to send files via a link for memes, jokes, quick one-shot uses rather than storing it on a samba share, etc, but it doesn't have a public-facing URL for confirming a LetsEncrypt certificate.
Already, if you give me a plaintext HTTP link, I'm going to have to consciously decide that's fine and click past the interstitial warning that it couldn't be upgraded to HTTPS. And if you use it to inject an image somewhere that's otherwise HTTPS, the image just counts as broken unless I go out of my way to authorise it.
For private instances could there be an option for requiring a login before upload?
In the last year, I've had 1 DMCA request. And I've blocked one IP that was uploading half a terabyte.
> For private instances could there be an option for requiring a login before upload?
Not built-in, right now. But you can easily set up HTTP Basic Auth on a reverse proxy that you put in front of it.
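For example, a minimal reverse-proxy sketch (nginx shown; the hostname, port, and paths are assumptions, and the password file would be created separately with `htpasswd`):

```nginx
server {
    listen 80;
    server_name send.example.com;

    # Require a login before anything, uploads included
    auth_basic           "Send";
    auth_basic_user_file /etc/nginx/send.htpasswd;

    location / {
        proxy_pass http://127.0.0.1:1443;  # wherever Send listens locally
        proxy_set_header Host $host;
    }
}
```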
But with infinite time, I'd:
- add some form of authentication, to limit uploads for example
- add a way to preview files on the Send page itself
- provide integrations with other platforms
- resolve outstanding issues
Maybe adding HTTP basic auth is fine, as I mainly want to keep random bots from finding the service. I'll try that, thanks!
I have not seen any bots on my public instance by the way. It has been running for more than a year.
When visiting the URL, the key never reaches the server, because the hash part of a URL is never sent; it's a local-only thing. So there's no need to strip logging. The client downloads the encrypted blob and decrypts it locally.
More info: https://www.reddit.com/r/firefox/comments/lqegb5/reminder_th...
However, it needs to be hosted somewhere.
...and if I'm going to be using a hosted service, I'd like the ability to easily pay for it (so that it doesn't eventually collapse or resort to shady things like ads), either though donations or microtransactions for bandwidth/storage.
Unfortunately, there's no good microtransaction service.
Wasn't Mozilla working on one? Where did that go?
...and thus, we've gone full circle.
And I'm typing this comment in a Chrome browser, because my company is migrating away from Firefox due to "security issues".
The https://send.vis.ee/ is mostly funded by donations right now. I do not plan to take it down, unless the cost becomes a problem. I'll never resort to ads.
If this ever happens, I'll likely show a warning beforehand. Some time later I'll disable the upload page, and will take the rest of it down the week after. Files have a maximum lifetime of a week anyway. So if you discover this when uploading, you can simply switch to some other service. Existing links should not break.
There's a donation link on the bottom of the page (https://vis.ee/donate). But feel free to use it without a contribution.
More generally, I want the ability to make microtransactions (substitute "extremely low-friction donations" if you will) for everything that could be "free" but also costs money (bandwidth, compute, storage), because no matter how much free time I have, there will always be services that I could benefit from, but are low-enough-value that it's not worth it for me to self-host or get a cloud host myself.
Quick summary: it was being used for malware and phishing, aggravated by the trustworthy-seeming firefox.com URL.
So, yes, incredibly naive.
I can imagine spam being a problem with such a service with a well recognized brand name.
I don’t know. The internet had hundreds of file sharing sites at one point. They all suffered fates similar to the epic MegaUpload, though with founders not as colorful as Kim Dotcom.
I don’t see how having them again would be different than last time?
Well it doesn't matter as much in this case because "Send" is a temporary file host.
Also, does it need to put the whole file in RAM first?
Regarding your second point: I’m actually not sure if the file is copied into memory or if the browser just keeps a reference. I haven’t tried it with large files yet.
I'm curious about "FreeMarker" being the top language so I clicked on it, surprisingly it returns zero code: https://github.com/timvisee/send/search?l=freemarker
For more info see: https://github.com/conwnet/github1s
If you look at github/linguist, that's what recognizes languages in repos. It has this rule for FreeMarker: https://github.com/github/linguist/blob/32ec19c013a7f81ffaee...
It seems a .ftl extension means FreeMarker to linguist, so those localizations show up as such.
If you run `git check-attr --all public/locales/foo/send.ftl` with the current .gitattributes file, you'll get no attributes.
If you update the attr match to `public/locales/**` or `public/locales/**/*.ftl`, then the `check-attr` command above will match it and show 'linguist-documentation'.
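A throwaway-repo sketch of that fix (directory names mirror Send's layout; the temp-repo scaffolding is just for demonstration, in Send itself you'd edit the .gitattributes at the repo root):

```shell
repo=$(mktemp -d) && cd "$repo" && git init -q
mkdir -p public/locales/en-US
touch public/locales/en-US/send.ftl

# Before: nothing matches, so linguist classifies the .ftl files by extension
git check-attr --all public/locales/en-US/send.ftl    # prints nothing

# Widen the match so localizations count as documentation instead
echo 'public/locales/**/*.ftl linguist-documentation' >> .gitattributes
git check-attr --all public/locales/en-US/send.ftl
```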
The parent repo doesn't have this issue. The "languages" list doesn't mention Freemarker on https://github.com/mozilla/send
Just curious, since I keep seeing Wormhole mentioned, but I never seem to see anyone mention Transfer (unless it's just a lesser known option and I happened to hear of it early).
When the same project is hosted by someone that I don't know, I can't be sure that they won't modify it to peek at the files (I'm not going to perform a full code audit on every page load).
Some services are 1:1 ratio. That is, uploading a file results in a download link that only works once. So that makes them rubbish for malware; you have to be spear phishing somebody, and even then it buys you less than using Tor would.
Some services are only encrypted in transit. So bad guys can't intercept or alter the data, but at rest on your server it can be scanned for malware, copyright infringement, whatever the provider wants to scan for.
Some services cost money to use which is an obstacle to bad guys who most likely want more money and not to be paying money up front first.
Firefox Send was encrypted in situ (the keys live only in clients, so the server doesn't know your keys), it was free to use, and it allowed either unlimited or very large ratios.
So that makes it potentially very attractive. On top of which, it has this nice trustworthy Firefox name. Grandma Jenny's kids have told her not to go around installing stuff from just anywhere, but they did tell her _Firefox_ is trustworthy after she got flustered when it auto-updated. How is Jenny supposed to understand that this link to Firefox Send isn't Firefox?