Ask HN: What is your preferred method of sending large files over the internet?
57 points by jedimastert 11 days ago | 57 comments
If you have, say, a several GB file (too large for something like email or standard Dropbox, but not big enough to warrant a specialized solution) and need to send it to an individual or a small group of people, without being able to physically hand them a thumb drive, how would you personally go about doing so?

This seems like it should be a solved problem in this day and age, but I've yet to find a really good solution.






First you base64 encode the file so you can print it. Then to enable multiple recipients you can fax it. But I prefer to put the paper file into a glass bottle and throw it in the ocean.

If you're open to suggestions I think using rfc1149 would be more efficient than the bottles.

Haha! Excellent, I had no idea there was an RFC for that!

this needs to go up

I just put these on my server in a directory with a random name, like https://mydomain.tld/files/$(pwgen -s 22 1)/file.ext. Works very well.
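For anyone wanting to copy this approach, a minimal sketch of the server side (assuming the web root lives at a hypothetical /var/www/files and pwgen is installed; adjust paths to taste):

  DIR=/var/www/files/$(pwgen -s 22 1)     # random, unguessable directory name
  mkdir -p "$DIR" && cp file.ext "$DIR"/
  echo "https://mydomain.tld/files/$(basename "$DIR")/file.ext"   # link to hand out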

The problem is worse when they have to send something back to me. I considered running a simple web app that allows you to upload a file if you know the secret URL, like above, but never got around to actually building it. Someone probably did, though.


I did. I wrote a small ASP.NET Core based application called IFS [1] which works on both Windows and Linux. It allows you to upload files yourself (via a password-protected login) or create upload links which you can share with your users so they can upload files. Downloads and uploads can be set to expire, and the file size limit is whatever an upstream reverse proxy (if any) allows. It's simple click-and-download (or click-and-upload) and requires no dependencies other than JavaScript support.

https://github.com/Sebazzz/IFS


The dependencies here are killer. A quick walk-through of setup for popular platforms would go a long way!

DL (http://www.thregr.org/~wavexx/software/dl/) lets you upload files and send links out, or generate links ("tickets") that others can use to upload files to you.

This is what I do.

I have a local VM at home running a web server that can hold such files, but in AU decent uplink speeds are mostly a fantasy, so in the case OP described of needing to send to multiple people I'd upload it first to a VPS I keep that serves a similar purpose. This way I only flood my uplink once.

Obfuscated path for some very basic privacy, though of course you can encrypt and email the password if that works better.

If it's going to happen semi-regularly, and the files are large and there are multiple recipients, then I'd look at BitTorrent. Most people have a client installed, or can install one fairly painlessly. The biggest challenge there would be that the usual BitTorrent ports (6881 etc.) are often blocked, or at least monitored, on corporate firewalls.



You can also check out jirafeau, quite light and easy to install: https://gitlab.com/mojo42/Jirafeau and demo site at https://jirafeau.net

Would be nice if there were a tiny web service that could set up peer-to-peer file transfer through WebRTC by making an obscure path available at the push of a button. Not sure how secure that would be, though.

https://wetransfer.com

I'm often sending to non-technical people, and I've found this is the easiest solution for them to understand: click and download.

Would use Google Drive but it makes the download process rather complicated and non-obvious.


WeTransfer is a cancer. In my organisation I keep getting called because links have expired before people downloaded the files. They also use it for sensitive information (links are sent through clear-text mail).

We have a NAS for local file transfer, but the convenience of WeTransfer trumps it.


How did they manage to set up a NAS that is less convenient than WeTransfer?

Well. Users don't actually know what a file is. They can click on buttons in webpages but crawling their way through networked drives is too much.

Resilio Sync if they are willing to install software.

And Mozilla Send is also nice... https://send.firefox.com


Firefox Send: Private, Encrypted File Sharing | https://news.ycombinator.com/item?id=15448996 (Oct 2017, 258 comments)

transfer.sh [0] will handle files up to 10GB but it is not p2p, which can either be good or bad. Files are purged from the server after 14 days.

Someone here mentioned ShareDrop [1] recently which looks really nice, although I haven't used it yet.

[0] https://transfer.sh/

[1] https://www.sharedrop.io/
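For transfer.sh [0], if you have curl the upload is basically a one-liner (roughly, going from memory of their docs):

  curl --upload-file ./bigfile.tar.gz https://transfer.sh/bigfile.tar.gz
  # prints back a shareable download URL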


Resilio Sync (formerly Bittorrent Sync) can be used to serve files from your own machine: https://www.resilio.com

SyncThing (https://syncthing.net) is an open source alternative too.


Glad to see btsync renamed since it wasn't open source

https://wetransfer.com is reliable for files that just need to be transferred - has an upload limit of 2GB per transfer in the free tier.

Magic Wormhole. It allows two users to send each other a file, and they find each other via a short, English language passphrase (think “horse battery staple”).

It's a command-line Python utility, so it might not be the best solution for completely non-technical people, but I love it.

https://github.com/warner/magic-wormhole https://techcrunch.com/2017/06/27/magic-wormhole-is-a-clever...
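Basic usage is roughly this (the code shown is just an example; wormhole prints a fresh one each time):

  pip install magic-wormhole
  wormhole send ./bigfile.tar.gz          # prints a code like "7-crossover-clockwork"
  # on the other machine:
  wormhole receive 7-crossover-clockwork  # fetches the file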


Yes! I absolutely love magic wormhole!!!

rsync is far from the most efficient solution out there nowadays, but it cannot be denied that it works.

It can run over ssh ('rsync ... user@host:/path/to/file' will pass the 'user@host' bit to ssh and then use the resulting link) or over the rsync protocol (which is unencrypted but a fair bit faster).

What I do to move (not copy) data around my LAN and know that the file has transferred okay is to

  rsync -P -a -z file rsync://server/module/path
and then after this has completed

  rsync -P -a -z file rsync://server/module/path -c --remove-source-files
The first just dumbly copies the data; the second uses checksum diffing to copy the failed bits.

If you have a pathologically bad network, you could use 'rsync ..... -c' and only when rsync says there were no differences use --remove-source-files. (You can get transfer statistics with the following extra commandline parameters.)

I also add

  -v --info=all --debug=all
onto the end of my commands, because I like the extra info this spits out. Protip: DO NOT use -vv (i.e. verbosity level 2) with info/debug=all; it will tell you about every single hash-block match :D

For large files the comparison will take some time because rsync uses a fairly dumb design and a very slow checksum algorithm (there are faster alternatives with better (lower) collision rates out there). But rsync is like perl, it will almost always exist where you need it.

I don't consider rsync a particularly modern tool, and it's given me a lot of ideas on how to write a better implementation at some point. But it's a ton easier than sha1summing hundred-gigabyte tar files, which is what I used to do... >.<


There are several https://www.mailbigfile.com/ -like freemium services. They upload the file and send the recipient a link that expires in 7 days.

https://send.firefox.com/ allows 1GB and expires after 24h.


Syncthing (https://syncthing.net/) works well for tech guys. It's OSS and bidirectional.

My vote goes to Syncthing too: simple single-binary install, and it works with no configuration. I've installed it on my desktop, VPS, laptop, and phone, and I use it instead of Dropbox now.

I've even gotten two non-techy friends to install it (albeit by talking them through it) and it still works to this day.


You left off one fact that is very important in picking an answer:

Does the large file need to remain private (i.e., readable/usable only by the individual or small group)?


BitTorrent is not a bad option...

Ours is a commercial service, but we built https://www.bigfilebox.com to solve this problem 10 years ago and we're still going strong. Our main market is architects and engineers who need to send files like this every day, to specific groups of people, and be sure only those people can see the files sent.

https://instant.io/ is an easy P2P option in the browser (WebTorrent over WebRTC).

I went to use this last week and discovered a major limitation: the only CLI client I found, webtorrent-hybrid, requires an X server to be running.

facepalm

https://reep.io/ seems similar

https://mega.nz

End-to-end encrypted and does private and public sharing.

50 GiB free per account


Burn it on a DVD or Blu Ray, mail it to them.

If they're here in London I generally try to meet them in person with a USB device, which is usually the most pleasant option.

Otherwise I usually ssh or rsync the files to my webserver. Sometimes this seems horrifically slow though, for reasons I've never worked out. Maybe some kind of throttling somewhere.


https://ge.tt is an underrated service IMO. Very simple and to the point. A core feature is that your recipient can start downloading as you upload it. I never understood their business model though, it appears they don't have one.

If sending to someone, I just symlink it into the webserver dir on my workstation, in a non-indexable directory with a non-trivial name.

If receiving, I create a shell account on my workstation for them, and have them scp it. If they're non-technical, I tell them to download a GUI SCP client for their platform.
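For the technical ones, the command I send them is just plain scp, something like (account and host names are placeholders):

  scp ./bigfile.iso guest@workstation.example.com:/home/guest/incoming/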


https://www.file.io/ is the simplest, most convenient way to do this.

They also have an API: https://www.file.io/#one

They limit you to 5GB for the free version.


Upload to S3, generate presigned URL, distribute presigned URL (with optional expiration)
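With the AWS CLI that's roughly two commands (bucket and key names are placeholders; --expires-in is in seconds):

  aws s3 cp ./bigfile.tar.gz s3://my-bucket/bigfile.tar.gz
  aws s3 presign s3://my-bucket/bigfile.tar.gz --expires-in 604800   # link valid for 7 days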

Amazon S3, or one of the API-compatible alternatives.

Most desktop FTP clients support the S3 protocol, so it’s convenient: you can just treat it like an FTP server with infinite disk space and file-level control over which files are visible via HTTPS.


Mail Drop works well up to 5 GB: https://support.apple.com/kb/ph2629?locale=en_US. You need to be on a Mac though.

mega (Kim Dotcom's successor to Megaupload) would be good for you. It's like Dropbox, but encrypted IIRC, and gives you 50 gigs right off the bat. Only issue is speed, but it shouldn't be too bad.

Note that Kim Dotcom publicly outed MEGA as an untrustworthy platform after its purchase by an unnamed Chinese buyer.

I've also read observations that the encryption wasn't up to scratch.

Totally use it for random file sharing, sure, but use your own end-to-end encryption if you really want it, and if you think it's necessary add your own anonymization into the mix.


I have minio running on a Raspberry Pi at home and expose it via a port forward.

It has built-in support for generating links that automatically expire.
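If memory serves, the minio client can generate those expiring links from the command line too, something like (alias and bucket names are placeholders):

  mc share download --expire 48h mypi/shared/bigfile.tar.gz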


Google Drive for public, Dropbox for private, Keybase for self.

If it's really huge, you could use BitTorrent. Or you could rent a VPS and host it.


something like file.pizza is "alright"

I upload to S3, which gives me the option of using either a public or a signed URL to share it

Google Drive/One Drive

Worst comes to worst, a server on Azure or EC2.

Encrypted archive with a modified SOF is OK for me.


FedEx.

Probably overkill for GBs on a thumb drive, but mailing SSDs is convenient for TB+ data sets.


Serve it with nginx; as a bonus, you get logs to see who accessed it.
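A minimal sketch of the kind of config this needs (paths and names are just examples):

  server {
      listen 80;
      server_name files.example.com;
      access_log /var/log/nginx/files.log;   # this is where you see who grabbed what
      location /files/ {
          alias /srv/files/;                 # drop files into hard-to-guess subdirs here
          autoindex off;                     # no directory listings
      }
  }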

In this vein (and the current pricing kerfuffle notwithstanding), Caddy is great.

Just cd into the directory and "caddy", no config needed.



scp or rsync if ssh is possible, otherwise bittorrent.

I did this.

I had to send a 10 GB file to my friend once. I sent it via torrent.

https://lifehacker.com/5534190/how-to-share-your-own-files-u...

But I encrypted the file and sent the decryption password via email.
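The encryption step can be as simple as a symmetric gpg pass (a sketch, not necessarily the exact commands I used; the passphrase goes in the separate email):

  gpg --symmetric --cipher-algo AES256 bigfile.tar    # produces bigfile.tar.gpg, prompts for a passphrase
  gpg --decrypt bigfile.tar.gpg > bigfile.tar         # recipient decrypts with the emailed passphrase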




