
Ask HN: What is your favorite method of sending large files? - mettamage
I just opened up a simple HTTP server to send someone a large file. Then I figured, I never gave this question proper thought.

But some of you have, and I figured they make for fun and interesting stories ;-)

So what's your favorite method to send large files, of at least 5GB or bigger? Though, I'm also curious on how you'd send 10TB or more.
======
livueta
Bittorrent. No, really. Lots of nice behaviors when transferring large amounts
of data between arbitrary endpoints.

Transferring the torrent metadata is pretty trivial and can be done via a wide
range of methods, and having that flexibility can be nice.

Unlike HTTP, you get reasonable retry behavior on network hiccups. Also, more
robust data integrity guarantees, though a manual hash test is probably a good
idea either way.

At least among people I'm throwing TBs of data around with, torrent infra is
common and it's nice to not have to deal with some special-purpose tool that,
in practice, is probably a pain in the ass to get compatible versions deployed
across a range of OSes. Basically every platform known to man can run a
torrent client of some sort.

And obviously, no dependency on an intermediary. This is good if you're trying
to avoid Google et al. That does, however, bring a potential con: if my side
is fast and your side is slow, I'm seeding until you're done. If I'm uploading
something to gdrive or whatever, I can disconnect once the upload is done. If
you control an intermediary like a seedbox, that's less of a problem.

In general, though, torrents are pretty great for this sort of thing. Just
encrypt your stuff beforehand.
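
For reference, a minimal sketch of that workflow, assuming mktorrent is
installed (the tracker URL and filenames are placeholders):

    # encrypt first, then build a torrent from the ciphertext
    gpg --symmetric big-dataset.tar        # writes big-dataset.tar.gpg
    mktorrent -p -a udp://tracker.example.org:6969/announce \
        -o big-dataset.torrent big-dataset.tar.gpg

The -p flag sets the private bit, which keeps compliant clients off the DHT.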

~~~
arsome
Just be careful - unless you mark that torrent private it'll get posted to the
DHT and crawlers like BtDig will pick it up and list it publicly.

For this reason I prefer using something like Syncthing which is designed more
with this purpose in mind.

~~~
jivank
I am working on a tool that will help with this. It is built with AlpineJS,
Nim, Aria2 and Webview, resulting in a 5MB download (not counting the torrent
file, as that varies). The idea is that it is a one-click solution, as the
torrent is embedded in the binary.

The inspiration for this tool is to assist with LAN parties. The one thing I
have against the private flag is that it also disables LAN peer discovery,
which would be okay if you use a tracker inside the LAN (though the torrent
would need to be modified if the hostname/IP changes). Since you can't
configure other people's clients, I found it simpler to use Aria2,
preconfigured to disable PEX and DHT, as sketched below.
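
A sketch of the receiving side, going by aria2's documented options (the
torrent name is illustrative):

    aria2c --enable-dht=false --enable-dht6=false \
           --enable-peer-exchange=false \
           --bt-enable-lpd=true \
           lanparty.torrent

--bt-enable-lpd turns local peer discovery on, so peers still find each other
on the LAN without a tracker.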

Combine this with Metalink or Web Seeds, so that you can have an initial
seeder based on HTTP. I think IPFS would make a great web seed, as long as
their gateway and Cloudflare's continue to stay up. IPFS creates a permanent
URL, if you will, so no need to worry about dynamic IPs or domains. It would
be great if IPFS had a smaller binary (right now it is around 20MB
compressed) and had a way to get a file and "seed" it like a torrent. But for
now I think torrent is mature enough.

~~~
brink
Why not use Syncthing or Resilio Sync for LAN parties?

~~~
jivank
I have considered it. The one thing about Syncthing is that there is no true
readonly way to sync. For example, someone may accidentally extract an archive
in the Syncthing folder and it will sync everywhere. If Syncthing gets this
feature I think I would be 100% onboard with it for LAN parties.

Resilio I believe supports it but I would prefer an opensource alternative.

[https://github.com/syncthing/syncthing/issues/62](https://github.com/syncthing/syncthing/issues/62)

~~~
csdreamer7
You can use the send only feature in Syncthing to make your folder read only.

[https://docs.syncthing.net/users/foldertypes.html#folder-
sen...](https://docs.syncthing.net/users/foldertypes.html#folder-sendonly)

~~~
jivank
Sure, I can have my computer be send-only, for example. But unless everyone
else configures their side properly as receive-only, they can start pushing
data to other clients. It would be far easier to have Syncthing create a
readonly share similar to Librevault and Resilio.

------
geocrasher
5GB on my local network: Windows File Share, Rsync, or HTTP depending on
source/destination

5GB on Internet: Upload to my OVH VPS and HTTP or Rsync it to its destination

10TB, local or Internet: Physical Hard Drive.

Never underestimate the bandwidth of a station wagon full of backup tapes!

[https://www.tidbitsfortechs.com/2013/09/never-
underestimate-...](https://www.tidbitsfortechs.com/2013/09/never-
underestimate-the-bandwidth-of-a-station-wagon-filled-with-backup-tapes/)

And since you can now buy 1TB micro SD cards, perhaps I'd split the file 11
ways (no way it'll fit exactly) and send them via carrier pigeon. Or heck, I
could just tape them to a drone and hope they aren't damaged in the crash.
There's lots of ways to move data around. Maybe you want to UUENCODE it and
submit bad URLs to a server's log so that it can be exfiltrated later? It
would probably take a very, very long time, but could be done. I call it
"CURLyTP".

[https://miscdotgeek.com/curlytp-every-web-server-is-a-
dead-d...](https://miscdotgeek.com/curlytp-every-web-server-is-a-dead-drop/)

~~~
terramex
_10TB, local or Internet: Physical Hard Drive.

Never underestimate the bandwidth of a station wagon full of backup tapes!_

In Poland there is an option to send a package via train for $8. Just go to
the station and give the package to the conductor/train guard; a few hours
later the receiver can pick it up at the destination station. Sending HDDs
full of raw video footage this way is very popular among video editors here.

~~~
jpxw
That’s a really cool idea actually

~~~
StavrosK
Same thing happens in Greece with buses. It's a very popular way for moms to
send home-made food to their single/university student sons.

~~~
immigrantsheep
In Croatia as well :)

~~~
toastal
In Laos the (mini)buses were the most reliable way to ship anything. We had 3
motorbikes strapped to the top, and I sat cattycorner to a caged chicken on my
trek through the north in a 15-seater minibus.

------
mullen
If the data is below 50G in size and on my personal computer, then I just drop
it into the Google Drive folder and it syncs to Google overnight. When it is
done, I export it and then send a link to the person(s). I pay $2 a month for
the 100G account and I usually have about 50G of disk space unused, so this is
not an issue for me.

If it is above 50G and on my personal computer, I encrypt the data and then
physically mail a USB stick with the data on it. Trying to arrange the
downloading/uploading of 50G of data from my personal computer to another
personal computer is a real pain. The people I send/receive that much
personal data to/from are usually people who don't know much about ftp, scp,
or easily sharing files over the Internet. Sending a USB stick is just so
much easier and, in many cases, faster. I make sure the recipients know how
to decrypt the data before sending it.

If it is on a server (for example, a side project instance in AWS), then I
drop it into an S3 bucket, export it and send the URL to the recipient. I
just eat the cost of them downloading it. Usually I am making money off the
work anyway, so it is the cost of doing business.

~~~
lowwave
>If it is on a server (for example, a side project instance in AWS), then I
drop it into an S3 bucket, export it and send the URL to the recipient. I
just eat the cost of them downloading it. Usually I am making money off the
work anyway, so it is the cost of doing business.

Consider BunnyCDN; it's way cheaper than S3.

~~~
StavrosK
BunnyCDN is a CloudFront alternative, not an S3 alternative.

~~~
farmdawgnation
They also offer an object storage product, it seems:
[https://bunnycdn.com/solutions/cdn-cloud-
storage](https://bunnycdn.com/solutions/cdn-cloud-storage)

------
kvn_95
Magic Wormhole ([https://magic-
wormhole.readthedocs.io/en/latest/](https://magic-
wormhole.readthedocs.io/en/latest/)) if it's between OSes.

Between Macs (OSX), AirDrop works very well. I have sent >10GB files between
Macs, quite quickly as well.

I have never sent a 10TB file, so I wouldn't know. None of my drives are that
large yet :)

~~~
theobeers
Magic Wormhole is good. These days I use croc,[0] which I find even better.

[0]: [https://github.com/schollz/croc](https://github.com/schollz/croc)
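
For anyone who hasn't tried it, croc's basic flow looks like this (the code
phrase below is made up; croc generates one per transfer):

    # sender: prints a code phrase to pass along out-of-band
    croc send big-video.mkv

    # receiver: runs croc with that code phrase
    croc 1234-foo-bar-baz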

~~~
spurgu
I don't see an option to send text (I browsed through the README and Issues);
is this not possible with croc? With Magic Wormhole, the default is to get a
message prompt when running "wormhole send". Or you can do it as a one-liner
with "wormhole send --text 'something'". Wormhole William can do this as
well, IIRC.

I use MWH often for sending stuff like URLs, passwords, API keys... having to
create a file for that is quite annoying (and remembering to delete it
afterwards).

"brew install magic-wormhole" or "sudo apt-get install magic-wormhole" is easy
enough and I don't hang around with people who use Windows. ;)

Resumable file transfers sound like a great feature though. Not sure it's
implementable with MWH the way it now works.

~~~
qrv3w
Just added it now. Send with croc send --text "hello, world" [1]

[1]:
[https://github.com/schollz/croc/releases/tag/v8.2.0](https://github.com/schollz/croc/releases/tag/v8.2.0)

~~~
spurgu
Nice! I just tried it out now. I also like the fact that "croc" defaults to
receive, where with MWH you need to type "wormhole receive". Saves a lot of
keystrokes.

Tab completion is something I'll miss from MWH (I'm already converted!) but I
can live with it, and my non-technical friends aren't even aware of the
concept.

------
ashton314
Netcat:

    $ nc -l 4242 > dest

And then on the sending end:

    $ nc hostname 4242 < file_to_send

This works great when you just need to get a file of any size from one machine
to another and you’re both on the same network. I used this a lot at one of my
offices to schlep files around between a few of the machines we had.

~~~
unilynx
This is especially fun when combined with piping through tar, and adding pv
to the mix for a transfer-speed "progress" bar.

It's the fastest way to transfer a collection of files on a local network,
and it doesn't require temporary storage for the archive.
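
A minimal sketch of that pipeline, reusing the port convention from the
comment above (start the receiver first):

    # receiver: listen, meter throughput with pv, unpack on the fly
    nc -l 4242 | pv | tar xf -

    # sender: stream the directory straight into the socket
    tar cf - some_directory | nc receiver-host 4242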

~~~
enriquto
piping through tar is my "power move" when I want to steer a new recruit. I
just do it casually in front of them... depending on whether they are awed or
horrified by that, they'll get more hacky or more formal work.

~~~
parliament32
You can also pipe through gpg if you want encryption along the way!

~~~
ahsima1
And through gzip/zstd/etc if you want compression!

~~~
cellularmitosis
lzop if you gotta go fast :)

~~~
namibj
zstd easily outruns a gigabit link on most somewhat-compressible data.
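
As a sketch, zstd slots into the same netcat pipe (and gpg -c could be
inserted the same way for encryption):

    # receiver: decompress and unpack as the bytes arrive
    nc -l 4242 | zstd -d | tar xf -

    # sender: -T0 compresses on all cores to keep the link saturated
    tar cf - some_directory | zstd -T0 | nc receiver-host 4242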

------
BrandonM
I remember using AIM (AOL (America Online) Instant Messenger) and other
instant messaging applications for direct peer-to-peer file transfer back in
2002. On fast university connections, you could transfer movies in a few
minutes. It's crazy to me that direct, fast file sharing was easier and more
ubiquitous almost 20 years ago than it seems to be now. More context for my
oft-misunderstood early feedback to dhouston about Dropbox.

------
jve
Private Nextcloud. Or some free-tier OneDrive alternative is enough for me.

However, no-one has mentioned a super simple service:
[https://wetransfer.com/](https://wetransfer.com/) - simple as drag & drop,
enter the recipient's address, SEND. Pretty simple if you want a non-techie
to send you something.

~~~
meigwilym
+1 for WeTransfer.

A dead simple UI and a link is nicely emailed to the receiver.

~~~
spodek
Also emails the sender when the receiver downloads it.

------
sillysaurusx
Syncthing. [https://syncthing.net/](https://syncthing.net/)

It's like a private dropbox.

For files on the order of <= 10GB, magic wormhole is lovely:
[https://techcrunch.com/2017/06/27/magic-wormhole-is-a-
clever...](https://techcrunch.com/2017/06/27/magic-wormhole-is-a-clever-way-
to-send-files-easily-and-securely/)

`sudo pip3 install magic-wormhole` is an automatic "must run this command" on
all my servers. Simple, secure way of sending files from A to B, with no
bullshit. No account creation, even.

------
anderspitman
I've been working on solutions in this space for a couple years now. IMO
making data on your local device available to others via HTTP range requests
is the sweet spot between storing your data in the cloud and going full p2p.

Here are a couple of my projects:

[https://patchbay.pub/](https://patchbay.pub/)

code[0]

Sender:

    curl https://patchbay.pub/random-channel/filename.bin --data-binary @filename.bin

Receiver:

    curl -O https://patchbay.pub/random-channel/filename.bin

[https://fbrg.xyz](https://fbrg.xyz)

code[1]

This one works in the browser. You select a file and it gives you a link you
can share with others. The underlying tech[2] is more complicated (relies on
WebSockets) than patchbay, but the server is currently more reliable. I'm in
the process of improving the patchbay server to be more robust, and I think it
will eventually replace fibridge completely.

My current project is building something along the lines of Google Drive, but
much simpler and designed to be self-hosted from within your home.

[0]: [https://github.com/patchbay-pub/](https://github.com/patchbay-pub/)

[1]: [https://github.com/anderspitman/fibridge-proxy-
rs](https://github.com/anderspitman/fibridge-proxy-rs)

[2]: [https://github.com/omnistreams/omnistreams-
spec](https://github.com/omnistreams/omnistreams-spec)

------
BenjiWiebe
Sending a 10TB file on my internet connection would take 2.5 years of constant
uploading. Shipping a hard drive is cheaper and quicker.

~~~
njsubedi
8 × 10,240,000 Mbit / (2.5 × 365 × 86,400 s) ≈ 1.04 Mbit/s?

~~~
rsa25519
That's slightly faster than my Silicon Valley upload speed.

------
tbronchain
It doesn't have to be large files; file sharing sucks in general. Even
copy/paste across devices, even your own devices, is a pain. How many times
have I emailed myself links to share them across devices? And I'm not even
speaking about sharing pictures with people not very accustomed to technology
(in other words, don't try to get them to install a cloud storage app).

I gave that question a few tries, but I feel part of the problem is the market
being saturated by giants doing half the job. Another part of it is the lack
of interoperability regarding file sharing between operating systems. I mean:
a native right click > send to > user, and a native notification to the user
with a way to do a direct p2p download from that person. No software needed,
seamless integration. Why is that so hard?

I really wish the big boys would give that a try rather than giving us toy-
like features and price bumps.

~~~
StavrosK
You're gonna love magic-wormhole.

~~~
tbronchain
It's great, but we're very far from OS-agnostic, seamlessly integrated file
sharing.

You see, we have protocols like HTTP, and the web relies on it; everyone
agrees this is how the web works, and all OSes have one or several browsers
able to access it. Something similar works with email. Yet we don't have
anything to share files between computers. Imagine something like bittorrent
directly integrated into all OSes, letting you right click > send to.

~~~
nicolas314
Having a generic "Send to" would only solve a part of it though. There is
currently no standard way of connecting the two ends of the same cable to two
computers and expect them to exchange files without heavy configuration on
both sides. Wireless goes the same way.

------
akerro
>I'm also curious on how you'd send 10TB or more.

For my 12TiB of data, I use Syncthing when I need to sync it more often;
otherwise, rsync.

I have used rsync several times for billions of smaller files totalling
300GiB, but it really all depends on how I connect the nodes. I prefer
Syncthing, but when only ssh is available, rsync is good too.

Currently the largest directory synced by Syncthing (among those sharing
usage stats) is over 61384 GiB :))
[https://data.syncthing.net/](https://data.syncthing.net/)

~~~
temp007007
Is there an iOS client for Syncthing?

~~~
KindOne
No.

[https://docs.syncthing.net/users/faq.html#why-is-there-no-
io...](https://docs.syncthing.net/users/faq.html#why-is-there-no-ios-client)

------
superkuh
I have run an internet-reachable whatever.com webserver from my home desktop
computer for 20 years. I just copy or softlink the file to a place in ~/www/
and give out the corresponding web link. I have a couple of nginx locations
prepared with pre-existing bandwidth throttles, so it's a matter of soft
linking to the appropriate ~/www/dirwithassociatedrate/.

If it's for more than 1 person, I upload it to a VPS if it's small (<20 GB) or
make a torrent if it's not.

------
itake
You can create a simple python http server to expose files in a directory
over the web:

    $ python -m http.server 8000

and then you can start ngrok to expose the server:

    $ ngrok http 8000

That will give you a URL to share with whoever wants it.

~~~
anderspitman
Does ngrok limit data at all or just connections?

~~~
itake
I’ve sent a few GB over this method using their free service. Doesn’t seem to
have a transfer limit

[https://ngrok.com/pricing](https://ngrok.com/pricing)

------
GRBurst
It depends on whether I want to send it to someone on the local network or
through the internet.

For local network:

I use miniserve (
[https://github.com/svenstaro/miniserve](https://github.com/svenstaro/miniserve)
) which is just a simple http server. There are similar tools for when I want
to share it from the smartphone.

Through the internet it really varies:

Sometimes it is Firefox send (
[https://send.firefox.com/](https://send.firefox.com/) )

For photos, I use a self hosted photo solution piwigo (
[https://github.com/Piwigo/Piwigo](https://github.com/Piwigo/Piwigo) )

In earlier days it was a self-hosted nextcloud (
[https://github.com/nextcloud/server](https://github.com/nextcloud/server) )
instance. I still use it when the files get too large for Firefox Send.

I also tried the already-mentioned wormhole, but that only works with tech
people.

~~~
cprecioso
Firefox Send doesn’t work anymore

~~~
mkl
That's unfortunate. That was my go-to method.

From [https://send.firefox.com/](https://send.firefox.com/):

"Firefox Send is temporarily unavailable while we work on product
improvements.

We appreciate your patience while we make the Firefox Send experience better."

~~~
ornornor
I wonder how temporary it is because it’s been saying that for a couple months
already. I think that service is dead.

------
cpach
Related: If you need to transfer sensitive data over Bittorrent, Age is a
good tool for encrypting it before transmission.

[https://github.com/FiloSottile/age](https://github.com/FiloSottile/age)
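
A minimal sketch of passphrase-based age usage (filenames are illustrative;
age prompts for the passphrase both times):

    # encrypt before creating the torrent
    age -p -o dataset.tar.age dataset.tar

    # decrypt on the receiving side
    age -d -o dataset.tar dataset.tar.age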

~~~
parliament32
Why a new tool? How is this better than gpg symmetric encryption, considering
gpg is installed/available effectively everywhere?

Encrypt:

    gpg --symmetric file.dat
    (enter a password)

Decrypt:

    gpg --decrypt file.dat.gpg > file.dat
    (enter the password)

~~~
cpach
IMO, GPG is simply not a good tool and it should be replaced.

Others have said it better than I can. See e.g.
[https://latacora.micro.blog/2019/07/16/the-pgp-
problem.html](https://latacora.micro.blog/2019/07/16/the-pgp-problem.html)

~~~
forgotmypw17
That's said often about any tool which has been around long enough. People
without experience come around and think they can replace an old tool with a
better one, but it's usually only ignorance of either the complexity of the
task, knowledge of using the tool properly, or both.

~~~
cpach
GPG is just not a very good tool. I think 'tptacek explained it quite well in
the article I linked.

With that said, Magic Wormhole is also a very good tool for transferring
files. It will encrypt in transit. So for many files, using a separate
encryption tool is not necessary. (So far I haven’t tried it for large files.)

~~~
ColanR
What no one seems to mention is that Magic Wormhole depends on the
maintainer's server to negotiate transmission between clients. I don't like
that requirement. Better to GPG encrypt and use bittorrent for a direct
transfer. At least then we're using public trackers instead of some private
server.

~~~
gojomo
How would public BitTorrent tracker servers be better than a single
rendezvous-server run by the tool's author?

With trackers, you're revealing the fact-of-transmission, transmission-size, &
endpoints to any number of unknown remote parties. Potentially, attackers not
even on the privileged network-path from origin to destination could tee off a
copy of your encrypted data for offline analysis.

With Magic Wormhole's rendezvous-server, only one server, run by the same
person whose code you're trusting (& can audit), briefly relays encrypted
control-messages. (It might even be limited in its ability to deduce the size
of the transfer – I'm not sure.) And if that's still too much, you can run
your own rendezvous server.

It seems to me the amount of information leaked in the BT Tracker approach is
strictly (& perhaps massively) more, to more entities, than that leaked in
using the Wormhole author's server.

------
Paul-ish
I use Dropbox to send and receive files that are over the email/Slack/etc.
attachment limit. Dropbox has a cool "request files" feature that lets people
upload files directly to your Dropbox.

I've never needed to send TBs of data.

------
jedimastert
I asked the same question on HN about 3 years ago[0]. I'm curious to see how
the answers have changed and what might have stayed the same.

[0]:
[https://news.ycombinator.com/item?id=15440571](https://news.ycombinator.com/item?id=15440571)

------
LinuxBender
SFTP chroot server, with the lftp client using the mirror subsystem over
sftp. It is multi-threaded, even for a single file, and supports rsync-like
behavior even in an sftp chroot. I can max out any internet link using lftp
(client) + sftp (protocol); see the sketch below.
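
A sketch of what that looks like with lftp's documented commands (host and
paths are placeholders):

    # segmented download of a single large file over sftp, 8 connections
    lftp -e 'pget -n 8 /data/big.img; bye' sftp://user@example.com

    # or mirror a directory, rsync-style, with parallel segmented transfers
    lftp -e 'mirror --use-pget-n=8 --parallel=4 /data ./data; bye' \
        sftp://user@example.com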

------
erezsh
I'm partial to [https://mega.nz/](https://mega.nz/)

------
1ark
10TB or more, not sure. Anecdotally, I did recently have to send GBs of a
home directory locally. The fastest way, which I found out the hard way, was
mounting SMBFS and `tar`-ing into it. But of course it is unreliable, with no
resume etc.

For the thing you did, these could have worked.

[https://file.pizza](https://file.pizza)

[https://instant.io](https://instant.io)

[https://www.sharedrop.io](https://www.sharedrop.io)

------
orionblastar
When I worked as a federal contractor in 1996-1997, we used password-
protected FTP servers to store giant text files for import into a database.
Later on they moved it to a website where you log in and download the text
file. Since the file was so large, it took almost all day, and if the
transfer aborted you had to start all over again.

I had to keep Netscape open because it showed the status of the file
download. People asked why my web browser was open all day; it was for the
giant download that was part of my job.

------
e12e
Apparently not explicitly mentioned: rsync _over ssh_.

Windows now comes with a real sshd - Mac has it, and linux/bsd of course has
it.

For small files (less than 10gb?) generally _scp_ - but I'm trying to get in
the habit of using rsync. Generally scp isn't a very good idea.

For larger _filesystems_ zfs send over ssh.
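
A minimal sketch of the zfs variant (pool and dataset names are
placeholders):

    # snapshot, then stream the snapshot to a pool on the remote machine
    zfs snapshot tank/data@xfer
    zfs send tank/data@xfer | ssh user@remote zfs receive backup/data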

For streaming: dlna/upnp over zerotier vpn.

Shame the Firefox Send service imploded - for smaller files, where ssh isn't
an option, it was a nice service. But a little too much hassle to self-host.

------
motohagiography
Bike courier with checkered past, simultaneously on the run from organized
crime clans and mercenaries working for retired spies, probably has mirror
shade implants, lives in squat, punk haircut, etc.

Didn't realize there was another way.

------
wilsonnb3
USPS

------
britmob
Resilio Sync (formerly BitTorrent Sync) is my go-to for any file larger than
a few hundred megabytes.

~~~
misterbwong
Same here. I use this to deploy large files to my VPS

~~~
willcipriano
I third this suggestion, if you have to send the file to multiple parties they
can all share among themselves (it's BitTorrent after all) speeding up the
process even further.

------
colabiblen
WDT from Facebook -
[https://github.com/facebook/wdt/](https://github.com/facebook/wdt/)

We use it for copying hundreds of TiBs regularly, occasionally over a PiB.

------
gulerc
[https://Sendgb.com](https://Sendgb.com) is good way over the internet.
Possible to send large files up to 20 gb. No needed registration or sign in.

------
gnufx
Not something I've used, and it's difficult to countenance anything called
that, but what Globus (globus.org) has become seems to be quite popular for
transferring large research datasets between academic sites which subscribe.

For ssh, there's [https://www.psc.edu/index.php/hpn-
ssh](https://www.psc.edu/index.php/hpn-ssh) for improving wide-area transfer
speeds, and something else, I think.

------
Groxx
Since 5GB would take quite a long time to upload on DSL: almost universally
dropbox, simply because it resumes + it's accessible to people with a url. If
the recipient is ssh-friendly and there's a shared machine we can both access,
rsync is quite a lot faster and more controllable and doesn't make my CPU
angry for hours.

Otherwise I liked Firefox Send while it was running since I mostly trust them
to not be full of intrusive nonsense.

------
cmckn
I run FileBrowser [1] to share media with friends. I would suggest trying IPFS
if you don't want to forward a port from the internet. You'll get similar
download performance to FileBrowser once your node integrates with the network
(this takes 30 minutes or so). Check it out!

[1]:
[https://github.com/filebrowser/filebrowser](https://github.com/filebrowser/filebrowser)

------
WarOnPrivacy
My customers and I all have symmetric FTTH. When I'm copying a file from me
to me (working remotely), I use a VPN. These are usually .vhdx files, ranging
from 6GB to 300GB. I copy them here for troubleshooting, then I send them
back.

When I want to make something large available to someone else, I post it on
my local webserver. It points to a NAS on my LAN; anyone in the house can
just drop whatever on it and hand out a link.

------
codethief
Syncthing works pretty well in my experience, even across NAT. I recently
shared 10GB worth of photos with a family member and ~30min later it was done.

------
danbmil99
Worth noting that there's an expensive but very popular product from IBM
called Aspera.

I've never used it myself, but I've heard it's actually quite impressive;
the business model makes it inaccessible to individuals, though.

Apparently they do some sort of complicated trace routing and open a ton of
simultaneous routes. It probably also uses something other than TCP for
error correction.

Seems like a good place for some disruption.

------
talove
\- Airdrop for immediate stuff. Gets dicey around 30-50GB tho.

\- iCloud drive for personal stuff I need to share between devices. I trust
this to sync anything up to a TB reliably.

\- Google Drive when I need to share to someone else.

\- Occasionally external drives when I need to move data fast locally.

\- Some combination of S3 / AWS CLI / EC2 when things go beyond personal
computer capacity depending on where the data is coming from and going to.

------
banana_giraffe
For things smaller than around 10gb, I tend to use S3. Either with pre-signed
links behind a little bespoke portal that people can log into and see what
files I've "shared" with them, or just direct S3 links for those that want to
automate it all.

For bigger things, there are two basic paths:

If it's to someone who's not primarily a tech person, then the data goes on a
portable hard drive, and I either drive it to them, or mail it and walk them
through accessing the data. In both cases I encrypt the data, generally with
Veracrypt.

If it's to someone that won't mind using AWS's toolchain, and has decent
connectivity, I'll use Snowcone or Snowball to get the data to S3 and give
them S3 URLs.

I tend to get the data to S3 sooner or later so I can migrate it to Glacier.
More than once I've had to recover GBs or TBs of data because the customer
lost it, so I'm prepared now.

------
skmurphy
I have used Hightail for years, from when it was called YouSendIt. Here is
info from their website on plans and pricing

[https://www.hightail.com/file-sharing](https://www.hightail.com/file-sharing)

Hightail offers four different subscription levels for file sharing, with
varying restrictions on file upload limits and storage. With our free Lite
plan, you can upload files up to 100MB each, and have a storage limit of 2GB.
Pro plans, which start at $15/month, allow you to upload individual files up
to 25GB each. Teams plans, which start at $30/user/month, allow you to upload
individual files up to 50GB each. Business plans, which start at
$45/user/month, allow you to upload individual files up to 500GB each. Pro,
Teams and Business plans come with unlimited storage.

------
hprotagonist
magic-wormhole: [https://magic-
wormhole.readthedocs.io/en/latest/](https://magic-
wormhole.readthedocs.io/en/latest/)

    pipx install magic-wormhole
    wormhole send /some/big/stupid/thing.tar.gz

~~~
anaganisk
You also have to install the Microsoft Visual C++ build tools, which is like
1.2GB :(

------
Qision
Someone proposed WeTransfer, but there is also Tresorit Send
([https://send.tresorit.com/](https://send.tresorit.com/)), which does the
exact same job. Bonus points: they say they encrypt the data, and their
servers are hosted in Switzerland.

------
beginrescueend
Inside a company, I like transfer.sh, which is like an open source version of
file.io:
[https://github.com/dutchcoders/transfer.sh](https://github.com/dutchcoders/transfer.sh)

That's good for file transfers around 5GB, give or take.
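
Per the transfer.sh README, the upload is a single curl (the host below
stands in for a self-hosted instance):

    # prints a one-off download link on success
    curl --upload-file ./big.tar.gz https://transfer.example.com/big.tar.gz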

------
kn100
For files less than 1GB I tend to use Telegram. It obviously has the downside
of uploading to server X and then downloading from server X, but usually if I
am sending a file to one person, I'll likely be sending it to others too, and
therefore the ability to forward the file to others arbitrarily after the
fact actually proves pretty useful. If I care about the data's security, an
encrypted 7z container or something will do.

For files I need more control over that are less than say 5gb, I tend to scp
them to a web server I control, so that I can delete them afterwards.

For files larger than that, I'll use a private Bittorrent file. It's very rare
I need to transfer files this large, but I really like this solution.

------
zepearl
Personally I would use my own Nextcloud instance for up to 20-30GBs. Not sure
about TBs.

What about using "Firefox Send"? (I never used it so far)

[https://support.mozilla.org/en-US/kb/send-files-anyone-
secur...](https://support.mozilla.org/en-US/kb/send-files-anyone-securely-
firefox-send)

I read that the limit is 1-2.5GB => maybe you could break the file down and
upload it in multiple pieces...

EDIT: oops, Firefox Send doesn't seem to be available anymore -
[https://support.mozilla.org/en-US/kb/what-happened-
firefox-s...](https://support.mozilla.org/en-US/kb/what-happened-firefox-send)

~~~
Apreche
People were using Firefox Send to "send" malware and other bad things to
people. That's what happens if you let anyone anonymously host files on your
domain for free.

~~~
butz
And how are other platforms, like WeTransfer, dealing with that?

------
retouchup
There are many ways of sending big files over the Internet, so to save time I
will go into the top three. The first one is the shortest, using Gmail: click
on the Drive button rather than the paper clip button you would use for
regular attachments. It shows an outline of your Drive files, from which you
can pick the file to attach. Just as easy.

You can go with another cloud storage provider if you don't like Google
Drive.

My third alternative is even easier, but it's a little limiting because it
cuts you off at two gigs either way. It is a service called WeTransfer, which
was almost your only choice 15 years ago if you had to send a huge file.

------
makeworld
I like gofile.io, as it's private and has no limits.

But for something like 10 TB or more, I'd see a torrent as the only way. My
upload speeds are too slow for anything else; the connection would get reset.
The torrent also helps prevent corruption.

------
robert_foss
Magic wormhole - it traverses NATs, is encrypted, requires configuration of
the source or destination.

[https://github.com/warner/magic-wormhole](https://github.com/warner/magic-
wormhole)

~~~
kvn_95
Do you mean requires _no_ configuration on the source or destination? :)

~~~
anaganisk
Installing it on Windows required me to install the Visual C++ build tools,
which is a beast of 1.2GB. So yeah, it requires a lot. I'd rather recommend
croc; I just dropped it on my path and it Just Works.

~~~
psanford
[https://github.com/psanford/wormhole-
william/releases/](https://github.com/psanford/wormhole-william/releases/)

------
nikeee
When I'm the sender, I use an http server in the current directory, using
http-server (node) or http.server (python 3).

When I'm the receiver, I use a self-written CLI tool, "send me a file"
(smaf): [https://github.com/nikeee/send-me-a-
file](https://github.com/nikeee/send-me-a-file)

It basically opens up an http server on my machine and lets a user upload a
file via curl or a web interface. It also computes a sha256 hash while
receiving.

These methods only work on a local network. I use my VPS and sftp/http for
transfers beyond the local network.

------
renewiltord
S3. Sometimes I do that even when I'm transferring on my local network just
because I know the flow so well. After all, my speed to the Internet is
roughly the same as the speed on my LAN. They're both gigabit duplex.

~~~
_yhdy
You config IAM perms every time?

~~~
renewiltord
I actually just use
[https://docs.aws.amazon.com/cli/latest/reference/s3/presign....](https://docs.aws.amazon.com/cli/latest/reference/s3/presign.html)

Then I copy-paste that into a self-Slack and `wget` it on the other side.

It's almost equivalent ergonomically to a `python3 -m http.server`, and
because it's the same whether I'm giving to someone else or to myself, I
spend less time thinking!
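
A sketch of that flow with the standard AWS CLI (bucket name and expiry are
illustrative):

    # upload, then mint a time-limited URL for it (here, 7 days)
    aws s3 cp big.tar.gz s3://my-bucket/big.tar.gz
    aws s3 presign s3://my-bucket/big.tar.gz --expires-in 604800

    # recipient just fetches the printed URL
    wget '<presigned-url>'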

------
ryankrage77
For anything 5GB plus - external drive & cycle over

The only person I'm sharing with lives near me, so sneakernet is the most
convenient.

I once hit a transfer speed of 30TB/hour carrying a box of hard drives home
from work.

~~~
Natfan
"Never underestimate the bandwidth of a station wagon full of tapes hurtling
down the highway."

– Andrew Tanenbaum, 1981

------
dheera
I used to do the HTTP server approach when I was a student at MIT and had a
static IPv4 and symmetric gigabit in my dorm room and therefore the recipient
could download at usually their full downlink speed.

Now I live in the heart of Silicon Valley, a couple km from Google's
Headquarters, and have a crappy 15 mbps uplink with no other options available
to me, so I typically throw the file on Amazon S3, Google Drive, or Dropbox
before sending a link to the other person so that they don't have to put up
with a 15 mbps download.

------
usmannk
For local networks I use the builtin python http server `python3 -m
http.server` (or `python -m SimpleHTTPServer` in 2.x). By default it binds to
all interfaces, so anyone on your network can access it.

~~~
tandav
When sending large files I often get a broken pipe error using this method.

------
onelastjob
For transfers on the smaller side, I'd have a look at
[https://massive.io/](https://massive.io/). I've done a fair bit of looking
into options for sending files online, and I like their pricing model a lot
because it is purely usage-based and has no caps. It's $0.25/GB for
downloads.

For larger transfers, I'd look at File Catalyst Spaces. If you have a bigger
budget, you could look at IBM Aspera or Media Shuttle.

------
m0xte
>10GB - Stick it on a BitLocker-encrypted USB stick and chuck it in the post.
Next-day download if you send it 1st class here.

<10GB - chuck it on S3 and send a link to the other person.

------
esaym
Usually a local HTTP server. But I've wondered before if binary data could be
encoded into a video file and then uploaded to youtube or some other video
service...

~~~
RealStickman_
Any compression would ruin that.

Though it should be possible to do what you described with 16MP pictures and
upload them to Google Drive. Those won't be compressed, according to someone
commenting under that LTT video where they abuse 5 business accounts for
their backup.

~~~
esaym
>Any compression would ruin that.

Well, not really. You could have an entire frame of solid red to represent a
'1' and a solid white frame to represent a '0'. At 24 fps, that is 3 bytes
per second, and it would certainly survive compression. Just keep drilling
down: half a frame red and the other half white would represent a '10' pair,
and now you've doubled your throughput. You could keep cutting these blocks
down until you hit a limit; you certainly couldn't get down to a single
pixel, but some factor of that. Plus you could use an encoding with error
correction instead of just raw binary data.

~~~
ta95846893
Sounds like you'll end up with a couple of qr codes per frame in the end,
which would probably survive compression

------
Sean-Der
[https://webwormhole.io/](https://webwormhole.io/) has web and native clients.
You can build for Windows/Mac/Linux/FreeBSD/Mobile. It uses WebRTC so if you
are in the same network it will establish the best path possible.

I see a lot of people mentioning magic-wormhole and NAT traversal. I can't
find any docs that confirm this. I think it always runs through a relay
server?

------
Seb-C
I have been using mega.nz to synchronize and backup my files for years, and so
far I am very happy with it.

Everything works as expected, there is a lot of storage space and end-to-end
encryption. There are clients for all platforms and it just works.

I can easily create secure share links whenever I want.

The only downsides are that the browser-based app is very slow to start once
you have lots of files, and the android client does not have the same
synchronization capabilities.

------
kenneth
In the enterprise space, there is dedicated software for this function. It
uses a UDP-based protocol designed for efficiency (less overhead than TCP),
and builds in resilience such that it'll work over connections with lots of
packet drops, high latency, or low reliability.

[https://gojetstream.io/](https://gojetstream.io/)

~~~
mongol
Interesting. Is this implemented similarly to QUIC, as some kind of
connection-oriented protocol on top of UDP?

------
guar47
I am actually transferring 400GB of data across the world right now. I
decided to use Apple iCloud and it works pretty well.

Too bad the laptop I am transferring from is in bad condition and runs
Windows, so the iCloud service sometimes freezes and I need to restart it.

I tried to create a torrent, but it seems to take forever on such a weak
machine.

------
raverbashing
I think the question for 10TB is more complicated than it seems at first

Is it one 10TB file? Multiple files? How do you handle file integrity? Error
correction?

------
eFishCent
I have used linux's rsync to send up to 100GiB at a time if the connection is
reasonably stable. There are a couple of challenges with this method: 1) the
way it resumes takes a while to recheck the file before starting again; the
larger the file, the longer this takes; 2) you need to know a bit about
writing a simple script to loop the rsync until the transfer is completed
(see the sketch below).
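
A minimal sketch of such a loop (paths are placeholders; -P keeps partial
files so a resume has something to pick up from):

    #!/bin/sh
    # retry until rsync exits cleanly
    until rsync -aP user@host:/data/big.img ./; do
        echo "transfer interrupted, retrying in 10s..." >&2
        sleep 10
    done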

------
Fornax96
If it's a file less than 10 GB I will use my own website:
[https://pixeldrain.com](https://pixeldrain.com). For things slightly larger I
will split it up using 7zip and use pixeldrain too.

For much larger files I always fall back to BitTorrent. I have also kept an
eye on WebWormhole, but haven't been in a situation yet where I needed it.

------
Ace__
For large files, and I mean from hundreds of GB to terabytes, I use
[https://www.filemail.com/](https://www.filemail.com/). Their desktop app
uses UDP, so it is a lot faster than TCP-based protocols like FTP and HTTP.
You can still use their website to send stuff, and if it's small in size,
it's free anyway for up to 50 GB.

~~~
stantechman
UDP really makes the difference in speed! <3

------
natch
There was some tool for doing this securely on the command line. You run it
with the file name and it gives you back a token. You give the token to your
friend, they run the same tool with the token, and it downloads the file.
Can’t for the life of me find it or remember what it was called. Wormhole?

Edit: I guess it was magic wormhole, discussed elsewhere in this thread.

------
ohthehugemanate
I have a home nextcloud with a few extra TB of storage. I would put it there
and use a share link.

But HTTP is not a great protocol for really large transfers like 10TB. Ideally
you'd want something that parallelizes and does hash checks.

Even then, at 10TB you need a 2 gigabit connection to be even competitive with
SSD-and-overnight-shipping (about 12hrs).

------
egypturnash
Apple Mail has a 30-day storage zone for large email attachments that it
automatically offers when you attach big stuff, and I usually use that.

When I don't I usually just zip it and upload it to the media section of my
Wordpress-managed website.

I've never generated files in the terabyte range, so I don't worry about that.

------
manexploitsman
Have a look at the "File Sharing and Synchronization" section
[https://github.com/awesome-selfhosted/awesome-
selfhosted#fil...](https://github.com/awesome-selfhosted/awesome-
selfhosted#file-sharing-and-synchronization)

------
parliament32
I self-host a Nextcloud instance. If it's too big for a browser download,
Bittorrent is the way to go.

~~~
anderspitman
This is interesting. What conditions would make a file too big for a browser
download? If the server supports range requests (I'm almost certain Nextcloud
does), then the browser can download in chunks and just retry any failed
chunks.

~~~
parliament32
It's technically possible, but most browsers don't auto-resume, and most
webservers will timeout a request after a certain amount of time has elapsed.
You can get around all this with browser extensions, but it's honestly easier
to use a more robust protocol (this is in the context of, say, a 10TB file
that'll take you a few days to download).

~~~
anderspitman
I don't see how anything would be much more robust than HTTP. Any network
based protocol can fail at any time, so you're going to have to have some
concept of tracking your progress and retrying failed requests. Both are easy
to do with HTTP. Can you give a concrete example of something that would work
better than HTTP?

~~~
parliament32
Here's an exercise for you:

In your browser, start downloading a test file, like [1]. Turn off your
network card for a few seconds, then turn it back on. See what your browser
does.

Next, start downloading a large file over Bittorrent, like [2]. Turn off your
network card for a few seconds, then turn it back on. See what the client
does.

[1]
[http://speedtest.tele2.net/100GB.zip](http://speedtest.tele2.net/100GB.zip)

[2] [https://cdimage.debian.org/debian-cd/current/amd64/bt-
dvd/de...](https://cdimage.debian.org/debian-cd/current/amd64/bt-
dvd/debian-10.5.0-amd64-DVD-1.iso.torrent)

~~~
anderspitman
This has nothing to do with HTTP. It's just poor behavior on the part of the
browser.

------
laluser
Dropbox transfer supports up to 250GB files now. That's the way to go for most
of my needs.

------
Faaak
Plain and simple: I use swisstransfer.com, which is free (50GB) and Swiss-
based.

I did it like you "in the past" (on my own fqdn / public-data) but I've given
up on maintaining my website.

A torrent works really well if you've got multi-TB files and many recipients.

------
jerven
For 500GB or more: Aspera software/ascp. Used by quite a few large
bioinformatics institutes for allowing data uploads.

[1] [https://www.ibm.com/products/aspera](https://www.ibm.com/products/aspera)

------
hotwire
A station wagon full of tapes hurtling down the highway...

~~~
auxym
No one makes station wagons anymore, except luxury european makers :(

~~~
myself248
Aside from the already-mentioned Subaru Outback, there's the Buick Regal
TourX, the Mini Cooper Clubman, and I'll contend that the Dodge Journey is
more stationwagon than SUV; it's on the Avenger sedan platform.

------
heavyset_go
Physical media for anything over a few gigs. It's just easier, especially if
the recipient isn't technologically inclined. A lot of the media industry
ships hard drives around to one another because content can be so massive.

------
m463
"Never underestimate the bandwidth of a 747 fully loaded with backup tapes"

------
justinweiss
I was asking myself exactly this a few days ago, to share ~500mb worth of
files with a few people on a message board.
[https://gofile.io/](https://gofile.io/) worked fine.

------
hosteur
I use [http://ifile.dk](http://ifile.dk)

Simple. Works.

------
gpm
5G from a kubernetes pod: netcat

Send help... no seriously... how do I upload a file from a server I have a
shell on but can't easily open ports to or install software on (and where the
built in cp tool fails on big files...)?

~~~
anderspitman
One of the main uses I made [https://patchbay.pub](https://patchbay.pub) for
was transferring files to servers/VMs where I don't have easy rsync access.
You just need curl or another HTTP client installed.

------
Arkanosis
If the network bandwidth on both ends allows for a transfer in less than 48
hours, most likely rsync with the --partial-dir=.rsync-partial flag.
Otherwise, SD card / USB key / hard drive in the mail.
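
For example (filename and destination are placeholders):

    rsync -avP --partial-dir=.rsync-partial big.img user@host:/dest/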

------
noxer
Telegram. Up to 2GB per file, so just 7zip and split the file(s), as sketched
below.

Never needed to send anyone 10TB, nor do I have that much storage, but if you
do you probably have some kind of NAS/server where you can enable FTP access.
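
A sketch of the split, staying under Telegram's 2GB cap (the filename is
illustrative; -p prompts for a password and -mhe=on also hides the file
list):

    # pack into 1900 MB volumes: archive.7z.001, archive.7z.002, ...
    7z a -v1900m -p -mhe=on archive.7z big-file.mkv

    # the recipient extracts from the first volume
    7z x archive.7z.001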

~~~
gvjddbnvdrbv
10TB equals mailing an HDD.

------
AnIdiotOnTheNet
Locally: SMB. Over the internet: SendGB. Ridiculously sized: HDDs in a car.

------
ha-ckernews
Fram's Fast File Exchange ([https://fex.rus.uni-
stuttgart.de/](https://fex.rus.uni-stuttgart.de/)) works well.

Free/open source

------
wheels
I have Syncthing syncing to a "/shared" directory on my private web server. If
the file is under the 20-ish GB I have free there, that's good enough.

------
daniel_iversen
Dropbox was literally built for that sort of thing! And with 5GB you can
probably even do it with the free account by getting yourself a tiny bit more
bonus space.

------
thedanbob
I send public links from the Seafile instance running on my home server. For
10+ TB, I think the only practical option would be to mail a hard drive or
two.

------
askvictor
Given the quantity of data sent from the LHC to universities across the
Atlantic, I'm curious how they (CERN et al) handle their data transfers.

~~~
jerven
Mostly gridftp
[https://en.wikipedia.org/wiki/GridFTP](https://en.wikipedia.org/wiki/GridFTP)

------
rootsudo
OneDrive/SharePoint Online.

Technically unlimited storage on OneDrive; you can force the recipient to
verify 2FA via their email address, force link expiration, etc.

------
t312227
as a game of thrones fan:

a raven with a sufficiently large micro-sd card ;)

------
carlreid
I use justbeamit if it's a direct send to someone.

[https://justbeamit.com/](https://justbeamit.com/)

------
known
Interesting information; thanks to all commenters.

------
nakodari
I am running Jumpshare ([https://jumpshare.com](https://jumpshare.com)) and
our users regularly send large files, anywhere from 2GB to 100GB and in some
cases even more. My recommendation is to use cloud providers for sharing
files that are at most 50GB in size. Note that although you can share even
bigger files, you have to consider the reliability of the internet provider
of both the sender and receiver. For files bigger than 50GB, I would
recommend a P2P solution.

------
ikeboy
I like Mega: it's encrypted, there are command line tools (megatools), and
it's very fast to upload from a server, in my experience.

~~~
jokethrowaway
Not to mention the historical value:
[https://youtube.com/watch?v=o0Wvn-9BXVc](https://youtube.com/watch?v=o0Wvn-9BXVc)

------
kyleee
Firefox send works great for this sort of thing though there is an upper limit
on file size (can't recall what it is right now)

------
werber
Maybe I’m old, but physically. Most of my large transfers are photographs,
video and audio recordings and I like flash drives.

------
emerged
upload to s3 bucket, make public, send link

------
Apreche
Upload to cloud storage then have the other party download it.

If it's too big for cloud storage, ship a hard drive in the mail.

------
vbezhenar
Up to a few GBs I'll host them on my server and send a link. More than that:
I'll host a torrent.

------
reportgunner
In person. If it's not possible to share in person I point them to the place
where I myself got it.

------
pcvarmint
USPS Priority Mail.

You can send a 10TB hard drive faster by USPS than it would take to transmit
across the internet.

------
maliker
[https://transfer.pcloud.com](https://transfer.pcloud.com)

------
sumnole
OneDrive has served well enough for my purposes. I've only uploaded files
smaller than 1tb though.

------
searchableguy
I have a raspberry pi with a hard drive. I throw the files in that and then
use dat/ipfs.

------
thehappypm
Google Drive! Send a public link.

------
anonymoushn
I upload it to my VPS and send someone a link.

I think google drive is a decent solution as well.

------
holychiz
For non-technical folks to send me big files (5GB limit), I point them to
[https://transfer.pcloud.com/](https://transfer.pcloud.com/). No registration
needed.

------
simonw
I use Transmit on macOS to upload to an S3 bucket and then send them a link.

------
notatoad
if i think the recipient is competent enough, i'll ask for or provide ssh
access and copy it with rsync

otherwise, i've mailed a flash drive before, and for <20GB i'll just put it on
a public S3 bucket.

------
really3452
Uploading to Sia's Skynet, then emailing or texting the link.

~~~
Covalent
Or you can use [https://defy.chat](https://defy.chat)

It's still under development, but it's working towards being a decentralized
discord alternative.

------
Foivos
At 10 TB I guess your best option is to physically mail an HDD.

------
simon_acca
Google Cloud Storage. The UI is actually nice, permissions are flexible, and
it has both a robust web uploader as well as a CLI one. Not to mention it's
cheap, and only billed by the GB/hour or something.

------
rawoke083600
scp to one of my unrelated websites and send an http link. Remove the file
afterwards... #notperfect.

------
badrabbit
ngrok.io and similar reverse tunnels

------
aaron695
Free unused Google Drive

Sometimes I'll have to open the file in a hex editor and change a byte so the
hash changes, if it sets off copyright restrictions.

There's a 10 gig limit.

------
bethecloud
I like to use transfer.sh

------
mcculley
Keybase

------
summm
Retroshare

~~~
Qision
I didn't know this software had users...

------
tobyhinloopen
Either via Google Drive or iCloud Drive. If it doesn’t fit on these, use
physical media

------
villgax
Backblaze/Torrent

------
cellularmitosis
socat on both ends :)

------
alfg
S3 and signed URLs.

------
11235813213455
streaming json-lines (for large json array files)

------
arthurcolle
Magic Wormhole

------
zacksinclair
Git LFS ;)

------
whalesalad
S3

------
ohazi
rsync over ssh with retries

------
ghthor
IPFS

------
phendrenad2
S3

------
cheeze
python -m http.server

------
rocky1138
scp

------
jeffbee
Google Drive seems a lot more practical than pretty much every other answer in
this thread.

