
Ask HN: Fastest way to tranfer large files? - imagetic
It's 2019 and I have fiber in two locations with computers that I need to transfer large files between quickly. What's the fastest way to do that?

G-Drive has always capped me around 15 MB/s and then it still has to be downloaded. Bittorrent Sync / Resilio doesn't want to go above 500 kbps.
======
Nextgrid
SFTP, although you might be limited by the encryption’s performance. If
authentication & confidentiality aren’t needed you can downgrade to a quicker
(& less secure) cipher such as _arcfour_.
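One caveat: arcfour (RC4) has been removed from recent OpenSSH releases, so on a current system you would instead pick one of the faster authenticated ciphers. A sketch, with the username, host, and paths as placeholders:

        # see which ciphers your OpenSSH build supports
        ssh -Q cipher

        # copy with a fast authenticated cipher (AES-NI hardware helps here)
        scp -c aes128-gcm@openssh.com bigfile.bin <username>@<hostname>:/data/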

------
mindcrime
Possibly GridFTP[1], UDT[2], or Tsunami[3]? Also consider the old adage:
"Nothing beats the bandwidth of a station wagon full of tapes hurtling down
the freeway."

[1]:
[https://en.wikipedia.org/wiki/GridFTP](https://en.wikipedia.org/wiki/GridFTP)

[2]: [https://en.wikipedia.org/wiki/UDP-based_Data_Transfer_Protoc...](https://en.wikipedia.org/wiki/UDP-based_Data_Transfer_Protocol)

[3]:
[https://en.wikipedia.org/wiki/Tsunami_UDP_Protocol](https://en.wikipedia.org/wiki/Tsunami_UDP_Protocol)

~~~
imagetic
And today we drove hard drives to save 6 hours!

------
Adamantcheese
Wouldn't you be speed limited by the lowest-bandwidth equipment in the mix?
Unless it's fiber (or equivalent bandwidth) from end to end, you're not going
to be able to go faster than what the slowest machine will process. If your
outbound bandwidth is limited, you may want to compress files locally first
to save on transfer time. That being said, 64-bit scp will probably work fine?
If there are no restrictions, making it a torrent would probably work as well,
if you need to transfer them more passively.
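If the files compress well (logs, raw exports; already-compressed video won't shrink much), the compress-first idea above is just a tar round trip. A minimal sketch with gzip; zstd would likely be faster if it's installed:

        # pack and compress locally before sending
        tar czf payload.tar.gz /path/to/files

        # ship payload.tar.gz however you like, then on the far side:
        tar xzf payload.tar.gz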

~~~
imagetic
In this situation both points are on consumer fiber (AT&T). A 28 GB file takes
roughly 1 hour to upload to G-Drive and 1-2 hours to download on the other
side. Speed caps out at 14 MB/s using most services like G-Drive, Dropbox,
etc.

Multipart/chunking with S3 / Google Storage may yield faster results.
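One way to try the multipart idea without any cloud SDK is plain split/cat: cut the file into pieces, push them over several connections at once, and stitch them back together. A rough sketch (the transport and names here are placeholders):

        # cut a big file into 512 MB pieces: chunk.aa, chunk.ab, ...
        split -b 512M bigfile.bin chunk.

        # push the pieces over parallel connections (scp as placeholder transport)
        for c in chunk.*; do scp "$c" <username>@<hostname>:/data/ & done; wait

        # on the receiving end, reassemble
        cat chunk.* > bigfile.bin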

Bittorrent/Resilio Sync is extremely slow, sub-1 MB/s.

------
shpx

        rsync -P -e ssh <username>@<hostname>:/file/you/want/to/copy.txt ~/wherever/you/want/to/put/it
    

"-P" enables progress reporting and keeps partially transferred files (so the
transfer can resume if the connection drops out), and "-e ssh" is to transfer
files over SSH.
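A couple of extra flags may help on a fat pipe (a sketch; username, host, and paths are placeholders): "-z" compresses in flight, and "--append-verify" resumes a partially transferred file instead of restarting it.

        # resume-friendly pull with in-flight compression
        rsync -P -z --append-verify <username>@<hostname>:/file/big.bin ~/downloads/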

~~~
imagetic
rsync is extremely slow in my experience. It's a great tool from client to
server, but complex to go from one client machine to another.

~~~
avichalp
Yes, I second that. I usually need to download not-so-big (~1 GB) files from
a server, and I am also looking for a faster way.

------
lastofus
If you are serious about saturating your pipe as a business with some cash to
spend, there is Aspera. My understanding is that they built their own protocol
on top of UDP to avoid the pesky behavior of TCP.

~~~
imagetic
I have used Aspera for bigger TV / production work, but the bill breaks the
budget for my freelance work.

------
_shadi
If you can open any port on the receiver computer then I would go with nc.

Receiver:

        nc -q 1 -l -p $PORT | tar xv

Sender:

        tar cv . | nc -q 1 $IP_ADDRESS $PORT
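If you want a live throughput readout on that pipe, pv drops in between tar and nc (assuming pv is installed; everything else is unchanged):

        tar cv . | pv | nc -q 1 $IP_ADDRESS $PORT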

------
elamje
[https://send.firefox.com](https://send.firefox.com) seems good

~~~
imagetic
We're trying to send 28GB files though.

------
golem14
You can copy chunks of the large files in parallel. Entirely doable, but it
needs thorough testing.

~~~
golem14
Actually, [https://www.sanger.ac.uk/science/tools/pcp-parallel-copy](https://www.sanger.ac.uk/science/tools/pcp-parallel-copy)
seems to have an implementation; I have not tried it myself.

------
imagetic
tranfer? I would have a typo in the title.

------
rolfeb
Pretend it is 1990...

FTP.

