
FTP Must Die: the technical explanation - aurelien
http://mywiki.wooledge.org/FtpMustDie
======
nextweek2
Standards documents are created by people with a need. They then implement the
feature and seek feedback; hence the term RFC (Request for Comments).

Ideally, rather than being negative, how about discussing how it could be
improved, in the form of an RFC?

Also, the section comparing an FTP session to an HTTP GET is just misleading.
One is an authenticated, long-running session, whereas the other is just a
fetch with a connection left hanging (resources on the server are tied up).

------
Avernar
I loved the active/passive mode feature of FTP back before getting broadband.
I had an FTP client that would put one server in active mode and one server in
passive mode and transfer files directly between them with only status
messages going down my 9600 baud connection.
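The server-to-server trick described here (often called FXP) works because FTP separates the control and data channels: the client asks one server for a passive data port with PASV, then hands that exact address to the other server via PORT, so the bulk data never touches the client's slow link. A minimal sketch of the address plumbing (reply and command formats follow RFC 959; no real servers are contacted here):

```python
import re

def parse_pasv(reply: str) -> tuple[str, int]:
    """Parse a 227 PASV reply like
    '227 Entering Passive Mode (192,168,1,10,19,137)' into (host, port)."""
    m = re.search(r"\((\d+),(\d+),(\d+),(\d+),(\d+),(\d+)\)", reply)
    if not m:
        raise ValueError("not a PASV reply: " + reply)
    h1, h2, h3, h4, p1, p2 = map(int, m.groups())
    return f"{h1}.{h2}.{h3}.{h4}", p1 * 256 + p2

def port_command(host: str, port: int) -> str:
    """Build the PORT command telling the *other* server where to
    open its data connection."""
    return "PORT {},{},{}".format(host.replace(".", ","), port // 256, port % 256)

# Server A said it will listen passively here:
host, port = parse_pasv("227 Entering Passive Mode (192,168,1,10,19,137)")
# Tell server B to push its data connection straight to server A:
print(port_command(host, port))  # PORT 192,168,1,10,19,137
```

After this exchange, the client issues RETR on one server and STOR on the other; only short status replies cross the client's own connection.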

------
restalis
_" If security of your authentication credentials matters one iota to you,
you'll use scp to transfer files, not FTP."_

If security matters, then use a secure version of FTP (SFTP or FTPS). FTP
has broader multi-platform support than SCP.

------
k__
I never had to use FTP at work in my career. Somehow I always got an SSH
login.

FTP always was more of a private thing for me, like BitTorrent.

------
rwmj
As the author of an FTP server called Net::FTPServer, I must say this article
is full of crap. All modern clients request binary mode and passive
connections by default. All modern servers can restrict the range of port
numbers used on the server side. Clients support encrypted control and/or data
connections. You can checksum files using de facto standard commands (it's
better than unencrypted http in this regard).
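The de facto checksum commands referred to here (XCRC, XMD5, and the later HASH draft; exact command names and reply formats vary by server, so treat the specifics as assumptions) return a CRC-32 or MD5 of the stored file, which the client compares against a local computation:

```python
import hashlib
import zlib

def local_crc32(data: bytes) -> str:
    """CRC-32 in the uppercase-hex form XCRC-style replies typically use
    (assumption; check your server's actual reply format)."""
    return format(zlib.crc32(data) & 0xFFFFFFFF, "08X")

def local_md5(data: bytes) -> str:
    """MD5 hex digest, for comparison against an XMD5 reply."""
    return hashlib.md5(data).hexdigest()

payload = b"123456789"
print(local_crc32(payload))  # CBF43926 -- the standard CRC-32 check value
```

If the server's reported value matches the local one after download, the transfer was not corrupted, which is more than plain unencrypted HTTP gives you.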

Now some benefits of FTP:

\- easy uploading of files

\- standardized file listings, delete, mkdir, etc

\- extensible protocol via "SITE" commands

\- wide availability of clients and servers

It's better to compare FTP to WebDAV. After comparing it you may still think
FTP should "die" (whatever that means in the open internet where any two
people can run any service they want), but at least you'd be making a sound
technical decision, which you're not doing by reading this article.

~~~
masklinn
> \- standardized file listings, delete, mkdir, etc

Isn't one of the rant's points specifically that these are _not_ standardised
(in a de jure sense)?

> It's better to compare FTP to WebDAV.

SFTP exists, has none of FTP's or WebDAV's issues and (as far as I can tell)
ticks all your boxes. Even extensions:
[http://tools.ietf.org/html/draft-ietf-secsh-filexfer-13#section-10](http://tools.ietf.org/html/draft-ietf-secsh-filexfer-13#section-10)

~~~
rwmj
Those commands really are standardized. The problem is that the rant didn't
bother to look at any of the more modern RFCs, nor at de facto standards
supported across lots of servers and clients.

SFTP is fine too, assuming you have support, which is not available on many
embedded platforms, and even on Windows requires downloading extra tools.

I'm hardly claiming that FTP is the pinnacle of human achievement. The
original standards have many obvious flaws, but those have long been fixed.
What we have now is maybe not elegant, but it works and the problems with it
are _not_ any of the things outlined in the article.

------
jezfromfuture
FTP is still the fastest file transfer protocol. When you invent something
better you can moan; until then, please stop complaining.

~~~
kba
Everything unencrypted will always be faster than the encrypted counterpart.
Do you also deliberately avoid HTTPS? And is that minuscule performance gain
really worth the risk with FTP?

~~~
cm2187
Not convinced HTTPS is a simple substitute for FTP. Take the new ASP.NET
MVC framework: the HTTP root directory is not the root directory of the MVC
app, but a subdirectory (which, by the way, is a good design decision,
separating executables from static content). So over HTTPS you would only have
access to a subdirectory of the application you need to deploy; I'm not sure
you could use HTTPS to deploy at all.

FTPES, SFTP and FTPS are alternative solutions, but I found them to be extremely
incompatible between implementations. I never managed to get the FileZilla FTP
client to connect to a secure IIS FTP server, and Visual Studio can't deploy to
secure FTP, etc.

~~~
kba
I did not mean to say that HTTPS was a substitute for FTP. You said that FTP
was the fastest protocol, and the only reasonable explanation for why FTP
could be faster than anything else would be due to the lack of encryption.

Therefore, I asked if you also avoided HTTPS (and only used HTTP), since HTTP
clearly is faster than HTTPS.

But if you didn't mean to say that FTP is the fastest due to the lack of
encryption, then why do you think FTP is the fastest?

------
INTPenis
I recently had a client order SFTP from me. I thought I had misread and asked
if they meant FTPS (FTP over SSL), but they insisted that they wanted SFTP.

This was a case where the client was quite informed: they knew they wanted
encryption, but also that FTPS would never work through their firewall, so
they ordered SFTP.

I was pleased about this, but then came another challenge: MITM protection.

My solution was to use DNSSEC with SSHFP records. Unfortunately I can't
guarantee that the client software supports it.
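An SSHFP record (RFC 4255, with the SHA-256 fingerprint type from RFC 6594) is just a hash of the server's raw public-key blob published in DNS; with DNSSEC validating the answer, a client can check the key it is offered against the record. A sketch of generating the RDATA (the key below is a made-up placeholder, not a real host key):

```python
import base64
import hashlib

def sshfp_rdata(openssh_pubkey_line: str, algo_num: int) -> str:
    """Build SSHFP RDATA: <algorithm> <fp-type> <hex-fingerprint>.
    fp-type 2 = SHA-256; the fingerprint covers the decoded key blob."""
    blob_b64 = openssh_pubkey_line.split()[1]  # "ssh-ed25519 AAAA... comment"
    blob = base64.b64decode(blob_b64)
    return f"{algo_num} 2 {hashlib.sha256(blob).hexdigest()}"

# Placeholder blob standing in for a real ed25519 host key:
fake_blob = b"\x00\x00\x00\x0bssh-ed25519" + b"\x00\x00\x00\x20" + bytes(32)
line = "ssh-ed25519 " + base64.b64encode(fake_blob).decode() + " host.example"
print("host.example. IN SSHFP", sshfp_rdata(line, 4))  # algorithm 4 = Ed25519
```

In practice `ssh-keygen -r host.example` emits these records directly; whether a given SFTP client actually checks them (via `VerifyHostKeyDNS` or equivalent) is the part you can't guarantee.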

~~~
kba
Why would you assume they wanted FTPS? SFTP is both more common and far
superior. Also, what MITM attacks are you afraid of with SFTP?

~~~
creshal
> Also, what MITM attacks are you afraid of with SFTP?

SFTP, like SSH, is "trust on first use". If you don't have some out-of-band
mechanism in place to verify the server fingerprint, you're going to have a
bad time.

While the CA system isn't perfect, rolling out your own CA to clients is
easily automated and verifying certificates from that point on happens
automatically.
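"Trust on first use" means the client records whatever fingerprint it saw first and only complains if it later changes; checking that fingerprint out of band is what closes the MITM gap. OpenSSH displays fingerprints as `SHA256:` followed by unpadded base64 of the key-blob hash, which can be recomputed independently (placeholder key blob again, not a real one):

```python
import base64
import hashlib

def openssh_fingerprint(blob: bytes) -> str:
    """SHA-256 fingerprint in the format `ssh` and `ssh-keygen -lf`
    display: 'SHA256:' + unpadded base64 of the digest."""
    digest = hashlib.sha256(blob).digest()
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

def verify_first_use(seen_blob: bytes, out_of_band_fp: str) -> bool:
    """The out-of-band check that turns TOFU into real verification."""
    return openssh_fingerprint(seen_blob) == out_of_band_fp

blob = b"\x00\x00\x00\x0bssh-ed25519" + bytes(36)  # placeholder key blob
fp = openssh_fingerprint(blob)
print(fp, verify_first_use(blob, fp))
```

The weak link is the delivery channel for `out_of_band_fp`: if it arrives over the same untrusted network, nothing has been verified.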

~~~
anc84
If a client orders SFTP, providing out-of-band fingerprints is a trivial step.

~~~
creshal
If that client is a single person, sure. If the client is a 500 person
organization where half need access to the server, I can see why people would
prefer FTPS with a certificate from their internal CA.

~~~
frutiger
You can use SSH certificates, which support signing and revoking keys. For
some reason, most people assume TLS when you mention the word certificate.
Read the CERTIFICATES section in ssh-keygen(1).
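To make that concrete: ssh-keygen can act as a CA, signing host (or user) keys and issuing revocations via a key revocation list, no X.509 involved. A sketch, with made-up paths and identities, run in a scratch directory:

```shell
set -e
dir=$(mktemp -d)
cd "$dir"

# 1. Create the CA key. Clients then trust it via a single
#    @cert-authority line in known_hosts.
ssh-keygen -q -t ed25519 -f ca -N '' -C 'example CA'

# 2. Create a host key and sign it with the CA (-h marks a host cert,
#    -n limits the principals the cert is valid for).
ssh-keygen -q -t ed25519 -f host_key -N '' -C 'host.example'
ssh-keygen -q -s ca -I host.example -h -n host.example host_key.pub

# 3. Revoke it later by adding it to a key revocation list (KRL).
ssh-keygen -q -k -f revoked.krl host_key.pub

# Inspect the resulting certificate:
ssh-keygen -L -f host_key-cert.pub | head -n 6
```

This is exactly the rollout-infrastructure point made in the reply below: the signing and distribution steps are yours to automate, whereas the X.509 machinery ships with every OS.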

~~~
creshal
True, but you still need to build your own infrastructure to roll out the CA,
vs. X.509, which is implicitly handled by all OSes.

------
fenesiistvan
Why should we kill all well-known old technologies? FTP should remain forever,
and if somebody wishes to use it, let them. I don't understand all this
hype for new technologies. Are you kidding about Dropbox? Drop an old,
well-known tech for a fancy new protocol with vendor lock-in? Most of the issues
listed here are already fixed in modern clients. I use FTP every day on the
local LAN and also on the internet. I've never had any issue with connectivity,
firewalls (this is why passive mode was added) or file corruption. Not to
mention the ASCII-mode problem (that was a problem only with ancient Unix
clients).

~~~
feld
FTP is 40 years old and was designed for an internet where there was no NAT.
Let it die already.

~~~
rakoo
TCP is 40 years old and was designed for a world where there were no
smartphones. Let it die already.

~~~
Tiksi
I've talked to people who earnestly hold this view and want everything to run
on top of UDP.

~~~
rakoo
Well, there _is_ QUIC, and Mosh seems to be working quite well, so it is true
that UDP may make more sense in certain use cases.

~~~
Tiksi
Yeah, there are definitely use cases where UDP makes sense, and they're far
from rare, but that doesn't mean TCP should be thrown out, since there are
still many cases where a reliable, connection-based protocol is better.

~~~
feld
TCP doesn't really provide the reliability that people assume it does. UDP is
often used because it is expected that the connection will be unreliable.

------
Piskvorrr
To be honest, I'm surprised that anyone still uses _that thing_ in 2016 - we
have rsync, bittorrent, and all the HTTP we can eat. What compels _anyone_ to
think "I know, let's use the most broken tool I could possibly find"?

~~~
moontear
Hmm, the rant is just that - a rant. No solutions, and many non-arguments such
as complaining about outdated acronyms in the RFC.

I'm with you that there are other solutions, but they are not easy to use for
enterprise customers, and that is where FTP still thrives. Rsync? Sysadmins
may know it, but your marketing guy? BitTorrent? That's for illegal stuff,
and it's blocked on our company network.

You want to send a client (e.g. a newspaper) all your fancy videos and
graphics (remember you are the marketing guy) to publish somewhere. The videos
are of course 4GB per file.

What is the easiest way to share the files sitting in your
"Marketing Campaign 2016" folder? Ask your IT department (just going by your
suggestions):

\- HTTP? Our server cannot handle file transfers larger than 2GB and times out
after that. Also, most clients don't have resuming capabilities. And don't
even start with public file-sharing services; our data is private.

\- BitTorrent? It's blocked on our network. Also, the client told us it is
blocked by their ISP.

\- rsync? What is that? I'm a Windows guy. Also, I doubt the newspaper will
know what to do with some weird command-line incantations...

Just set up a quick FTP server (or use the existing one) and send the link to
the client. Everything works. Resuming. Large downloads. Easy to use.
Reliable. "Known" interface as the client just sees folders and files. Many
clients available or just the web browser.

I am not surprised at all that FTP is not dying, I myself have often advised
customers to "just use FTP" instead of whipping up a custom solution. It just
works. The cost is low, the benefit high.

Sadly, there currently is no run-of-the-mill replacement for FTP that "just
works" for the average Joe.

~~~
kalleboo
You're transferring private data over an unencrypted protocol that sends
passwords in plaintext?

SFTP does everything FTP does, works in every FTP client I've seen this
decade, and is even supported by some FTP servers if you need virtual users.

~~~
creshal
Virtually every current FTP client also supports FTPS, so security is good
enough.

Meanwhile, SFTP does _not_ work with browsers and file managers (unlike
FTP[S]), so FTPS works "out of the box" where SFTP doesn't.

------
jamescun
Is FTP _that_ heavily used in 2016? The only use I can see for it now that
hasn't been replaced by either a better protocol (e.g. BitTorrent) or a
service (e.g. Dropbox) is shared web hosting, and even there many hosts now
support SFTP (and git).

~~~
leejo
I can confirm that as recently as 2012 most major banks, certainly in the UK,
were using FTP to transfer their overnight batch-payment files (some even
over PSTN). Some have managed to move to SFTP (and FTPS), but I would say it's
still heavily used in banking/payments.

~~~
clappski
We can't use SFTP internally because of policies around encryption, so a lot
of our stuff uses FTP to move data around.

~~~
BillinghamJ
Uh, is the policy that encryption isn't allowed? SFTP is encrypted; FTP isn't.

~~~
w8rbt
Many security groups in corporations inspect network traffic for malicious
activity. Encryption prevents this, so policies or technical measures may be
introduced to ban or limit its use.

As a side note, several years ago, I had a friend who worked for a small
research company that had a ban on encryption software. Installing or using
PGP on a company computer would be cause for termination.

I expect these ideas to become more mainstream. In the future, companies may
only allow certain encryption technologies on their networks/nodes (ones for
which they hold the keys).

