[ edit after replies ]
Agreed, perhaps I wrongly assumed people know what 'encrypted' means. Growing up with computers, I don't know any better and sometimes forget, or can't properly imagine, what laymen do and don't know.
They're not authenticated as anything else.
Sure "Not Encrypted" is more accurate, but not everybody is going to know what that means.
FTP is still very useful, and is depended upon by tons of corporations for bulk transfers and sensitive transactions. It is trivial to add SSL to FTP, or use SFTP if your clients/servers support it.
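For instance, plain curl can demand TLS on an FTP URL; this is just a sketch, and the hostname, path, and credentials are made up:

```sh
# Explicit FTPS: the session starts in the clear, then upgrades with AUTH TLS.
# --ssl-reqd makes curl fail outright if the server can't negotiate TLS.
curl --ssl-reqd --user alice:secret \
  -O ftp://ftp.example.com/reports/q3.csv
```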
To send someone a file, I send them a link that they can easily use to download from the FTP site.
It also allows them to upload stuff to it too.
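FTP URLs can even carry throwaway credentials, so a single link works both ways; the host and login here are placeholders:

```sh
# A shareable download link can embed a guest login directly:
#   ftp://guest:guest@ftp.example.com/incoming/photos.zip
# And the same endpoint accepts uploads, e.g. with curl:
curl -T photos.zip ftp://guest:guest@ftp.example.com/incoming/
```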
Has the ability to develop and work on technology dropped this low? This is a regression.
So yes I'm "peddling" a free web service over a DIY approach, especially as securing a public server isn't quite as trivial as you make it sound. It does take knowledge and sustained effort.
It's especially mediocre when you want to download/upload multiple files like a directory with many files inside of it.
A friend recently shared his file collection with me through OwnCloud. It was such a pain to download. Each file is a separate link. I couldn't use wget to pull it because of authentication. There were some CLI tools, but they were buggy and I couldn't make them work. And of course no FTP. Essentially it forced me to click every single file. I couldn't even verify that the files downloaded correctly.
FTP is actually great for what it does and with clients like lftp it's fairly enjoyable to use. If you want to kill it, better make a worthy alternative.
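To give a flavor of that (server name and credentials are made up), a whole directory tree is one command:

```sh
# Mirror a remote directory down; add --reverse to push one up instead.
# --continue resumes partial transfers, --parallel runs several at once.
lftp -u alice,secret \
  -e 'mirror --continue --parallel=4 /pub/dataset ./dataset; quit' \
  ftp.example.com
```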
The two port system allows a separate, text based control protocol with the ability to transfer files between two remote systems (FXP). These features don't really matter anymore and the second one is a security nightmare.
A better alternative would be SFTP, which is based on SSH and is pre-installed on most UNIX-like systems. The only problem is that web browsers don't support it natively.
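From a shell it's already there; a quick sketch with a placeholder host and path:

```sh
# Interactive use looks much like a classic ftp client:
sftp alice@files.example.com
# Or script it non-interactively with a batch file on stdin
# (batch mode can't prompt, so this assumes key-based auth):
echo 'get /srv/data/backup.tar.gz' | sftp -b - alice@files.example.com
```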
What are you talking about? Wget supports --user and --password options.
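Assuming the server speaks HTTP basic auth (URL and login below are invented), that's all it takes:

```sh
# Pass credentials on the command line; works for FTP logins too.
wget --user=alice --password=secret https://cloud.example.com/files/photo.jpg
```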
Okay want to just FTP the files over?
And every attempt at doing key management better is met with hostility from many in the community (rightfully or otherwise) as they tend to sacrifice "absolute security" in the name of greater "usable security" (often by centralizing the trust system in some way).
I was misremembering a bit. They said “secure FTP” and then just said FTP thereafter.
"I know, hence my edit. In my mind I read secure FTP -> SFTP, damn brain. But after rereading, I'm thinking they meant FTPS.
FTPS suffers from all the same problems as FTP (no definitive standard for FTP commands or their return codes, a poor client/server relationship because the data port is a callback, no standard for native compression, etc.) bar the clear-text problem*. But ironically the encryption then creates a new issue for clients behind NAT, due to the aforementioned lack of client/server distinction.
I don't think there is really much excuse not to run SFTP instead of FTP(S) given that most clients already support it. Or use one of the numerous other sharing / transfer protocols available to us now.
* unless you misconfigure FTPS that is
SFTP requires the use of 'real' users (maybe not with fancy daemon software, or with a headache of PAM rules).
Equally, a lot of Linux FTP daemons default to using /etc/passwd users too.
So what you're describing is a configuration detail the administrator needs to be aware of regardless of which transfer server they set up. In that regard FTP is no different from SFTP (or vice versa).
It's worse than that: FTP has come to mean file transfer generically. Lots of users refer to Dropbox as "FTP".
On the one hand, SFTP has built-in support for key-based security (client authenticates with its own key, and checks the server's key), but the protocol itself is a mess that pretty much assumes the server is a UNIX box with a typical file system. We had the interesting challenge of writing a windows-based SFTP server that stores data on Azure Storage, and even the most surprising things (like one command to create an empty file, followed by another command to upload to that file) turn into a silly mess, not to mention the difficulty of setting up an SSH server on Windows.
FTPS gets those things right: security is based on SSL (more implementations available) and most file-altering operations are done with single commands. On the other hand, it's still password-based (although encrypted) and the data-channel-versus-control-channel problem becomes a mess with load balancing.
Of course, you can use WebDAV (but it's not HTTPS) or do kinda-FTP-over-HTTPS by listing directories in a specific format (but that's not a standard anymore).
Hey should I use SFTP or FTP
Nah, FTP is fine
In other words, anyone offering FTP as an integration path in 2017 is most likely wide open for a host of other security holes.
bittorent => not available out of the box and prohibited in all companies
The other common way for me to upload or download files is with version control. Imagine if we integrated git into the browser and server to have something vastly more sensible.
It's true that people resist change, but the change they have to swallow here is so tiny and the upside so massive that it's darn close to malpractice to continue giving them the option to use plaintext FTP.
Insight: Small businesses don't like change when things still "just work" especially when changes only seem to add complexity and costs.
The better alternative is WebDAV but that is one of the last successful Microsoft 'Embrace, extend, extinguish' efforts.
SFTP works fine as well, but if you don't need security it's not any better, just equivalent.
Activate directory listing and `wget --recursive`.
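Something like this, with an invented URL and login, pulls a whole tree off an auto-indexed server:

```sh
# Mirror an auto-indexed directory; --no-parent stops wget climbing to
# parent dirs, --no-host-directories drops the hostname from local paths.
wget --recursive --no-parent --no-host-directories \
  --user=alice --password=secret https://files.example.com/pub/dataset/
```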
Worse, you have to secure your server to make sure .php files in someone's home directory don't accidentally get run.
By default a web server will run files unless you specially configure it not to; FTP never does that. It's a LOT harder to secure with HTTP. And if you have places where you want stuff run, and other places where you want files transferred? It's possible, but your likelihood of making a mistake in the configuration is very, very high.
HTTP is really not suited for bulk file transfer. SFTP works though.
You can't really run anything using default nginx configuration, and even after some tweaking, you still need some FastCGI/WSGI/whatever-compliant daemon to pair it with to be able to "run something". Same goes for `python -m http.server`, except you basically cannot make it execute anything at all.
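A minimal sketch of what that looks like (paths and filenames are placeholders): a static-only nginx vhost with no fastcgi_pass anywhere serves a stray .php file as plain text rather than executing it.

```sh
# Static-only vhost: nginx can only read files off disk here.
cat > /etc/nginx/conf.d/files.conf <<'EOF'
server {
    listen 80;
    root /srv/files;
    autoindex on;   # directory listings, handy for wget --recursive
}
EOF
nginx -t && nginx -s reload   # validate the config, then reload
```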
Obviously, turning that into an automated process that runs from a Windows Scheduled Task (or worse) is not as easy.
The only thing that keeps FTP around is its ubiquity.
> Does it have resume capabilities? Nope.
See RFC 959:
> RESTART (REST)
> The argument field represents the server marker at which
> file transfer is to be restarted. This command does not
> cause file transfer but skips over the file to the specified
> data checkpoint. This command shall be immediately followed
> by the appropriate FTP service command which shall cause
> file transfer to resume.
And it would seem the definition from the RFC isn't even what's commonly implemented anymore, per https://cr.yp.to/ftp/retr.html (ctrl/cmd+f "The REST verb" no quotes)
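Whatever the servers actually implement, resume does work from standard clients; a quick sketch with a placeholder URL:

```sh
# curl issues REST under the hood when told to continue where it left off
# (-C - means "work out the offset from the partial local file"):
curl -C - -O ftp://ftp.example.com/pub/big.iso
# wget's equivalent:
wget --continue ftp://ftp.example.com/pub/big.iso
```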
Finally, if we're including HTTPS and comparing HTTP v FTP, why not include FTPS? https://en.wikipedia.org/wiki/FTPS (edit: probably because Chrome doesn't support it. Heh.)
IIRC FTP clients throughout the '90s were actually more reliable than browsers at resuming downloads.
You would need SFTP at least. But in the setups that I've seen, SFTP wants you to be a real user on the remote server, while with FTP it's easier to have FTP-only users (who are not actual users, just logins that are valid for FTP).
So you can have a user who is de facto "on the remote server", but the only thing they can do is log into the specified SFTP service.
For smaller scales, it is surely possible to do something similar without FreeIPA, using only PAM.
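A sketch of that with stock OpenSSH (group name and paths are placeholders): members of 'sftponly' get a chrooted internal-sftp session and nothing else, no shell and no forwarding, so they're barely "real" users at all.

```sh
cat >> /etc/ssh/sshd_config <<'EOF'
Match Group sftponly
    ChrootDirectory /srv/sftp/%u
    ForceCommand internal-sftp
    AllowTcpForwarding no
    X11Forwarding no
EOF
# Note: sshd requires the chroot directory itself to be root-owned
# and not group-writable.
systemctl reload sshd   # service may be named 'ssh' on some distros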
Just because the technology is physically capable of doing it doesn't mean it's at all usable.
That said, HTTP has an aura of atomic resource download, while FTP comes with :drumroll: a file-system-access aura. I almost never upload anything from a shell through HTTP. And before the era of HTML drag-and-drop, it wasn't in my mind to upload through a browser over HTTP either. That has changed a bit, though. Still, my favorite way to upload over the web is webtorrent.
I'm not sure why WebDAV is less popular. It has its oddities and implementation issues but it's nowhere close to FTP weirdness. And AFAIK WebDAV support is either built-in or readily available to install in all mainstream OSes.
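And since WebDAV is just HTTP verbs, plain curl can drive it; the host, path, and login below are invented:

```sh
# Upload a file with an HTTP PUT:
curl -u alice:secret -T report.pdf https://dav.example.com/docs/report.pdf
# List a collection (directory) with PROPFIND:
curl -u alice:secret -X PROPFIND -H 'Depth: 1' https://dav.example.com/docs/
```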
Next I want to see if FTP over OpenVPN is faster.