Show HN: A tool to upload/download files from command line (bashupload.com)
79 points by ruhighload 16 days ago | 48 comments

For people that prefer privacy, I've been building a fully featured CLI tool for Firefox Send. Maybe someone is interested: https://github.com/timvisee/ffsend

This is awesome. You should share it on ShowHN!

Thanks! Yes, I was planning on doing so once I have proper pre-built binaries available. I might do it in sync with an upcoming Firefox Send release.

Cool stuff, thanks!!

If this is yours, I recommend adopting some sort of inverse relationship between file size and retention time. 0x0 does this and I think it makes a lot of sense: https://0x0.st/

    retention = min_age + (-max_age + min_age) * pow((file_size / max_size - 1), 3)

    365 |  \
        |   \
        |    \
        |     \
        |      \
        |       \
        |        ..
        |          \
  197.5 | ----------..-------------------------------------------
        |             ..
        |               \
        |                ..
        |                  ...
        |                     ..
        |                       ...
        |                          ....
        |                              ......
     30 |                                    ....................
          0                        256                        512
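Plugging the graph's apparent constants into that formula (min_age = 30 days, max_age = 365 days, max_size = 512 MiB are assumptions read off the axes, not documented values):

```python
# Retention curve used by 0x0.st: small files live longest,
# files near the size cap are kept only min_age days.
MIN_AGE = 30      # days (assumed from the graph's y-axis)
MAX_AGE = 365     # days
MAX_SIZE = 512    # MiB (assumed from the graph's x-axis)

def retention(file_size):
    """Days a file of `file_size` MiB is retained."""
    return MIN_AGE + (MIN_AGE - MAX_AGE) * ((file_size / MAX_SIZE) - 1) ** 3

retention(0)    # 365.0 days -- tiny files get the full year
retention(512)  # 30.0 days  -- files at the cap get the minimum
```

The cubic makes the penalty kick in gently at first and steeply near the cap, which is why the curve crosses its midpoint (197.5 days) well before the halfway file size.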

I can totally write a script to divide my large files into smaller parts before uploading. That way I will get unlimited storage?
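The splitting itself is just coreutils; a sketch with stand-in file names and chunk sizes (the legitimate use is working around per-file upload caps):

```shell
# Chunk a file, upload the parts, and reassemble them in glob order.
head -c 1000000 /dev/urandom > bigfile.bin   # stand-in for the real file
split -b 400000 bigfile.bin bigfile.part.    # -> bigfile.part.aa, .ab, .ac
# ... upload each bigfile.part.* chunk here ...
cat bigfile.part.* > rejoined.bin            # receiver reassembles
cmp bigfile.bin rejoined.bin                 # byte-identical round trip
```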

> That way I will get unlimited storage?

No. That way you'll fill up his hard drive, break the server, and ruin it for everybody. Don't be that guy.

I think GP gets that, and is pointing out a potential problem. Your solution of not being an asshole works for small groups, but not the entire internet.


I don't see what the correlation is between a user's file size and how long the user will need it to remain.

Generally people use these services to host a file for around the time it takes someone to check their e-mail/ticket/etc. to download the item -- hopefully within a day, usually a couple days, worst-case scenario a week.

The time-cutoff is based on human behavior.

Plus, it's too much mental overhead for a user to have to worry that each file uploaded lasts for a different amount of time.

For a host providing a free service that they have to pay actual money to run, and having a finite & non-advertised limit on disk space, it makes sense to put a heavier mental price on large files, so that a 1GB upload actually does feel heavier to the user than a 1KB upload.

I think it is to cut the host's disk costs? It's not necessarily for the benefit of the user.

> Why?

The current policy of 25GB for seven days is very generous, but I fear it will prove unsustainable.

Did you type this graph out by hand?

Lol, proxy all my file uploads through your servers, no thanks!

Seriously how is this tool even getting upvoted? Why use this compared to straight up curl or scp to your own instance? This makes 0 sense, I need an aspirin.

This is for situations when you can't scp your file, like when I use a jump server to go through some network restrictions.

Then you can upload to your own jump server first. This is just a bad idea on multiple levels.
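For the jump-server case specifically, OpenSSH (7.3+) can tunnel scp through the bastion directly, so nothing has to be staged on the jump host; host names here are placeholders:

```
# one-off:
scp -o ProxyJump=bastion.example.com file.txt internal.example.com:/tmp/

# or persistently, in ~/.ssh/config:
Host inside
    HostName internal.example.com
    ProxyJump bastion.example.com
# then simply: scp file.txt inside:/tmp/
```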

I usually recommend https://file.pizza to non-tech people who want to send large files, as it is a very easy-to-use solution. The sender adds the file and then sends the file link to the recipient who simply loads it in their browser to download the file.

Files are sent peer to peer and are encrypted. The main downside is that the sender has to keep the browser tab open until the recipient receives the file.

Yeah, the only feature file.pizza is missing is a download-complete alert to the uploader so they can close the tab.

This is something I have been looking for and couldn't find to the point that I was considering just building it myself. Thanks.

Nice idea. I guess you can make a nice extra buck from uploads containing sensitive personal/financial information or intellectual property.

You could also encrypt them before uploading them (even if most people won't do it)

Firefox Send is mentioned elsewhere in the thread. It does that by default, and keeps the key as a URI fragment so that the servers don't see it. I don't know why this tool doesn't do something similar.
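Client-side encryption is a one-liner with openssl (the `-pbkdf2` flag needs OpenSSL 1.1.1+; file names and passphrase are illustrative):

```shell
# Encrypt locally before uploading; the host only ever sees ciphertext.
printf 'quarterly numbers\n' > report.txt                 # stand-in file
openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass pass:correct-horse -in report.txt -out report.txt.enc
# ... upload report.txt.enc instead of report.txt ...
openssl enc -d -aes-256-cbc -pbkdf2 \
    -pass pass:correct-horse -in report.txt.enc -out decrypted.txt
cmp report.txt decrypted.txt                              # round-trips
```

In practice you would share the passphrase out of band rather than hard-coding it with `-pass`.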

No thanks. scp works just fine for that purpose.

Interesting, though I just prefer using an S3 bucket myself. I have a small script that uses curl and openssl under the covers to upload, so I can upload from random machines I don't want to install the AWS tools on.

This is really the best solution. It's next to no cost for short-term upload, especially if the files aren't that big. However, you're protected by one of the largest and most secure public cloud platforms rather than some random server.

Anyone find a privacy policy anywhere? I went to the mother site listed at the bottom, but it is entirely in Russian. Tried translating privacy to Russian and searching various terms, but getting nothing.

Edit: nvm I found it

"Bashupload has no control over, and assumes no responsibility for, the content, privacy policies, or practices of any third party web sites or services. You further acknowledge and agree that Bashupload shall not be responsible or liable, directly or indirectly, for any damage or loss caused or alleged to be caused by or in connection with use of or reliance on any such content, goods or services available on or through any such web sites or services."

So basically just a copy-paste and search replace for their name. This is boilerplate, what were you expecting?

Have to mention https://wsend.net here

What's wrong with good old-fashioned wget?

Wget is not a “tool” - a “tool”, in modern parlance, now has to be a web service, which is what this is.

This site isn't a tool, because it doesn't come with an npm module, smh.

Can `wget` easily upload files? More like `scp`, isn't it?

I've been using woof[0] for probably a decade to "send" files on a LAN. It's packaged for Debian, but it's essentially a single-file Python HTTP/1.0 web server with no external dependencies. With an option it can distribute itself, and it can also offer an upload form to receive files...

[0] http://www.home.unix-ag.org/simon/woof.html

IIRC "Writing" an HTTP server is close to a one-liner in Python anyway.

Well, launching "python -m http.server" should be enough to serve the contents of the current dir, so you're right. To be honest, I never looked at what that script does...
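The programmatic equivalent of that one-liner is only a few lines of stdlib (Python 3.7+ for `ThreadingHTTPServer`; the port-0 trick just asks the OS for a free port):

```python
# Serve the current directory over HTTP, like `python3 -m http.server`.
import threading
import urllib.request
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

# Port 0 lets the OS pick a free port; use a fixed one (e.g. 8000) for real use.
httpd = ThreadingHTTPServer(("127.0.0.1", 0), SimpleHTTPRequestHandler)
threading.Thread(target=httpd.serve_forever, daemon=True).start()

port = httpd.server_address[1]
status = urllib.request.urlopen(f"http://127.0.0.1:{port}/").status
print(f"directory listing served with status {status}")
httpd.shutdown()
```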

You can use curl to upload files.

Did you click the link? It's not offering an alternative to wget or curl, in fact it recommends you use curl. This site is a file host which you can upload to with curl.

Did you read the parent? He was asking about wget

Pretty sure that was rhetorical.


wget invokes flash and javascript? Not sure if I follow this, please enlighten me.

And how does one compare Nginx and curl, one is a webserver, the other a Swiss knife for mainly doing client side requests to various protocols, http/https being one of them.

Lynx is a browser. Wget is not, but for different reasons.


Are you really bored enough to create 2 temp accounts to post nonsensical comments?

There’s croc[1] too, which is basically the same thing but written in Go.

[1] https://github.com/schollz/croc

I tried uploading a 60MB file to those four services, and apart from needing to upgrade my curl they all went fine: each returned the URL from curl, and the file was directly downloadable with wget/curl from the URL provided.

Either I'm unfamiliar with the word or there's a typo in the "Links and Files" section of the disclaimer: 'forbinned' instead of 'forbidden'

