
This is more of a packaging problem. There are various ways to package Python code with its dependencies into a single executable.

In any case, comparing a stdlib library to a 3rd party one is a bit apples to oranges. Most people first decide whether or not they want to use PyPI packages, and then start evaluating which ones are appropriate.




> Most people first decide whether or not they want to use PyPI packages, and then start evaluating which ones are appropriate.

I don't think it's as binary as this. There should be, at least intuitively, a mild bias against adding a marginal dependency, even once you're dependent on PyPI. We're lucky enough to not be dealing with a dumpster fire ecosystem like nodejs, but it's still a good habit.

For me, writing your interface with click doesn't seem too much better than writing it with argparse, certainly not enough to get over the (small) activation energy of a non-stdlib dependency.


Click is multiple orders of magnitude better than argparse. I don't even start a CLI app with argparse anymore, because even with very simple interfaces you will hit its limitations. You should try click.
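To make the comparison concrete, here's a minimal click command; the command name, argument, and option are invented for illustration, not taken from the thread:

```python
# A tiny click CLI: decorators declare the interface, and --help is generated for free.
import click


@click.command()
@click.argument("path")
@click.option("--verbose", "-v", is_flag=True, help="Print extra detail.")
def scrape(path, verbose):
    """Scrape PATH for log entries."""
    if verbose:
        click.echo(f"scanning {path}")
    click.echo(f"done: {path}")


if __name__ == "__main__":
    scrape()
```

Running it with `--help` prints a usage message built from the docstring and option help strings, which is the part argparse makes you work harder for as the interface grows.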


Interesting, thanks, I had just skimmed the posted article and it looked substantially similar to argparse, but I guess I'll take a closer look.


It is a bit clunky, but I haven't hit a limit since Py 3.4 or so.


sure, if you're building a CLI to do management commands for an application that already has dependency management in place, that's all fine

but if you just have some basic script to scrape some logs or zip up files, it's nice to have it be self-contained


if it's a simple basic script, I wouldn't even go through the pain of using argparse. I would just manually check sys.argv.
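That approach can look roughly like this; the file names and usage string are made up for the sketch:

```python
# Hand-rolled argument handling for a throwaway script; no framework needed.
import sys


def main(argv):
    # Honor -h/--help so callers can at least discover the usage line.
    if len(argv) > 1 and argv[1] in ("-h", "--help"):
        print(f"usage: {argv[0]} LOGFILE [PATTERN]")
        return 0
    if len(argv) < 2:
        print(f"usage: {argv[0]} LOGFILE [PATTERN]", file=sys.stderr)
        return 1
    logfile = argv[1]
    pattern = argv[2] if len(argv) > 2 else ""
    print(f"scanning {logfile} for {pattern!r}")
    return 0


if __name__ == "__main__":
    sys.exit(main(sys.argv))
```

This is fine for one positional arg and maybe two; the next comment's point is that once you go past that, the hand-rolled version stops being the easy option.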


You ALWAYS need a CLI framework, at least for generating help automatically. Otherwise I'll hate you when I want to use your frugal tool and have to read the source to find out how it works.


In which case you would be using click. If I have any expectation that someone else is going to use my script, I give it a README.md, a requirements.txt, a setup.py, and I use click. It literally takes about 30 seconds, and you now have a CLI indistinguishable from any other on your system.


I've generally found argparse to be worth it the minute you need anything other than two args whose usage implies an intuitive ordering (e.g. with mv).
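A sketch of where argparse starts paying off, once there are flags on top of positionals; all names here are illustrative:

```python
# argparse earns its keep once you mix positionals, options, and flags:
# ordering, defaults, and --help all come for free.
import argparse


def build_parser():
    p = argparse.ArgumentParser(description="Zip up files matching a pattern.")
    p.add_argument("src", help="directory to scan")
    p.add_argument("dest", help="output zip file")
    p.add_argument("--pattern", default="*.log", help="glob to match")
    p.add_argument("--dry-run", action="store_true", help="list files only")
    return p


args = build_parser().parse_args(["logs", "out.zip", "--dry-run"])
```

With only `mv`-style `src dest` positionals, `sys.argv[1:]` is arguably simpler; the crossover is the first optional flag.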


Perhaps a non-starter for you if you don't want to involve Docker, but I've had good experiences wrapping a docker invocation in a shell-script shim with the same name as the program and forwarding everything I need into the container. Then folks just grab the shell script and they're off. Added bonus: it's really easy to add update functionality to your tools.
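The shim idea might look roughly like this, sketched in Python rather than shell for consistency with the rest of the thread; the image name and mount points are hypothetical:

```python
# A thin launcher: pull the image (self-update), mount the caller's working
# directory, and forward every CLI argument into the container.
import os
import subprocess
import sys

IMAGE = "registry.example.com/mytool:latest"  # hypothetical image name


def build_cmd(args, cwd):
    # Mount the current directory as /work so the tool sees the user's files.
    return [
        "docker", "run", "--rm", "-i",
        "-v", f"{cwd}:/work", "-w", "/work",
        IMAGE, *args,
    ]


def main():
    # Optional update-on-startup: refresh the image before each run.
    subprocess.call(["docker", "pull", IMAGE])
    return subprocess.call(build_cmd(sys.argv[1:], os.getcwd()))


if __name__ == "__main__":
    sys.exit(main())
```

Everything after `IMAGE` in the docker command line is passed straight through, so the shim behaves like the wrapped program from the caller's point of view.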


So you're replacing a single executable Python script with a dependency that requires four additional separate artefacts (Dockerfile, image, container and the shell script) to be executed...


It's like you conveniently missed the first sentence.

1. Nothing stays as simple as it starts; time happens and new shit gets added.

2. Eventually people do want to use libraries so you're back to square 1.

3. It's "cross-platform" runnable now so those annoying macOS and Windows users can suddenly use it. Though depending on how you do the shim it might be tricky. I've taken to writing the shim in Go lately so I can poop out a static binary that does the rest of the work.

4. It's really easy to keep stuff up to date if your users are non-technical... simply have the shim docker pull a new version on startup.

To the end user it requires just a single artifact. For the developer there's a bit more stuff to manage but it's not exactly like any of this is hard stuff to figure out.


As well as Docker being installed, and configured, which implies admin access to the hardware you're running it on



