Oh wow, today I learned about env -S - when I saw the shebang line in the article, I immediately thought "that doesn't work on Linux, shebang lines can only pass a single argument". Basically, running foo.py starting with
#!/usr/bin/env -S uv run --script
causes the OS to really run env with only two arguments, namely the rest of the shebang line as one argument and the script's filename as the second argument, i.e.:
/usr/bin/env '-S uv run --script' foo.py
However, the -S flag of env causes env to split everything back into separate arguments! Very cool, very useful.
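You can watch the re-splitting happen with a toy script (a sketch, assuming GNU coreutils env 8.30 or newer):
#!/usr/bin/env -S python3 -I -W error
# the kernel invokes: /usr/bin/env '-S python3 -I -W error' ./demo.py
# env's -S then splits that one argument back into: python3 -I -W error
import sys
print(sys.flags.isolated, sys.warnoptions)  # prints: 1 ['error']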
It seems to me that macOS has env -S as well, but the shebang parsing is different. The result is that shebang lines using env -S are portable as long as they don't contain quotes or other special characters. The reason is that running env -S 'echo a b c' behaves the same as running env -S 'echo' 'a' 'b' 'c', so simple command lines like the one with uv are still portable, regardless of whether the OS splits on spaces (macOS) or not (Linux).
True, this should be fine for straightforward stuff but extremely annoying as soon as you have, e.g., quoted strings with whitespace in them, which is where it breaks. Have to keep that difference in mind when writing scripts.
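For example (a contrived case), a shebang like
#!/usr/bin/env -S python3 -c "print('a b')"
works on Linux, where env -S strips the quotes and python3 gets print('a b') as a single -c argument, but not on macOS, where the kernel has already split the line on every space, so python3 receives the mangled pieces with the quote characters still attached.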
The link I posted in my original reply has a good explanation of this behavior. I was the one who asked the question there.
I'm aware of this package for getting other utilities but:
1. I'm worried about this conflicting/causing other programs to fail if I set it on PATH.
2. This probably doesn't fix the shebang parsing issue I mentioned since it's an OS thing. Let me know if that's not the case.
Been doing it for more than a decade and yet to get in trouble. Not one issue. Doing it consistently for my teams as we decrease cognitive load (developing on macs but targeting unix). Others would confirm https://news.ycombinator.com/item?id=17943202
Basically software will either use absolute paths, i.e. it wants to use your OS version of a dependency like grep, or it will use whatever grep is in your $PATH and stick to safe invocations regardless of whether it's BSD/GNU or version x or y
Hmm, I haven’t run this experiment myself but I have in the past faced problems overriding default python/ruby commands in PATH that caused some stuff to fail and had to add some specific overrides for `brew` command, for example.
> Basically software will either use absolute paths
I’ve personally written scripts that break this assumption (that’s a me problem, I guess) so I am quite sure there’s a lot of scripts at the very least that do this.
Nevertheless, you’ve given me something to consider.
The PATH is irrelevant; this is about how the kernel parses the shebang. It executes exactly /usr/bin/env with two arguments, not some other env binary you might have in your PATH.
- The second line is treated like a comment by the shell (and Tcl)
- The third line is executed by the shell to run Tcl with all the command line args. Tcl, however, treats it as a continuation of the second line (still a comment).
Edit: Doing a bit of web searching (it's been a while since I last had the option to program in Tcl), this was also used to work around line length limitations in shebang. And also it let you exec Tcl from your path, rather than hard code it.
If the wrapper itself cooperates, you can also embed more information in the following lines. nix-shell for example allows installing dependencies and any parameters with:
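#!/usr/bin/env nix-shell
#! nix-shell -i python3 -p "python3.withPackages (p: [ p.requests ])"
# (a sketch: nix-shell itself reads the second #! line; -i picks the
#  interpreter, -p supplies the dependencies)
import requests
print(requests.get("https://example.com").status_code)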
env -S should never have been necessary. The strange whitespace-splitting rules of the shebang line are an old bug that has matured into an unfixable wart marring the face of Unix forever. Every time I have to use tricks like the above, I'm reminded that half an hour of work in the 1980s would have saved years of annoyance later. Shebang lines should have always split like /bin/sh.
However, as another commenter wrote, support is not universal (looks present in RH8 but not RH7 for instance). Also, the max length of a shebang is usually limited to about 127 characters.
So sometimes you have to resort to other tricks, such as polyglot scripts:
#!/usr/bin/sh
"""exec" python3 "$0" "$@"
Well this is still a Python docstring
"""
print("hello")
Or classically in Tcl:
#! /usr/bin/sh
# Tcl can use \ to continue comments, but not sh \
exec tclsh "$0" "$@" # still a comment in Tcl
puts "hello"
Such things are not usually needed, until they are, and they make for fun head-scratching moments. I would personally recommend against them if they can be avoided, as they are relatively fragile.
I'll leave the self-compiling C language script "shebang" as an exercise to the reader ;)
I believe parent comment was about `env -S` not being portable rather than `uv` being portable.
I'll say, I am as pessimistic as the next person about new ways to do X just to be hip. But as someone who works on many different Python projects day to day (from fully fledged services, to a set of lambdas with shared internal libraries, to CI scripts, to local tooling needing to run on developer laptops) - I've found uv to be particularly free of many sharp edges of other solutions (poetry, pipenv, pyenv, etc).
I think the fact that the uv tool itself is not written in Python actually solves a number of real problems around bootstrapping and dependency management for the tool that is meant to be a dependency manager.
> I think the fact that the uv tool itself is not written in Python
It's interesting that the language people choose to write systems with (Python) is basically identified as not the best language to write systems to support that language (Python).
To my knowledge, no other mainstream language has tooling predominantly written in another language.
Javascript has quite a lot of tooling written in other (better) languages.
I think Javascript and Python stand out because they are both extremely popular and also not very good languages, especially their tooling. You're obviously going to get a load of people using Javascript and Python saying "why is Webpack/Pip so damn slow? I don't have any choice but to use this language because of the web/AI/my coworkers are idiots, so I may as well improve the tooling".
It's important to use any other language to avoid even the theoretical possibility of bootstrapping complications. All languages that produce self-contained compiled executables are equally suitable for the task.
It's, er, "funny" how people used to make fun of `curl | sh` because of how lame it was, and now you have it everywhere because Rust decided that this should be the install.
You can also install rustup via your package manager and then use it as usual to manage your Rust installations. Though I guess in most cases, a single moderately recent Rust installation works just fine. But it's useful if you want/need to use Rust nightly for example.
uv is the tool, finally. We've been waiting for two decades and it really does basically everything right, no ifs or buts. You can scratch off the 'yet another' part.
uv is not just a dependency tool. It handles packages and dependency management (well), plus venvs, runtimes, and tools. It replaces all the other tools and works better in just about every way.
> Oh, yet another python dependency tool. I have used a handful of them, and they keep coming
Yeah that's my opinion of all the other Python dependency tools, but uv is the real deal. It's fast, well designed, it's actually a drop-in replacement for pip and it actually works.
> I guess no work is important enough until it gets a super fast CLI written in the language du jour and installed by piping curl into sh
Yeah it's pretty nice that it's written in Rust so it's fast and reliable, and piping curl into sh makes it easy to install. Huge upgrade compared to the rest of Python tooling which is slow, janky and hard to install. Seriously the official way to install a recent Python on Linux is to build it from source!
It's a shame curl | bash is the best option we have but it does at least work reliably. Maybe one day the Linux community will come up with something better.
What we have now, a load of different people developing a load of new (better!) tools, is surely what the PyPA had in mind when they developed their tooling standards. This is all going to plan. We've gotten new features and huge speedups far quicker this way.
I don't like changing out and learning new tools constantly, but if this is the cost of the recent rapid tooling improvements then it's a great price.
And best of all, it's entirely optional. You can even install it other ways. What exactly was your point here?
Not using shebang, but I've recently been using uv as an "installer". It's hard to package and distribute python CLI tools with complex dependencies. I used uv in a couple of novel ways:
1. I copied their `curl | sh` install script and added a `uv tool install --python python3.12 my-tool` line at the end. So users can now install my CLI tool with a nice curl-pipe-sh one-liner.
2. I made a tiny pypi "installer" package that has "uv" as its only dependency. All it does is `uv tool install` my CLI tool.
Both methods install the CLI tool, python3.12 and all python dependencies in their own isolated environment. Users don't need to manage python virtual envs. They don't need to even have python installed for the curl-pipe-sh method.
I now get far fewer GitHub issues from users who have somehow mangled the complex dependencies of my tool.
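The second method is just a stub; a minimal sketch of its entry point (package and tool names hypothetical, with uv declared as the stub's only dependency and main() exposed as a console script):

import subprocess

def main():
    # uv ships prebuilt binaries on PyPI, so installing the stub
    # pulls in a working `uv` executable
    subprocess.check_call(
        ["uv", "tool", "install", "--python", "python3.12", "my-tool"]
    )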
I have not checked how that works under the hood, but
> Both methods install the CLI tool, python3.12
Won't that require some compiler (GCC?), kernel headers, openssl-dev/gzip/ffi/other_lib headers to be present on the end user's system, and then compiling Python?
At least that's my experience with ASDF-VM, which uses some other Python-setup toolkit under the hood.
Speaking of funny shebangs, I came up with this to shell execute prolog .pl files, but it should work for any scripting language that has /**/-comments but doesn't support #-comments or #! shebangs specifically:
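/*usr/bin/env scryer-prolog "$0"; exit # */
% (a sketch reconstructed from the explanation below; the exact
%  scryer-prolog invocation may differ)
:- write(hello), nl, halt.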
> The way this works is that this test.pl is both a valid shell file and prolog file. When executing as shell the first line finds and executes /usr/bin/env after searching for the glob pattern /*usr/bin/env. env then executes scryer-prolog "test.pl" which runs the prolog file as a module and halts; of course prolog ignores the first line as a comment /* ... */. Then the shell continues and executes the next command after ; which is exit which stops execution so the rest of the file (which is not valid shell) is ignored. The # makes the shell evaluator ignore prolog comment closer */ to prevent it from printing an error.
This may be the best and worst thing I have ever created.
I recently switched over my python aliases to `uv run python` and it's been really quite pleasant, without needing to manage `.venv`s and the rest. No fuss about system installs for python or anything else either, which resolves the old global/user install problem (and is a boon on Debian). Also means you can invoke the REPL within a project/environment without any `activate`, which saves needing to think about it.
Only downside for calling a .py directly with uv is the cwd relative pathing to project/environment files, rather than relative to the .py file itself. There is an explicit switch for it, `--project`, which at least is not much of an ask (`uv run --project <path> <path>/script.py`), though a target relative project switch would be appreciated, to at least avoid the repetition.
I've experienced a few “gotchas” using uv (or uvx) as a command runner, but when it works (which is most of the time) it’s a big time saver. As a Python dev and curious person my homedir is littered with shallow clones and one-off directories for testing some package or another, and not having to manage this overhead is really useful.
OP, you have a great idea and I’m stealing it. Do you use some non-.py suffix, or is seeing the exec bit set on a .py file enough of a signal for you to know that you can just run the file as a script (and it won’t use the system Python)?
I've been collecting gotchas from using uv myself over the last year to assess whether to recommend it or not and I found surprisingly few. Not zero, but I'm looking pretty hard.
Would love to hear what gotchas you find so I can add them to the list.
I'll write an article on bitecode.dev at the one year mark (about end of feb) about the various situations I tested it in with clients and the results.
I will include a section about the things that it was not good for and the problems with it. Spoiler, there were not that many and they fixed them astonishingly well (even one this week!). But obviously there were some, no software is perfect although they worked hard to not make it too opinionated for the first taste.
* A recently-published PEP (PEP 723) specifies how Python scripts can declare dependencies in opening comments.
* uv is a Python script-runner/package manager which scans for those dependencies, arranges for them to be met, and runs the script with those dependency modules available for importing
* If, in a Python script, you use the first line comment - the shebang line - to have the file invoked using uv rather than python itself, you'll get that effect automagically when "running" the Python script
The result: Python scripts which require setup of dependencies will now "just run".
Also, uv will install the dependencies, but I am not clear on whether or not it will automatically install a Python interpreter for you using this method.
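Concretely, a self-contained script looks like this (the dependency here is just for illustration):

#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]
# ///
import requests
print(requests.get("https://example.com").status_code)

As far as I can tell, uv will also fetch a matching interpreter if none is installed, so the requires-python line is honored too.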
Will now “just run” after you satisfied the dependency of installing uv on your system and prepending these dependency lines to all your scripts, that is.
Imo this saves you no time over just constructing and calling environments the old way. I’m not sure either whether uv lets you load the env only in an interactive session or if it’s just for script running.
Of course it does: after you have installed uv and prepended the dependencies, you never have to do it again. Activating environments etc. has to be done every time, and you also have to keep those envs around.
You could easily wrap something that activates an env as the relevant script is called. And you can just store the env in text and build as needed too.
You can also do the same thing with the nix package manager and pretty much any language or dependency. Like if you wanted a portable bash-script with imagemagick
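Something like this (a sketch; nix-shell reads the second #! line itself, and imagemagick is the nixpkgs package name):

#!/usr/bin/env nix-shell
#!nix-shell -i bash -p imagemagick
convert "$1" -resize 50% "$2"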
Does anyone have any intro to UV guides they found helpful?
I haven’t dove into uv as it just hasn’t been at the top of my list, but I will probably take the plunge soon.
I gave it an initial try thinking it might function as a drop in, but didn’t find any great guides, and just haven’t been back yet. But we’re ~8% into 2025 so it’s time.
One gotcha that annoys me when doing this... The file still needs to end in .py, which I generally don't like for the bins and utilities in my path. Everyone using these shebangs either doesn't hit this or never mentions it. Entirely a stylistic personal preference but still makes me a sad lady.
Oh nice, that's really cool. I'm seeing uv pop up in more and more places. Python really needed this tooling upgrade. Also, I'll make a mental note of -S, awesome.
If you want to do the same with javascript or typescript, you can just use bun; it will auto-install[1] the packages included in there.
Astral are making great tools. There are many points of friction in the Python world and Astral is making them seem obvious to see and obvious to solve. That is hard work.
This reminds me of an old Perl magic spell for invoking it within a Bourne/POSIX shell wrapper along with allowing arbitrary command line arguments[0]:
#!/bin/sh
#! -*- perl -*- -p
eval 'exec perl -x -wS $0 ${1+"$@"}'
if 0;
# It's Perl code from this point.
I really don't understand how it is that other people frequently write one-off scripts that nevertheless need to run in an isolated environment (as opposed to a sandbox that contains a few common workhorse packages like NumPy or Requests).
Of course I have plenty of separate environments set up, but that's because I'm developing things I hope to share with others, so I need to keep close track of their specific dependencies (so I can specify them in package metadata). But even this stuff would generally work with a wide range of versions for those dependencies. I'd imagine this is equally true of small, one-off, personal automation scripts.
But, sure, if you have this problem, that's a great way to use the shebang line. (You can do similarly with Pipx, and I expect you'll be able to with Paper as well, although I might push that sort of thing into an optional add-on.)
If my weird one-off ad-hoc script needs numpy I can bake the exact tested version into an inline script dependency and run it with "uv run" and never have to think about it ever again.
If I don't do that, there's a good chance that in a year I'll upgrade numpy for some other project and now my ad-hoc script will break.
(Numpy is actually a great example here, lots of code I try to run turns out to need numpy 1.x because it hasn't yet been upgraded to work with the new 2.0 - here's a recent example https://simonwillison.net/2024/Dec/24/modernbert/ )
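The pin itself is just the inline metadata block, e.g.:

# /// script
# dependencies = ["numpy<2"]
# ///
import numpy as np
print(np.__version__)  # stays on a 1.x release when run via `uv run`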
They don't care about the isolated environment. They care about having the right dependencies available to the script.
Competing languages like Java don't need or even have the concept of isolated environments. It's an illogical concept. You have a class path and that is about it.
The isolated environment can matter when different scripts that you use depend on libraries that have conflicting dependency requirements.
> Competing languages like Java don't need or even have the concept of isolated environments
The nice thing about scripts is that you can quickly add new capabilities when you need them. They are always under development. To get the same thing in Java, you need a project directory with a build file, which gives you an isolated build environment.
>They care about having the right dependencies available to the script.
Right; my point is that I'd expect one-off scripts like this to keep relying on the same few standard dependencies which would already have been set up in an environment somewhere.
>You have a class path and that is about it.
A Python venv pretty much is just a layer on top of the existing "class path" mechanism (sys.path) to point it at the right place automatically (reproducing the basic folder layout relative to a symlinked executable, so that sys.path ends up with the right paths and so that there's a dedicated folder for installing the dependencies). While it's ugly and not recommended, you can work with the PYTHONPATH environment variable, or even modify sys.path at runtime.
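e.g. the discouraged runtime version of the same idea, with a made-up directory:

import sys
sys.path.insert(0, "/home/me/script-deps")  # hypothetical package dir
import requests  # now importable, if it was installed into that dir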
The isolated environment is required because most Linux systems also depend upon Python, and can have dependency conflicts if you install development dependencies that conflict with system dependencies.
I've been meaning to look into uv. However, since I make pretty heavy use of conda for my Python scripts, I'll usually add a bash function that handles conda activation / code execution to my .zshrc file.
function transcribe() {
# get the active conda environment
active_env=$(conda info | grep "active environment" | awk '{print $4}')
if [[ "$active_env" != "speechrec-env" ]]; then
conda activate speechrec-env
fi
python /users/vunderba/dev/transcriber/transcribe.py "$@"
}
It's handy because it'll always point to the most current version of my script.
What seems to suck about the uvx way of doing things is that your script and dependencies are now in one document. It just seems so much more useful for so many reasons to keep these things discrete, to my mind. Especially when we are talking about uvx, which is something probably only a fraction of Python devs actually use, vs a built-in paradigm of the language. Yes, conda is not built in but it might as well be, with the amount of influence it has on Python development and downstream tools to manage conda envs (e.g. mamba or snakemake).
That's a misunderstanding of uv - the inline script dependencies feature is strictly optional. You're encouraged to use pyproject.toml as a default instead of that for most projects. https://docs.astral.sh/uv/guides/projects/
That's not the uvx way? Dependencies at the top of a script was outlined in PEP 723, which uv/uvx added support for. Not everything is a "project", some files are just one-and-done scripts which should not have to carry the burden of project/environment management, but absolutely should still be able to make use of dependencies. The "uvx way" of doing it just means that it doesn't have to pollute the global/user install, and can even be isolated into a separate instance.
Besides, not everyone uses conda, and it would be quite a stretch to say it "might as well" be a built-in compared to, well, the actual built in, pip! Plus, uv works quite nicely as "just a pip replacement", which is how I started with it, so it aligns quite well to the actual built-in paradigm of the language.
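The pip-replacement mode really is just a prefixed command, e.g.:

uv venv                    # instead of: python -m venv .venv
uv pip install requests    # instead of: pip install requests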
I love having little custom command line apps in my ~/bin directory. I used to, a very long time ago, use Gambit Scheme for this, then moved on to Common Lisp command line apps. More recently I have been using Python for ~/bin scripts but since I use venv for each individual project on my laptop, having a global Python setup with dependencies for ~/bin is annoying. Using uv seems like a great idea, which I will try today: use uv for all command line apps and venv for all other Python coding projects.
I love using this technique, but developing these scripts has always bothered me because there’s no code completion without an actual `venv` to point my editor to. Has anyone found a way around this?
Uv's support of inline metadata is super handy. What if the person running the scripts doesn't have uv yet though? For fun I wrote a double shebang script that will handle that too:
https://paulw.tokyo/standalone-python-script-with-uv/
I've been using uv as a drop-in replacement for poetry. My little usage of it so far has led me to view it as just a better poetry. Does the same things I wanted from poetry, but faster and I haven't hit the dependency resolve issues I hit with poetry in the past (tbf, I think these got fixed in poetry since).
How likely is this to be another npm/yarn situation? I have used a number of Python tools that in the long run turned out to be less portable and less usable than plain Pip and virtualenv. I do use pyenv for a personal setup but not for any production use.
You can use uv and get the benefits today: it combines the functionality from several tools (including pyenv) and the performance is such that it is transformative.
it's the same for uv/pip/poetry except uv is so much better than the alternatives there isn't any contest. pip is the safe default (which doesn't even work out of the box on debian derivatives, which is half the issue) and uv is... just default.
As a Debian user that wanted to use a cli tool only available from npm it was horrible trying to find some sane instructions that didn't assume intimate knowledge of node package structures.
I did eventually figure out I could just do `corepack pnpm setup` then install packages globally with pnpm.
Seems that it interprets a special comment block, then automatically creates a python venv and gathers dependencies. As the post[1] says, it obviates "faffing" with venv and pip to run a script.
That only works reliably for scripts that use the standard library, and even then you have to be careful not to use features incompatible with various minor versions of Python 3.