
This[0] is the Docker Python using Debian Bookworm, so as soon as 3.13.0 (not the release candidate I've linked to) is released, there will be an image.

Otherwise, there's always the excellent `pyenv` to use, including this person's docker-pyenv project [1]

[0] https://hub.docker.com/layers/library/python/3.13.0rc3-slim-... [1] https://github.com/tzenderman/docker-pyenv?tab=readme-ov-fil...




Hmm.. I think this is a misunderstanding.

What I meant is: While I am already inside a container running Debian, can I ...

    1: ./myscript.py
    2: some_magic_command
    3: ./myscript.py

So 1 runs it under 3.11 (which came with Debian) and 2 runs it under 3.13.

I don't need to preserve 3.11. some_magic_command can wreak havoc in the container as much as it wants. As soon as I exit, it will all be gone anyhow.

So, in a sense, the question is not related to Docker at all. I just mentioned that I would do it inside a container to emphasize that I don't need to preserve anything.
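One way to get that "magic command" is uv, sketched below for a throwaway Debian container. The install URL and the `--python` flag are uv's documented ones, but treat the exact session (and the PATH line, which assumes the default install location) as an illustration rather than a recipe:

```shell
# inside the disposable container; nothing here needs to survive
./myscript.py                                    # picked up by Debian's python3 (3.11)
curl -LsSf https://astral.sh/uv/install.sh | sh  # the "magic command": install uv
export PATH="$HOME/.local/bin:$PATH"             # assumes uv's default install dir
uv run --python 3.13 ./myscript.py               # uv fetches a prebuilt 3.13 and runs the script
```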


You can use pyenv to create multiple virtual environments with different Python versions, so you'd run your script with (eg) venv311/bin/python and venv313/bin/python
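Concretely, assuming pyenv has already installed both interpreters (the paths and patch versions below are illustrative; adjust to whatever `pyenv versions` shows on your machine):

```shell
# hypothetical pyenv-managed interpreters under ~/.pyenv/versions/
~/.pyenv/versions/3.11.9/bin/python -m venv venv311
~/.pyenv/versions/3.13.0/bin/python -m venv venv313

venv311/bin/python ./myscript.py   # runs under 3.11
venv313/bin/python ./myscript.py   # runs under 3.13
```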


The magic command in other settings would be pyenv. It lets you have as many different Python versions installed as you wish.

Pro tip: outside Docker, don’t ever use the OS’s own Python if you can avoid it.


> don’t ever use the OS’s own Python if you can avoid it.

This includes Homebrew's Python installation, which will update by itself and break things.


Yep. I only have Homebrew's Python installed because some other things in Homebrew depend on it. I use pyenv+virtualenv exclusively when developing my own code.

(Technically, I use uv now, but to the same ends.)


> Pro tip: outside Docker, don’t ever use the OS’s own Python if you can avoid it.

Why not?


It's unlikely that the OS's version of Python, and the Python packages available through the OS, are going to be the ones you'd install of your own volition. And on your workstation, it's likely you'll have multiple projects with different requirements.

You almost always want to develop in a virtualenv so you can install the exact versions of things you need without conflicting with the ones the OS itself requires. If you're abstracting out the site-packages directory anyway, why not take one more step and abstract out Python, too? Things like pyenv and uv make that trivially easy.

For instance, this creates a new project using Python 3.13.

  $ uv init -p python3.13 foo
  $ cd foo
  $ uv sync
  $ .venv/bin/python --version
  Python 3.13.0rc2

I did not have Python 3.13 installed before I ran those commands. Now I do. It's so trivially easy to have per-project versions that this is my default way of using Python.

You can get 95% of the same functionality by installing pyenv and using it to install the various versions you might want. It's also an excellent tool. Python's own built-in venv module (https://docs.python.org/3/library/venv.html) makes it easy to create virtualenvs anytime you want to use them. I like using uv to combine that and more into one single tool, but that's just my preference. There are many tools that support this workflow and I highly recommend you find one you like and use it. (But not pipenv. Don't pick that one.)
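For the built-in route, a minimal venv session (no third-party tools assumed; `myscript.py` is a stand-in for your own script) looks like:

```shell
# venv ships with Python itself
python3 -m venv .venv          # create the environment in ./.venv
.venv/bin/python -m pip list   # pip scoped to this venv, not the OS
.venv/bin/python myscript.py   # no activation needed if you invoke the venv's python directly
```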


This is the conventional wisdom these days, and a real concern, but unless you are admin-challenged, running your local scripts with the system Python is fine. I've been doing it for two decades plus now.

Yes, make a venv for each work project.


If you're just writing tiny scripts for yourself, sure use the system Python.

If you're doing work on a large Python app for production software, then using system Python isn't going to cut it.


This is a restatement of my post with a restriction to tiny scripts. I don't restrict myself to _tiny_ scripts, some of my libraries are substantial.

Keeping the number of dependencies reasonable is probably the most important factor, rather than lines of code.


Ignore the talk below about pyenv, it’s not even slightly suitable for this task.

You want precompiled Python binaries. Use “uv” for this, rather than hacking it together with pyenv.


Use either one. `pyenv install 3.x` is slower than `uv python install 3.x`, but that's not the most common operation I use either of those tools for. Uv is also comparatively brand new, and while I like and use it, I'm sure plenty of shops aren't racing to switch to it.

If you already have pyenv, use it. If you don't have pyenv or uv, install uv and use that. Either one is a huge upgrade over using the default Python from your OS.


This makes sense for a desktop environment, but for a disposable testing container there is no way that building and compiling each version of Python like that is a sensible use of time/resources.


For that kind of thing I've always either used the tagged Python images on Docker Hub or put the build step in an early layer that didn't have to re-run each time.
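As a sketch, the image-tag route needs no build step at all, and if you do build your own, keeping the expensive step in an early layer lets Docker's cache skip it on rebuilds (the tag matches the one linked upthread; the app paths are illustrative):

```dockerfile
# Option 1: just use the tagged image; nothing to compile
FROM python:3.13.0rc3-slim

# Option 2 (building your own base image instead): do the expensive
# interpreter build in an early layer, so edits to the app code copied
# below don't invalidate the cached build step.

WORKDIR /app
COPY . .
CMD ["python", "myscript.py"]
```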

One other advantage is that you know the provenance of the python executable when you build it yourself. Uv downloads a prebuilt executable from https://gregoryszorc.com/docs/python-build-standalone/main/ (whose maintainer is a very reputable, trusted source), but it's not the official build from python.org. If you have very strict security requirements, that may be something to consider. If you use an OS other than Linux/macOS/Windows on x86/ARM, you'll have to build your own version. If you want to use readline instead of libedit, you'll have to build your own version.

I am personally fine with those limitations. All of the OSes I regularly use are covered. I'm satisfied with the security of the indygreg builds. The libedit version works fine for me. I like that I can have a new Python version a couple of seconds after asking uv for it. Still, there are plenty of reasons why you might want to use pyenv to build it yourself.



