Hacker News

Yeah. I've heard this same comment before in different forms from "Python apologists" about a dozen times. In practice it's still a lot of hair-pulling, because your entire comment assumes that EVERY Python project is already doing things exactly this way and hasn't screwed up a single part of it: https://twitter.com/pmarreck/status/1735363908515295253

but also...

> but stuff like this would usually get installed globally

well, you just killed (apparently unknowingly?) your whole argument right there, because global installs are bad: they are not project-specific, and they absolutely do cause compatibility issues between different Python projects.

If you ever come around to Nix, it takes care of this problem for good (as in, it guarantees that you will never have 2 projects that step on each other), across every ecosystem, not just Python's. Unfortunately, I don't see very many Python projects at all that contain a flake.nix file, which is a damn shame, because it would cause people like me to hate Python just a tiny bit less




Hair pulling? It's the most barebones unix thing you can imagine. If you understand unix principles, you can use virtual environments and python.

It used to be hard. A global site-packages or dist-packages dir is hard. Conda and all those tools make it worse.

A virtualenv is a directory. It contains a copy of Python and everything needed.

You don't even need to activate it. You can simply use the binaries that exist in those directories.

It's literally this simple:

    $ python -m venv pmarreck
    $ pmarreck/bin/python --version
    Python 3.10.12
    $ pmarreck/bin/pip install harlequin
    Collecting harlequin
      Downloading harlequin-1.8.0-py3-none-any.whl (61 kB)
        ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.5/61.5 kB 1.2 MB/s eta 0:00:00
    ...
    Successfully installed MarkupSafe-2.1.3 click-8.1.7 duckdb-0.9.2 harlequin-1.8.0 jinja2-3.1.2 linkify-it-py-2.0.2 markdown-it-py-3.0.0 mdit-py-plugins-0.4.0 mdurl-0.1.2 numpy-1.26.3 platformdirs-3.11.0 prompt_toolkit-3.0.36 pyarrow-14.0.2 pygments-2.17.2 pyperclip-1.8.2 questionary-2.0.1 rich-13.7.0 rich-click-1.7.3 shandy-sqlfmt-0.21.1 textual-0.46.0 textual-fastdatatable-0.5.0 textual-textarea-0.9.5 tomli-2.0.1 tomlkit-0.12.3 tqdm-4.66.1 tree-sitter-0.20.4 tree_sitter_languages-1.9.1 typing-extensions-4.9.0 uc-micro-py-1.0.2 wcwidth-0.2.12

    $ pmarreck/bin/harlequin
Boom, the tool launched immediately. No conflicts with anything else. When I am done, `rm -rf ~/pmarreck`, done.

I did all of this in my home directory.

It literally cannot get easier than this.


And this works with every existing Python project out there? What about `requirements.txt` projects?


It works for installing dependencies from Pip, so yes, unless that Python project is doing something bizarre that it shouldn't be doing.

It's functionally identical to node_modules or Ruby Bundler or Perl local::lib.

It's so weird to me that people continue to hate Python for a problem that literally every programming language has had since shared libraries were invented.


requirements.txt is just a list of packages for pip. It doesn't even need to be called requirements.txt, it could be called poopoopeepee.txt. That name is just a convention.

You could do `for line in requirements.txt, pip install <line>`. They are essentially the same. It is not a magical lockfile. It is unix. It is just a list of packages. If you are in a virtualenv, you will be fine.
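To make that concrete, here is a rough sketch of that loop (hedged: real pip also handles flags, version pins, editable installs, and continuation lines, so use `pip install -r` in practice; `echo` stands in for pip so this runs without network access):

```shell
# Roughly what `pip install -r requirements.txt` does, minus pip's
# real parsing. The requirements content below is a made-up example.
printf 'requests==2.31.0\n# a comment\n\nclick\n' > requirements.txt
grep -vE '^[[:space:]]*(#|$)' requirements.txt | while read -r pkg; do
    echo "pip install $pkg"    # replace echo with: venv/bin/pip install "$pkg"
done
```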

Either activate the venv:

    source venv_directory/bin/activate
    pip install -r requirements.txt
or, skip the convenience and use the binary directly:

    venv_directory/bin/pip install -r requirements.txt
Virtualenv activation just puts the venv's bin directory on your $PATH. You can use the binaries directly without it.
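That claim is easy to verify; stripped down, activation is just a PATH prepend. A sketch (hedged: the real activate script also sets VIRTUAL_ENV, changes your prompt, and defines a `deactivate` function; `demo_venv` is a throwaway name):

```shell
# The essential effect of `source demo_venv/bin/activate`, done by hand.
python3 -m venv demo_venv
export VIRTUAL_ENV="$PWD/demo_venv"
export PATH="$VIRTUAL_ENV/bin:$PATH"
command -v python    # now resolves to demo_venv/bin/python
```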

the production deployment for my core python app lives in /srv/app, the venv lives in /srv/venv

To update packages on the system, it is as simple as

    cd /srv/app
    /srv/venv/bin/pip install -r requirements.txt
Then to invoke this with the correct runtime, it is as easy as

    /srv/venv/bin/gunicorn ...
In this example I am running the gunicorn application server. This is running the specific gunicorn version installed into my virtualenv.

The name `/srv/venv` is my decision. You can call that whatever you want and put it wherever you want. For instance, if you have two projects called application-foo and application-bar, you can have the following:

    /srv/application-foo/app - the codebase (ie, the github repo)
    /srv/application-foo/venv - the corresponding virtualenv
    /srv/application-bar/app
    /srv/application-bar/venv
Some people will even put their venv dir inside of their source tree and exclude it from git (add to .gitignore), but I do not like this approach because my deployments destroy the app dir and unzip the build into that location on each deploy.
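A rough sketch of that deploy flow (hedged: `ROOT` stands in for `/srv/application-foo` so it can run anywhere, the build artifact name is hypothetical, and the network-dependent pip step is commented out):

```shell
# The venv persists across deploys; the app dir is destroyed and
# recreated on every deploy.
ROOT="$(mktemp -d)"
python3 -m venv "$ROOT/venv"               # one-time setup, reused across deploys
rm -rf "$ROOT/app" && mkdir -p "$ROOT/app"
# unzip -q build.zip -d "$ROOT/app"        # hypothetical build artifact
printf 'click\n' > "$ROOT/app/requirements.txt"   # stand-in for the unpacked build
# "$ROOT/venv/bin/pip" install -r "$ROOT/app/requirements.txt"   # needs network
```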

I cannot speak for every python-based project (distinction from pip package) out there. A lot of people do not know what they are doing, open source is literally all the rope to hang yourself with. Anything is possible, and people without engineering experience will glue things together without understanding how they work.

If you are installing something via pip, then yes, you can create N virtualenvs and use them however you want. They are 100% isolated environments.

If you are using homebrew, apt-get, dnf/yum, arch etc... then those are going to obviously vary from distro to distro and that is outside the scope of this discussion.

I try to stick to Python's native tools as much as possible. Using a distribution package is going to cause issues, for sure. E.g., don't `apt-get install python-pil`; use a virtual environment and `pip install Pillow` (the maintained fork of PIL).


alright, I will bookmark this and try this next time I want to play with a python project.

OK, how would I include all of these under the same PATH regime?

So for example say I want to run this project from a commandline location elsewhere... I'd only be able to have one venv activated at the same time in the same session, right?

I guess that's part of my issue with this. I want to be able to access 10 different Python projects' commands from the same command line at any time.


To access 10 different commands at the same time, that is tricky but definitely doable.

First thing that comes to mind, you can use aliases.

To keep it simple, let's use 3 examples instead of 10: harlequin (this project), pgcli (https://www.pgcli.com/) and httpx (https://www.python-httpx.org/)

Setup a main home for all your venvs:

    cd ~
    mkdir venvs
Go into this dir, and create your venvs and install the packages

    cd venvs
    python -m venv harlequin
    ~/venvs/harlequin/bin/pip install harlequin
Now this binary is available at

    $ ~/venvs/harlequin/bin/harlequin
Repeat for the rest

    cd ~/venvs
    python -m venv pgcli
    ~/venvs/pgcli/bin/pip install pgcli

    cd ~/venvs
    python -m venv httpx
    ~/venvs/httpx/bin/pip install httpx
Wash, rinse, repeat. Now you have all these binaries available and can alias them

    alias harlequin="~/venvs/harlequin/bin/harlequin"
    alias pgcli="~/venvs/pgcli/bin/pgcli"
    alias httpx="~/venvs/httpx/bin/httpx"
This is a pain in the ass though and usually simple CLI tools like this do not collide with each other. So that is why I say install globally, or install into your "global junk drawer" virtualenv.
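For what it's worth, the wash-rinse-repeat above can be scripted. A rough sketch (hedged: the pip line needs network access so it's commented out here, and aliases defined this way last only for the current shell unless you add them to your shell rc file):

```shell
# One venv per CLI tool, as described above.
mkdir -p ~/venvs
for tool in harlequin pgcli httpx; do
    python3 -m venv ~/venvs/"$tool"
    # ~/venvs/"$tool"/bin/pip install "$tool"   # needs network
    alias "$tool=$HOME/venvs/$tool/bin/$tool"
done
```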

Meanwhile, for actual projects that you are developing on, those would have their own isolated venv.

I have a junk drawer venv where I install tools like this. If something goes wrong, it is as simple as rm -rf the venv and make a new one. And then I have isolated ones for each of the actual systems I maintain. Again, I use pyenv for this to make it a little easier to manage in conjunction with their specific python versions such that I do not ever interact with my distribution's Python. This is cross platform so it works across mac, linux etc. Very easy workflow, isolated, safe, can get blown away and recreated in a heartbeat.


You know, at this point in the complexity story, you're literally at (or beyond) the level of Nix complexity which is the very solution that everyone who engages with this level of tooling seems to be trying to avoid, and since Nix solves this problem already definitively without having to jump through all these hoops, why don't Python projects just use Nix? Then they could all be colocated on the same machine and all be accessible from the same PATH and all have their specific dependencies none of which would ever collide with each other!

Like, you're LITERALLY making a FANTASTIC argument for Nix usage in the Python ecosystem, here. In fact I'm going to bookmark this conversation now because of how ridiculously complicated your answer is compared to just using Nix.

Here's the Nix whitepaper. It's 14 pages or so. Read it on your next lunch break.

https://edolstra.github.io/pubs/nspfssd-lisa2004-final.pdf


Sounds like something https://www.gnu.org/software/stow/ could help with.


Your Tweet shows that you don't actually know what these tools do. There's not much overlap in functionality between Pyenv, Tox, and Poetry, for example.

Also, nobody active in the Python community will argue that there's 1 correct way to do packaging. That's a serious straw man.

Fortunately, none of those tools you mentioned other than venv are actually required to run Python applications, and there are in fact exactly 2 recommended ways to install Python applications:

1) Use your system's package manager

2) Use a venv, either manually (as shown in the sibling comment) or using the Pipx tool, which just creates venvs for you.

All of the other tools you mentioned (except for Pyenv) represent ~20 years of active development and iteration on how to manage projects and build packages for distribution, and end-users shouldn't even have to be aware of their existence. And Pyenv is just Rbenv but for Python.

As I've pointed out elsewhere, this is exactly the same situation as with literally every other programming language that doesn't generate standalone executables, and is even a problem with those that do, if they rely on dynamic linking. The special ire towards Python in this case is neither warranted nor valid.

Your pinned post on Twitter is predicated on double standards and lack of basic understanding of the tools you're criticizing. I'm not sure that's a good way to represent oneself.


> Your pinned post on Twitter is predicated on double standards and lack of basic understanding of the tools you're criticizing

If it means that I get to see less Python, then I guess... Mission accomplished? LOL

My first, second and tenth experiences with Python were all negative. Everything from trying to exit the REPL the first time I used it and getting chastised for doing it wrong (which meant that it knew what I was trying to do, and instead of just doing that, decided to be a little snit about it, which is about the jerkiest attitude a tool can take), to the Python 2.x->3.x transition pains, to the significant whitespace, to every project stepping on the dependencies of every other project (you might argue that "I just didn't use venv right" but I did follow every package's installation instructions!), to... Well, just read this, he did a good summary: https://medium.com/nerd-for-tech/python-is-a-bad-programming...

There's literally nothing I like about it. It just seems like a poorly written, older Ruby with a lot of baggage and a minefield of gotchas (except that I like Ruby... or did, before Elixir). Ruby should have absolutely fucking had Python's current market share, and I am 1000% convinced that if it did, everyone would be happier. Whenever Elixir tries to eat Python's lunch (like in the ML space which it's doing now), a part of me is as glad as a puppy. The language, despite its ubiquity, absolutely sucks in my mind, is not only a terrible introduction to programming for newbs but also likely an annoying language to work in, and the people who don't see it HAVE to be blind. That is the only conclusion I can come to, and I'm entitled to my opinion, strong as it is.

Next you're going to say that it is unprofessional to have such strong feelings about code and languages. To that I say: So what, dude. I care. Caring means having very positive feelings about the design decisions behind languages, or very negative feelings about the same. (I don't like Go, either. Python's basically a step above PHP on the "tech I want to minimize my time to zero with" department.)



