Asking sincerely: with the various tools like pyenv, poetry, virtualenv, venv and the like, what is actually different between them? What are the best practices today for maintaining projects? I would be grateful for insights.
(Full disclosure: I just use conda in ML & for any lightweight work, I defer to venv which seems to be baked in with Python 3. I have never tried hard to wrangle with this conundrum & many folks are happy with choosing something and sticking to it without asking "why")
- pyenv is used to install multiple versions of python (and even other distributions, e.g. you can install conda via pyenv), independently of the system's python; it can have a plugin to more easily manage virtualenvs for those, but that's optional
- virtualenv and venv are largely the same thing, the latter is a subset of the former merged directly into python; it's used to create isolated sets of installed packages (site-packages), as installing everything globally is messy and inconvenient and can be problematic when two projects have different requirements
- poetry is a packaging and dependency management tool; that is definitely the messiest and most confusing part of the ecosystem, as there is a lot of stuff and movement. Frankly I don't really get it, and the entire thing annoys me to no end
Personally, I find Poetry really nice. It abstracts away pip and venv, but in a way that you can still understand what's going on and jump in if you need. Once you understand that it doesn't do much else, it is easy.
Can you please explain the pyenv use case a bit more? Let us say I needed Py3.6 for something - I can just use virtualenv with the appropriate flags to install 3.6 tooling. How does pyenv help further?
I have often read that pyenv helps maintain different versions of Python on the same system. But if that is also possible with virtualenv/venv, what am I missing?
You're missing that venv and virtualenv can't do that: they can only create environments for your current Python interpreter, not arbitrary Python versions.
There's no flag that I'm aware of that makes venv install a different Python version than the one it was invoked with.
The --python flag for virtualenv will let you specify a different Python executable to use, but requires that executable to already be installed somewhere on the system.
Not really. Not under OSX, brew makes this simple. Under Linux, worst case, you just download the source and compile it. (Not sure about Windows though.)
> I can just use virtualenv with appropriate flags to install 3.6 tooling.
No. I have no idea where you got that idea.
> But if that is also possible by virtualenv/venv what am I missing?
That it's not possible? venv is literally just a module of the standard distribution. And while virtualenv can do a few more things (hence why it remains as an independent project), installing new pythons is not something it can do.
Virtualenv and venv can only create environments for a previously installed python (especially venv: you create environments via the python you want an env for, so it doesn't support creating an env on behalf of a different python than the one it's invoked through).
Venv helps maintain separate virtual environments for a given python installation, but it won't actually install python. E.g. if you have Python 3.6 installed, venv can use it, but you can use pyenv to install 3.7, 3.8, 3.9, ..., which venv can then use.
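A minimal sketch of how the two compose (the version numbers are just examples):

```
pyenv install 3.8.18        # pyenv downloads and builds the interpreter
pyenv install 3.11.7
pyenv local 3.11.7          # pin this directory to 3.11.7
python -m venv .venv        # the stdlib venv module creates an isolated env
source .venv/bin/activate   # from whichever interpreter pyenv resolved
```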
Basically the Python team sat down to consider how they could make their deployment setup worse, and they came up with the idea of splitting packaging, dependency management, virtual environments/isolation, and Python version management so that each needs a different tool, and then had different random projects make several tries at each tool, all at more or less equivalent popularity...
Poetry is basically NPM, Cargo, or Composer, but for Python, and my impression is generally that people who have used those tools see it as incredibly valuable and useful, and people who haven't used those tools don't really understand what the fuss is about. (Disclaimer, I am in the former camp.)
It's a way of wrapping all the different facets of maintaining a package and its dependencies into a single tool, with as many of the details as possible transparently taken care of. So for example, if I clone a new project and run `poetry install` (see the sketch after this list), then:
1. A new venv is just automatically created, I don't need to think about accidentally clobbering my global Python installation, but I also don't need to think about creating my own venv every time. It just happens automatically.
2. All of the packages I need will be resolved and installed as I expect. Additionally, because there should already be a lock file in the repository, I will be installing the same versions as everyone else working on the same project, meaning we're not going to run into weird dependency issues.
3. I can automatically activate and run the environment with ease to run the test or linting tools that I need.
4. I can create wheels and even distribute them to PyPI or other repositories as necessary.
5. I can update the dependencies in the project based on semver requirements, but also individually, on a case-by-case basis, and go back to previous "known good" easily via git.
But most of all:
6. If I switch to a different Poetry-based project, I can do all the same things in the same way, using exactly the same standardised interface. Which means that I can just jump into new projects, support other open source tools and libraries, join other teams, etc, with a significantly reduced onboarding time. No more funky Makefiles, no more figuring out which tool I'm missing: just use Poetry, and everything is consistent, and it Just Works™.
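For concreteness, the sketch promised above; the repo URL and package names are placeholders, not part of any real project:

```
git clone https://example.com/some-project.git   # placeholder repo
cd some-project
poetry install             # creates the venv and installs from the lock file
poetry run pytest          # run tools inside the managed environment
poetry add requests        # add a dependency and update the lock file
poetry update requests     # bump one dependency within its constraints
poetry build               # build sdist + wheel
poetry publish             # push to PyPI (credentials configured separately)
```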
FWIW, it's not perfect. It's not as powerful as package managers for other languages, so there's a handful of use cases like workspaces for multiple projects that are not well supported. It's also not officially blessed by the Python maintainers in the way that Cargo or NPM are, which means it's always slightly more of a risk, and means people need to install it before they use your project. Also even if it were perfect, brilliantly maintained, and used by nearly everyone, there's still a handful of use cases where it would break down - even with Cargo, some projects still have manual calls to rustc to handle their specific use cases.
To me, right now it's an 80% solution (in the sense that 80% of projects will find 100% of their needs satisfied by it). I think it could become a 95% solution if it became popular enough, and if there were more core Python support. (Or alternatively: an alternative that was officially maintained by Python maintainers like venv or pip could be a 95% solution.)
But second, my remark was about that segment of the ecosystem, not poetry specifically or exclusively: poetry is just one of numerous players in a field that's been in flux for years.
What do you mean no? I genuinely don't understand what you mean by that. The whole point of them is to be a consistent CLI for managing projects, dependencies, and packaging. It's even using the same sort of vocabulary and concepts, right down to the .lock file.
How do you see Poetry operating then, if not as an equivalent to those tools?
After 10 years using all of them, in about 50 companies (as a freelancer) with various constraints, contexts, systems and skillsets involved, here are my definitive conclusions:
- stick to regular python installers from python.org if you can. Don't use homebrew, pyenv, conda, the windows store, asdf, etc. They have subtle and less subtle variations that will bite you.
- always use "-m" when you run a python tool, even in venv. This will let you be explicit about which python you use. On windows, use the "py" command to create the venv. No need for pyenv.
- if you are in linux, research how to install python seriously. If the command is simple, you probably got it wrong. E.g: Ubuntu has 3 packages to install, not one. People giving you a one true way are lying to you. Pyenv is a trap.
- learn to use basic venv and pip. Never do something outside of a venv, even for small scripts. You can have one big venv for all your small scripts. Don't even install black outside of a venv. You can get very far with just that. Forget about pipx.
- if you want to have better tooling, use pip-tools (see the sketch after this list). Not pipenv. Not poetry. Not flit. Not pdm. Not anything else. They will fail you. Pip-tools is just good enough and just reliable enough. Embrace manual venv handling. It's not a big deal. Automated tools will betray you.
- learn to be comfortable with the PATH and the PYTHONPATH. No matter what you do, something may need tinkering.
- tox is indeed the best tool to test several envs, but people that need it are rare. Don't bother for now.
- if you want to build a package, use hatch. But you probably don't.
- if you have to use conda, don't use pip or venv at all. Don't use piptool, poetry or anything. Stay on pure conda land. It sucks, but the alternative of mixing them is worse.
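To make the venv/pip/pip-tools points above concrete, a minimal sketch (version numbers and file names are just examples):

```
python3.11 -m venv .venv                      # on Windows: py -3.11 -m venv .venv
source .venv/bin/activate
python -m pip install -r requirements.txt     # plain pip, always through -m

# pip-tools, if you want pinning on top of that:
python -m pip install pip-tools
pip-compile requirements.in                   # writes a fully pinned requirements.txt
pip-sync requirements.txt                     # makes the venv match the pinned file
```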
>if you are in linux, research how to install python seriously.
I.e., build from source after installing the necessary dependencies (https://devguide.python.org/getting-started/setup-building/#...). Use sudo make altinstall to install the specific version. Usually distro-provided ones are pretty old anyway, so you want the faster 3.11 and on.
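Roughly, a sketch of that build (the version is an example, and the distro-specific build dependencies from the devguide still need to be installed first):

```
wget https://www.python.org/ftp/python/3.11.7/Python-3.11.7.tgz
tar xzf Python-3.11.7.tgz
cd Python-3.11.7
./configure --enable-optimizations
make -j"$(nproc)"
sudo make altinstall        # installs python3.11, leaves the distro's python3 alone
```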
>Never do something outside of a venv. Not even install black. You can get very far with just that.
If you just want to write scripts to test things, working outside a venv is fine as long as you manually install pip using get-pip.py, then invoke pip using python3.xx -m pip (make a bash alias for this command, and then delete the binaries in ~/.local/bin). This will install everything to ~/.local/lib/python3.xx/site-packages, which keeps it isolated.
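A sketch of that setup, assuming python3.11 and bash (the alias name is arbitrary):

```
curl -O https://bootstrap.pypa.io/get-pip.py
python3.11 get-pip.py --user                            # bootstrap pip for that interpreter
rm -f ~/.local/bin/pip*                                 # drop the bare pip scripts, rely on -m
echo 'alias pip311="python3.11 -m pip"' >> ~/.bashrc
pip311 install --user requests                          # lands in ~/.local/lib/python3.11/site-packages
```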
>if you want to have better tooling, use pip-tools. Not pipenv. Not poetry. Not anything else.
Also, stick to setup.py for defining how the module installs. It keeps everything in one place and allows you to run custom workflows. For example, if you want to run flake8 on install to check your code, the pyproject.toml route requires both pyproject.toml and setup.cfg. With setup.py you can do it all in one file. Not only that, it gives you extra options to run certain commands.
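As a rough sketch of what such a custom workflow can look like in setup.py (the package name and the flake8-on-install step are illustrative, not a fixed recipe):

```
import subprocess

from setuptools import setup, find_packages
from setuptools.command.install import install


class LintThenInstall(install):
    """Run flake8 before the normal install steps."""

    def run(self):
        subprocess.check_call(["flake8", "mypackage"])  # hypothetical package dir
        super().run()


setup(
    name="mypackage",                # hypothetical project name
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests"],   # example dependency
    cmdclass={"install": LintThenInstall},
)
```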
>if you want to build a package, use hatch.
setuptools is fine for building a wheel - all you are doing is moving files around and zipping them.
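A sketch of the two usual ways to get a wheel out of a setuptools project:

```
python -m pip install --upgrade setuptools wheel build
python -m build --wheel          # PEP 517 build; the wheel lands in dist/
# or the older setuptools-only route:
python setup.py bdist_wheel
```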
As far as OS goes, if you are on Windows, use WSL2 and run python there. Don't even bother with Windows python. If you are on mac, use `asdf` for python installation.
I would advise against all of those recommendations. It would take very long to explain point by point, but in short, my experience is that if someone is asking OP's question, they don't have the experience to deal with the failure modes of what you are proposing.
There are not a lot of things I'm very adamant about.
Yet when it comes to python env setup, I have a strong opinion on the matter, built from seeing hundreds of people failing at it in a thousand ways.
Looking for the best solution will not work.
You have to go with the minimal one that does the job, which has the fewest possible ways to fail. And will still work in 5 years.
Most people like you, giving advice on HN, know too much about it to realize you solve lots of problems without even thinking when you perform your workflow.
But the average python user doesn't have this knowledge. They just want to get things done and have something that works.
I am very confident in these claims, although a HN comment is not the best format to demonstrate it.
I should honestly write an article on it at this point, because so many devs are suffering from this.
A new team has been put in place to work on packaging officially in python. But the fruits will take years to mature.
The first one, build from sources, is incredibly easy to get wrong.
You may end up with an incomplete python without proper tkinter, ssl or zip support. Sometimes it will look like it works, only to fail later down the line when you use some part of openssl that requests doesn't exercise often, or when you install numpy/scipy wheels. Not to mention someone will do "make install" instead of "altinstall" and break their machine with a three-letter mistake, since the apt or rpm python will get erased.
If you wonder how to set up a python venv, you should not compile anything (so no pyenv either). In the same way that if you wonder what oil to put in your car, you should not attempt to fix your brakes.
The fact you suggested it underlines my point that you are too experienced with the stack to give advice that would be useful for most confused devs in python. A C# dev that starts python just wants, and needs, one reliable happy path.
In fact, it's part of the problem. Paradox of choice, with too many solutions that won't fit the average user context.
Funnily, now that I get close to 20 years of python, I find myself going back to the beginner stuff.
What's with the push to encourage people not to learn? Get em started, let em figure the rest out as they run into it. You aren't gonna be here forever.
The documentation doesn't, and cannot, mention how to install said dependencies for each distro and version. It is non trivial, and hand waving that is like the "draw the rest of the owl" meme.
Even if the 10 most popular distros were documented, with all their supported versions, and if it covered windows and mac versions as well, and dealt with people in corporations not having admin rights on their machines, you'd still have so many steps that a human error would be more likely than if you just used what I suggested.
Again showing that you are so good at dealing with complexity you are not even seeing it.
The only way I could convince you at this point would be to make you train 10 different teams, of 10 different people, in 10 different contexts, and give you a million dollars if after one year you come back and see that their department has this stuff streamlined.
But I can't do that so that will have to be the end of this thread.
Similar experience, but disagree with almost everything you have said.
- stick to regular python installers. Don't use homebrew, pyenv, conda, the windows store, etc.
For me pyenv has been a lifesaver, easier to manage multiple versions. Are you manually managing multiple python versions?
- always use "-m" when you run a python tool, even in venv. This will let you be explicit about which python you use. On windows, use the "py" command to create the venv.
Unsure how I feel about this; I normally point directly to the binary being used rather than using "-m", although "-m" still requires you to be in the correct venv. How does it let you be explicit about the python you use, though?
- if you are in linux, research how to install python seriously. If the command is simple, you probably got it wrong. E.g: Ubuntu has 3 packages to install, not one. People giving you a one true way are lying to you. Pyenv is a trap.
poetry has been another lifesaver that removes the boring writing of a setup.{py,cfg}. Pretty much everything can be managed through it, and it feels like a legit build tool for python. No more broken requirements because of differing versions in transitive dependencies.
- learn to use basic venv and pip. Never do something outside of a venv. Not even install black. You can get very far with just that.
Can't disagree with this, although poetry will manage your venvs for you. Understanding venv and pip is useful, but not fundamental. My greatest success with devs has been getting them away from manually managing things; by and large, perm staff don't care about learning lots of tools well, it's just a job.
- if you want to have better tooling, use pip-tools. Not pipenv. Not poetry. Not flit. Not pdm. Not anything else. They will fail you. Pip-tools is just good enough and just reliable enough. Embrace manual venv handling. It's not a big deal. Automated tools will betray you.
Lumping poetry in with pip-tools is weird: it's not a replacement, it's a tool with greater scope than pip-tools. pip-tools provides a subset of poetry's functionality, i.e. dependency management.
Manual handling is useful to learn, but is laborious after a while, hence tools such as poetry.
- learn to be comfortable with the PATH and the PYTHONPATH. No matter what you do, something may need tinkering.
This is a really interesting point; normally modifying PYTHONPATH is a smell in my experience, but yes, it is good to understand. Understanding PATH in general is more useful as a dev than anything python-specific, imo.
- tox is indeed the best tool to test several envs, but people that need it are rare. Don't bother for now.
Agreed, it's a nice thing to do, but I rarely see it unless you are building libs and need to support multiple versions.
- if you want to build a package, use hatch. But you probably don't.
Why would you use hatch over a tool such as poetry, and are they equivalent? I'm interested in exploring hatch, but the comparisons I have found seem to be fundamentally misinformed about poetry, so it's hard to take them seriously.
- if you have to use conda, don't use pip or venv at all. Don't use piptool, poetry or anything. Stay on pure conda land. It sucks, but the alternative of mixing them is worse.
This is a really good point, but it's always frustrating to try to solve this time and time again and still encounter problems.
If you recommend pyenv, it's game over: I will never be able to convince you otherwise.
Because it's a sure sign that you have not had to help a team of geographers in a school or quants in banks or small kids.
Your method will only work for a very small subset of python users, and you haven't met enough of them to even know what errors they encounter.
It's ok, you are a skilled dev, you offer skilled devs advice.
But no, pyenv has too many ways to fail for the average python user. Especially since the "py" command on windows and suffixed pythons on mac and linux work fine for dealing with multiple versions.
It would be too long to debate each point, so I won't. But I don't make those claims out of a will to be right or to contradict people.
I have worked with python since 2.4, in dozens of countries. In Africa. In Asia. In Silicon Valley.
I recommend pyenv, pyenv-virtualenv, and poetry for Mac development these days. It’s a nice modern experience.
If you don’t care about different Python versions then the built in “python3 -m venv” is fine and you can drop pyenv. You can use Poetry to make and manage virtual envs too.
If you don’t need lock files (say you are doing ML or solo scripting) then pip with requirements.txt is fine. Lock files start to become useful when you are collaborating with a team so you know you have exactly the same deps (so theoretically the same behavior).
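For that lightweight case the whole workflow can stay as small as this (package names are just examples):

```
python -m pip install requests numpy        # example dependencies
python -m pip freeze > requirements.txt     # record what's currently installed
# later, or on another machine:
python -m pip install -r requirements.txt
```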
Venv isn't built into Python. You are using Python distributions intended for developers, which is why you will usually have it. Python can be configured to create a distribution without it (which is what typically happens on Linux, where Python is part of the distribution).
Pip with requirements.txt is never fine. It's always the wrong answer to any problem you may think you are solving with it. The problem is it's a very commonly given wrong answer, so, people are used to it.
The reason why it's the wrong answer is this:
You intend for your project to be deployed somehow somewhere. So, you do need a process to find the pieces of the project you are working on to put them where you are deploying them, be it a Docker image, a Python package, an RPM etc. But, pip + requirements.txt doesn't produce any of that. At best, you may hope to have it as an intermediary step in your process.
Now, a very important piece of your deployment is the code you wrote. That's something that's not in the requirements.txt. You sort-of have "pip install -e ." to do this, but it's awfully bad for this purpose because instead of actually installing anything, it's a wrapper around "setup.py develop", which, in turn doesn't install anything, but modifies Python's path where it searches for installed packages. This has a lot of bad / unintended consequences. For example, if you accidentally introduced a dependency on an artifact on your filesystem, it may or may not translate properly into your deployment environment. This is especially bad when you need to use Python code linked with some native libraries, but on your system and on deployment system they are found in different places. It may result in you confusing the libraries (and versions) you think you are using with those actually being used.
In order to avoid all of that, a better way is to create a distributable Python package (i.e. Wheel) and install that, but Wheel has its own way of specifying dependencies, which has nothing to do with requirements.txt. So, you either just don't need that, or try to combine the two somehow...
----
Bottom line: I package a couple dozen Python packages for internal use. Those I have full control over don't have requirements.txt and don't use pip at all. They are easy to install and to test, portable between different environments etc. There are teams which produce their own Python packages which I have to package, and in many cases I wasn't able to convince them to abandon the bad (but popular) practices such as using pip and/or requirements.txt. This is a constant source of problems when it comes to dependency management. But the authors are convinced they are doing it right...
It's the opposite. Venv is built into python and is part of the standard library. It is part of the cpython repo and developed by the python maintainers. Some Linux distributions remove it from the standard library, but windows/Mac always have it. The Linux distributions that ship python without venv are considered an incomplete python install, and the packaging community has considered not supporting that case in the future. At the moment pip tries to allow that case, but in the future it may just require venv and treat Linux distributions that remove it as shipping a problematic python.
> It's the opposite. Venv is built into python and is part of the standard library.
Well, you simply don't know your stuff, but are overcome with confidence... where have I seen this before?
Let's start with this: standard library (without quotes) means there's a standard that describes a collection of library functions that constitute a standard library. Python doesn't have a standard. Well, technically, there isn't really Python, there are Pythons. Each accepted PEP that deals with the language creates new language, they are just similar enough that they all, collectively, can be referred to as "Python" for many practical purposes.
Python is notorious for its vague and incoherent use of nomenclature. It doesn't follow its own definitions, but it also doesn't create good definitions to begin with. Someone in Python's history decided to call a chunk of Python's token implementation "the standard library", and the term stuck. But, in reality, nothing makes code found in CPython repository a standard, nor is it even in any way prescriptive for other implementations.
There's a bunch of other code that's optionally included or enabled in the "standard library", and there's no formal document that describes what should be included. In particular, there are a bunch of features that are only enabled by compilation switches (eg. various garbage collection helpers). And you cannot possibly argue that a mere presence of those features in CPython's source code somehow makes them necessarily a part of the "standard library", because then the very definition of "standard library" becomes contingent on the compiler options you provided when compiling Python.
Now, whatever pip authors think is "standard library" or "complete" or "incomplete" Python distribution shouldn't bother anyone really. Their opinion is just as valid as mine or that of my cat: none of us defines nor controls what any of those things are.
Subsequently: who told you that there aren't any Windows or Mac distributions that don't have venv? I can easily make however many I want. But, most importantly, again, who cares? Why does it even matter? To show that venv isn't a necessary part of something, it's enough to find a single example where it's not included, you don't need to search for places where it's included...
The Python documentation defines the standard library and includes venv (https://docs.python.org/3/library/venv.html) as part of it. There is also the python steering council/core developer group, which has had discussions with debian/other distributions about this issue and considered those versions of python incomplete. docs.python.org is pretty much a formal document, comparable to how c++ has cppreference for defining the standard library.
It matters in the sense that packaging/python tooling may in the future treat a distribution that doesn't include it as an unsupported use case and just reject it. A distribution made by a vendor can do whatever it wants, but I do consider calling it a python implementation problematic at that point. If a vendor can change the implementation of python however they like and remove large subsets of it, then the name has little meaning.
There's so much wishful thinking in what you write...
No, and no, and no.
Nobody cares what Python documentation defines: it's not a standard. Python doesn't have a standard. It's convenient to not have a standard, especially if you don't know what you are doing (and I don't use this pejoratively). This makes things more flexible. This, particularly, benefits you if you only ever have one implementation, because it creates a lot of room for experiments.
This, of course, hampers community contribution. If you don't have standards, you cannot have competing compilers or interpreters: there will always be a flagship and those trailing far, far behind. Standards also ensure that, come what may, in the future you will be able to reproduce your work, if you followed the standard. Otherwise, well, you have Python. Today, you can no longer build Python 3.6 on a modern Linux (I might be wrong, and it's 3.5, don't take my word for it, but it's surprisingly recent), and the current version of Python will not run code written for 3.6. So, you end up with useless code. And, since we are talking about Python, the volume of useless code is astounding.
Compare this to, eg. Linux kernel which is written in C89. Python didn't even exist when this standard was created. Consider what will happen if you try to run code written for Python 2.0 (I think this was the first public release?) today? -- This is how you know the difference between having a standard and not having it.
----
PS. And of course, you chose to link to something irrelevant. However, this is what you "forgot" to link to:
> The Python installers for the Windows platform usually include the entire standard library and often also include many additional components. For Unix-like operating systems Python is normally provided as a collection of packages, so it may be necessary to use the packaging tools provided with the operating system to obtain some or all of the optional components.
I couldn't find where the Python documentation defines what's "optional" in its "standard" library. So, there's no way to tell whether "venv" is in the "standard" library or not. Even the authors aren't sure, but, ironically, the authors don't have the authority to tell one way or the other.
Your demeanor here is quite argumentative and condescending; I'd suggest you (re?)read the HN guidelines. If you want to vent a diatribe to feel better, please do it to your rubber duck instead of here.
```
Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.
Please don't fulminate. Please don't sneer, including at the rest of the community.
Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.
When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."
```
I've been using miniconda as well recently to beat some sense into python. Conda is nice for local development. And then I package it up using docker. Dockerized development environments are of course nice as well and I plan to do more with those. Conda seems a bit heavyweight in comparison. It seems you end up with all these layers of half-assed virtualization with python these days. So, I used conda to install python and pipenv. And then you run pipenv shell and install some dependencies. Seems convoluted to me, but it works.
Of course none of this is relevant when you use docker. Kind of the whole point of docker. Isolate everything at the file system level and it stops being a problem that python wants to barf files all over it and make an unholy mess of things. The proper fix would be to not do that and be more like go, rust, and others and apply some hygiene with respect to where things go (i.e. not in some global directory). Would that be such a massive change for python to not do that?
I had the same question. I also just stick with venv for different py versions. What does pyenv give that venv doesn't? Similarly, when you are used to venv + pip, why poetry?
Oh, in all this I now see nix nudging its way in, and I am feeling a bit anxious that I am missing out on something!
If you use conda, you are in a different world, so, none of what you listed above is of any use to you. But, say, you wanted to go from Anaconda Python to what I call PyPI Python, then there are two distinct ways.
Way 1: You prefer consensus over quality
Python infrastructure tooling is god awful. It's so awful it's on the level of its own. You just don't get things so bad from other popular tech. Python also has a community that's driven by personalities and dogma, because most of Python community is very bad at programming, so, instead of deciding for themselves the community is dedicated to following "best practices" which they don't understand, but which guarantee that a randomly selected person from the same community will be able to use your project.
If this is your case, then strike out poetry and virtualenv from your list, and that's the typical setup. For people who are afraid of compiling Python pyenv provides a workable solution, while venv is the most popular tool for creation of virtual environments mostly because it's bundled with Python binary distributions (or those built by pyenv). Pyenv has its own virtual environments, but there's no reason to use those.
Poetry is a project that is mostly dedicated to installing Wheels, which are Python packages. It also has its own virtual environments. The problem with Poetry is that it can only deal with a subset of all packages that may be installed using other tools, i.e. pip and setuptools. It's somewhat better organized, but because it's a non-solution for a large fraction of problems in its domain, it will only work for you if you can make sure you will never need anything from that fraction.
Virtualenv is just a Python 2.X thing. You are unlikely to need it today. Basically, it does what venv does, however venv doesn't support Python 2.X.
Way 2: You want it to work well
I have bad news for you... that's not going to happen. Yet, you can improve your quality of life by ditching pyenv. There's no reason to use it if you aren't afraid of building Python from source. That's how I always do it, and that's what allows me to make debug builds if I need those, or apply patches if I need them, or simply build slimmer Python distributions if I don't need some features (such as tkinter, or image processing libraries etc.)
----
NB. venv isn't baked into Python 3.X. You may choose to build a distribution w/o it (which is often the case for Python installed with Linux distros), but it's common in distributions intended for developers.
venv is a bundled virtualenv subset. It came after it. Now you don't need to install virtualenv since venv comes bundled with python. Don't concern yourself.
pyenv and poetry are wrappers and do different things.
You would use pyenv to create a virtual environment. Or venv directly. Or the older virtualenv tool. Stick to pyenv as it is easiest.
You can setup your shell to activate/deactivate automagically for you when you cd into your project directory using something like: https://github.com/direnv/direnv
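For instance, a minimal .envrc sketch, assuming direnv is already hooked into your shell:

```
# .envrc in the project root; direnv runs it whenever you cd in
layout python3          # creates and auto-activates a per-project virtualenv
# optionally pin the interpreter first with pyenv:
# pyenv local 3.11.7
```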
I would avoid poetry.
Edit: bah, somebody answered already
tl;dr - Setup direnv and pyenv once and foggetaboutit.
I have read on occasion that pyenv seems to be woefully unmaintained. Is that true? Is it a good move to learn poetry instead?
Another observation I have is that although pip installs in an environment are sanitized, they share a global cache (a tarball dump under .cache/, for example). Is there any way pip packages can be further compartmentalized? Is there any merit to pipx, which I hear mentioned at times? Thank you for answering.
pipx is for installing python scripts as cli tools, as an end user.
Say there is a thing called ytb-dlp that lets you download YouTube videos. Instead of creating a virtualenv for it yourself, activating it, pip installing this thing inside it, and then having to cumbersomely use it like that..
pipx install ytb-dlp. It will do all of that for you and put it on your path.
> I have read on occasion that pyenv seems to be woefully unmaintained. If that is true, is it a good move to learn using poetry instead.
> Because... you want more network traffic for no reason?
I don't intend separate copies. But what about some kind of compartmentalized structure to avoid possible package-related issues? E.g. if I had an old defective version, or one with a vulnerability, I don't want it to be broken across any & all workspaces reusing it.
Thanks for your comment (despite the irritation that I can guess. I presume you expect everyone to understand what you already know in a single stroke. But that seldom happens in the real world.)
To someone unfamiliar with many of these Python configurators - it definitely helps. Thank you for answering.
This is particularly nice because your build tool is also your test/script runner, meaning you have one less dependency to manage. The Pyenv technique described here should work as well with Hatch as with Nox.
Is there a good explanation as to why python-level tools like nox and poetry seem to avoid integrating with pyenv?
Pyenv seems to be the de facto python version manager, and Pipenv seemed to integrate well with it. Why has the python tooling community moved away from it?
pyenv is really slow. It'll increase the time it takes to run `python` by a ton. For me it makes `python -V` take ~120ms but without pyenv it only takes ~13ms.
This is amazing! Thank you for this wonderful project. It looks like you've taken care of many issues I've experienced with asdf (startup performance, shims, bad/confusing commands, etc.). It looks like I might not need to add anything to .*rc files if rtx is in my path? Either way, I can't wait to try it!
> I might not need to add anything to .*rc files if rtx is in my path
Generally speaking you'll probably want to use `rtx activate` in your shell rc. However you could use the shims instead if you just wanted to add a directory to PATH instead.
Of course you'd be getting rid of one of the major selling points of rtx by using shims, but it's so much faster than asdf the shims aren't as bad as they are in asdf.
I don't think it's for any particular reason, other than that nobody has done it yet. I would imagine that the response of most projects would be "PRs accepted", unless they consider it explicitly out of scope.
I could see some projects not being interested in automatically installing versions using Pyenv. But I think most projects wouldn't mind the option to defer to Pyenv before searching PATH, or to otherwise configure a custom Python search path that could be derived from Pyenv and/or other sources.
Note that Pyenv is itself basically a pile of shell scripts, and it doesn't work on Windows (although there is a Windows port of it). There might also be a general expectation that this functionality is too complicated to support in a way that makes everyone happy, and that users need to figure it out on their own, e.g. using the technique described in the article here.
You can also go the other way and use the Pyenv-Virtualenv plugin.
Python never followed that or any other absolutist claim.
Python is a huge ball of yarn comprised of donations from perhaps hundreds of individual developers w/o strong ties to each other. If you look inside the project, you find plenty of distinct styles and approaches coexisting and overlapping. The moderation is very weak / non-existent.
Installation utilities, however, have been in Python since very early on. "distutils" dates back to, probably, one of the first public releases. It has since then always been part of the "standard" library, but it's going to be removed in the name of chaos, anarchy and entropy in one of the upcoming releases.
There was never any great design in Python in general, or in distutils in particular, but through a series of bad decisions, which were quickly carved in stone, we ended up with the system we have today. Setuptools was the first community project which grew out of recognition of the inadequacy of distutils, but, instead of trying to replace it, it was designed to be a drop-in replacement in terms of interface and was made to depend on distutils in order to do anything useful.
This resulted in setuptools being a lot more convoluted and rigid than desired. Few people wanted to participate in this project, and so it was proclaimed dead several times, setuptools2 was supposed to replace it etc... but the decisions carved in stone wouldn't allow anyone to really dislodge it.
Gradually setuptools grew more hair, another pair of horns and wings by merging with distribute, easy_install, pkg_resources and a bunch of other smaller evils, making itself ever more indispensable. Somewhat in parallel to that process, pip was starting to pick up speed. Pip promised a lot, but never delivered on those promises (e.g. setuptools was never able to uninstall packages, and while pip has an "uninstall" command, it's broken by design).
And then anyone who made any sense abandoned Python. Python community was left with Python Foundation people or PyPA in this instance, who are not competent to deal with packaging. The entire activity today around packaging in Python may be described as chaos. It's adding random features based on the assumption that existing features work when they don't. Trying to get rid of the only useful tools while replacing them with tools that rely on the tools being replaced etc. It reads like some sort of an absurdist novel if you have to dive deep into it. And yet, it's a technology used by millions, shuffling trillions of goods and services around the world...
And... you think that by writing how good you are at something you actually become good at that thing? Like, I mean, you think that if you write "I, drewcoo, can run 100 meters faster than Usain Bolt" you will indeed run faster than Usain Bolt?
What that document confirms is that at one point, when Python was cool, and people liked to talk about Zen Buddhism (probably inspired by Hofstadter's book), someone, semi-jokingly wrote that PEP. And this is all there is to it... You cannot be possibly delusional enough to think that eg. "explicit is better than implicit" is somehow provable or enforceable or can even be properly defined... Like, a lot of Python programmers would agree that context managers do implicit state management or that, in general, decorators create implicit behaviors attached to code of the functions or classes they decorate... and yet Zen of Python PEP still exists, right? So, what makes you think it's some sort of a rule that's never broken by the language it's allegedly applied to?
There's no reason to use pyenv, pipenv or nox at all, ever. Whether they want to cooperate with each other, or stand together and cry in a corner -- the community shouldn't care. The community should recognize how little value these tools offer (more like negative value, really), and demand from the people responsible for this part of tooling (i.e. PyPA) to get off their rear and make themselves useful, or step down and clear the way for people who know how to do this stuff.
The proliferation of useless tools that advertise themselves as solutions to unnecessary problems created by the core of Python development simply shouldn't be happening. If Python community wants to overcome the ridiculous state it's in right now, it needs to concentrate on fixing the problems at their core, not on creating another layer of duct tape around the rotten core.
`pyenv` was recommended to me as a way to decouple my Python from the OS's one, in order to be able to run a more current one.
But it has been irritating me more than benefiting me, because apparently I need to recompile Python on every new patch release, also for ARM on a RasPi, which takes forever, and then manage the virtual environments related to it. I can't see it as a solution to my problem.
My problem was that I was using `virtualenv` and when I upgraded Ubuntu 20.04 to 22.04 it blew up all my projects until I recreated all the virtual environments. And I have a lot of them, at least three dozen dispersed across around 15 Linux installations. So I need to find a way to either migrate everything automatically via a script, or use something like `pyenv` to lock my projects to a specific version, so currently I'm holding back a couple of 20.04 installations.
I don't know what to do, this kind of exploded on me, I should have seen it coming.
The problem pyenv solves is that of programmer's competency. If you are not afraid of building Python, you have no use for pyenv. Python is very easy to compile, it doesn't have a lot of dependencies that need special setup.
As for your question about environments: if you build and install Python yourself, you will most likely use "make altinstall" to do that. This installs Python as /usr/local/bin/python3.X. Then, when you create virtual environment by running "python3.X -m venv" it will always find your Python and will not require any changes if you decide to also compile a different version of Python (because that will end up being python3.Y).
Most likely, what happened is that you ran "python -m venv" and it used the system's Python to create it. Once you upgraded 20.04 -> 22.04, you got a new system Python, and most of the packages in your virtual environments are now incompatible. You may similarly handicap yourself if you use pyenv: it will just add another level of indirection, but will use an unversioned Python executable to create the virtual environment.
If you want to salvage those environments, then find out what version of Python you used on 20.04. Compile that, use it to create a blank virtual environment. Copy the old virtual environment directories "lib" and "include" into the new environment, also copy files from the old "bin" directory into the new one, if they aren't already there.
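A rough sketch of that salvage recipe (the paths and the 3.8 version are only examples):

```
# after building the old interpreter (e.g. 3.8) and running `sudo make altinstall`:
python3.8 -m venv ~/envs/myproject-new                              # blank env on the old version
cp -r  ~/envs/myproject-old/lib/.      ~/envs/myproject-new/lib/
cp -r  ~/envs/myproject-old/include/.  ~/envs/myproject-new/include/
cp -rn ~/envs/myproject-old/bin/.      ~/envs/myproject-new/bin/    # -n: don't clobber the new files
```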
> My problem was that I was using `virtualenv` and when I upgraded Ubuntu 20.04 to 22.04 it blew up all my projects until I recreated all the virtual environments. And I have a lot of them, at least three dozens dispersed on around 15 Linux installations.
Using Nix instead of virtualenv and pyenv might be worth it for you here, though migrating each thing over will be some work.
Yes, and unless those bugs affect your development on your dev machine, you can live without them and only update whenever you feel like it (regarding security, if someone can remotely run their Python code on your dev machine, a security update for your Python version is the last thing that will help).
Or do you mean you use pyenv to handle multiple Python versions in production servers?
Not on production, but as a publicly accessible, nginx-proxied aiohttp server running on a Raspberry Pi. For personal use, like having a long-lived WebSocket connection to my home server.
I generally thought it was best practice to always update to the latest patch release. I mean, that's what `apt` is doing for the system-installed version.
>I generally thought it was best practice to always update to the latest patch release. I mean, that's what `apt` is doing for the system-installed version.
Well, if it's publicly accessible, yes. Though even there you don't just "update to the latest patch release"; you need to check it first (at least with your test suite), as it might break your functionality and even introduce security issues due to that, regardless of what the upstream "semantic" versioning implies.
I also think that would be a good addition, although most of the time I would probably still prefer to run sessions in serial, and parallelize tests within each session using pytest-xdist. That gives you fairly fine-grained control over how tests are distributed among workers, grouped either by test function, class, module, file, or arbitrary groups using a decorator: https://pytest-xdist.readthedocs.io/en/latest/distribution.h...
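For reference, the flags involved (from pytest-xdist; worker counts are examples):

```
python -m pip install pytest-xdist
pytest -n auto                   # one worker per CPU, tests distributed individually
pytest -n 4 --dist loadscope     # keep tests from the same module/class on one worker
pytest -n 4 --dist loadgroup     # group tests marked with @pytest.mark.xdist_group(...)
```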
You're not wrong about the proliferation of tools, but I'm hopeful it's a necessary waypoint to consolidate on fewer tools.
Take dependency managers; for a long time there was just virtualenv+setuptools, then came Conda, then Pipenv. That was it for a while, until Pipenv got displaced by Poetry. Now we have at least three others vying for mindshare: hatch, PDM, and Pyflow. They all have a ton of overlap with Poetry, but they're all a bit different.
I think venv/Conda/Pipenv was a tolerable set: barebones, heavyweight for ML, and a tool that feels like what other langs have. I don't think we can handle 4 Pipenv replacements for long, so I hope we see some of these tools replace each other in the coming years. I'm not sure if the standardization of pyproject.toml encourages or discourages this proliferation, but I think it's a huge win for the community vs setup.py.
I feel like a boomer just compiling Python from source and using venv/pip/setuptools/wheel until all the stupidity blows over.
I wasn't even aware of hatch, PDM and Pyflow. Every new tool is just trolling the Python community. I even submitted a couple pip patches, never again.