What the Heck Is Pyproject.toml? (snarky.ca)
133 points by BerislavLopac 56 days ago | 194 comments



Python desperately needs a better packaging solution than pip from Python Software Foundation. Stop all new features until this - the biggest pain point for Python across all expertise levels from newbies to experienced Python programmers - is officially solved. No pipenv, no poetry, no conda, etc from third party devs who sometimes get tired of the pressure [1]. It is immense. Combine all this virtualenv stuff with great dependency management into one tool. In the spirit of Python, there should be only one way to package, manage dependencies and virtual environments. It should be super clean (see Go).

Python needs a BDFL to enforce the hive mind. It is a fucking mess and it's unacceptable. Python is otherwise the most incredible language of our times. It is so far ahead of anything out there for general purpose use.

Relevant xkcd: https://xkcd.com/1987/

[1] https://github.com/pypa/pipenv/issues/4058


Do note that Python the language is developed by a separate group from Python the packaging ecosystem. So even when Guido was BDFL he stayed completely out of the packaging situation.

And there is a packaging BDFL, but there's a severe lack of time and effort in the form of volunteers to tackle a lot of the projects, big or small. Plus rolling these things out takes years and it takes even longer to gain traction as people are often reluctant to change their build tools once things work.

As an example of timelines, take what my blog post covers. PEP 518 and the concept of pyproject.toml was conceived in 2016, four years ago. Getting PEP 517 finalized and it all worked into pip didn't happen until pip 19 in January 2019, over a year ago (https://pip.pypa.io/en/stable/news/#id203). And a year later there's still enough of a need to educate folks that I wrote this blog post to help get the word out.
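For readers who haven't seen one, a minimal pyproject.toml under PEP 518 just declares what is needed to build the project (the setuptools/wheel version pins here are illustrative):

```toml
# Minimal PEP 518 pyproject.toml: tells pip what it needs to build the project
[build-system]
requires = ["setuptools>=40.8.0", "wheel"]
build-backend = "setuptools.build_meta"  # PEP 517 build backend
```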

IOW the timelines are huge and the pay-off is way down the line. And that's if you manage to be motivated enough to do the work. Burn-out is a huge problem in the packaging world. I mean reading that you think "it is a fucking mess and it's unacceptable" doesn't motivate me and others to keep trying to improve things if that's how us volunteers will be treated for years to come while we try to rectify the situation. Add on to the fact that we aren't getting paid for this and it makes the idea of pulling out my laptop tonight to work on some packaging stuff feel rather pointless.

So please be understanding that people are working on things as best they can and that overt negativity about it all will work against you getting what you want.


Definitely, it is always good to keep in the back of our minds that FOSS is developed by people in their free time and on a volunteering basis. No one owes us anything. No one should feel entitled. I am sorry if my complaints came across as such. Python runs the world and it is because of the contributions of many amazing people.


> Definitely, it is always good to keep in the back of our minds that FOSS is developed by people in their free time and on a volunteering basis.

Please don't over-generalize. Volunteering is not the only way FOSS is developed. E.g. the Linux kernel, Gitlab, Firefox, Golang main tools…


This is BS. Most other languages, with the same constraints, have done far better. Take Ruby as one example.


Ruby serves one ecosystem (web), barely.

Python serves several huge ones: web, datascience, 3d graphics, sysadmin... all with different needs, different conventions, different “standard” tools to do this and that. The community is orders of magnitude bigger than ruby by now, and anything that requires coordination is much harder to accomplish than in a lot of other ecosystems.

Python is now on par with ecosystems like C/C++ (where different build and packaging conventions are legion) or Java (where every 10 years they write a new build tool on top of the previous one(s) to hide the complexity, and still, like Python, they struggle to make simple executables 30 years on...). It’s just a hard problem to cater for so many different needs.


That’s the “oh we can’t use methods from others, we are too special” argument, which is always BS. Coupling the misapprehension to a snooty misconception is doubly unfortunate.


> Python serves several huge ones: web, datascience, 3d graphics, sysadmin... all with different needs, different conventions, different “standard” tools to do this and that.

Them having different conventions and standard tools is the failure, not a constraint.

What are the different needs that these ecosystems have which mean they couldn't use a common package manager? I don't believe there are any.


For instance, in the scientific computing world, compiled extensions are hugely important, whereas they're relatively rare in web development.

There's probably no logical reason why a single common package manager for these domains is impossible. But they have different priorities, and people write tools that solve their own problems. Who is going to decide which package managers are unnecessary, and convince all of the users who are happy with them to switch to something else?


NPM has excellent support for compiled extensions, despite being mostly used for web development.


I think something people are forgetting when comparing Python's packaging system to other languages and runtimes is how old Python and its ecosystem is. For instance, Python's public release predates Linux's public release(Feb 1991 versus Aug 1991). The first release of setuptools was 2004 and it isn't like people weren't sharing Python code even before then (https://github.com/pypa/setuptools/commit/8423e1ed14ac1691c2...). That's over 5 years before npm was released (https://en.wikipedia.org/wiki/Npm_(software)). That means npm got to learn from Python and any other project that predated it. It also means they didn't have Python's momentum to contend with (I certainly never expected Python to become _this_ popular, and that makes it very hard to change as you literally have tens of millions of people to re-educate when you change something).

So while it's fine to say you wish Python's packaging approach were as good as another language's, please do so knowing that it is not fair to say that if one community got something to work for their needs it means Python should have been able to as well; it's not an apples-to-apples comparison, considering the trees were quite possibly planted at different times and are different varietals.


The age also means Python has had plenty of time to come up with something new!


I'm not sure how old you are, but if you don't know the name "Borland" you might not realize how novel it is to have a compiler on every major OS. While Python's age means it has been around to see a lot of changes occur, that also means it had pre-existing practices and code to also consider whenever a shift in the industry occurred. Plus in hindsight things look obvious, but during the transition it isn't obvious what will stick and what won't.

It's pretty remarkable that things have continued to function since my 17 years with Python alone have seen things like:

- Compilers becoming available on all OSs

- Linux becoming a thing

- Laptops becoming common

- Smartphones

- Web-based email clients

As I have said a few times in comments on this thread, it's fine to feel like there's room for improvement, but trying to "stick it" to Python and its packaging ecosystem in some way for not being perfect is pointless for you and demotivating for those who are actually in a position to try to make things better for you (and even if you aren't a Python user you should care to some extent about it, as Instagram runs on Python and thus it is helping to deliver you those cat photos while you're stuck at home).


I’ve been doing this for over 20 years. It’s no excuse, python should have a solution.


It sounds a bit more understandable if you rephrase it as "a small group of largely unpaid volunteers should find a solution for which millions of users and several billion-dollar businesses will happily abandon their current long-running and well-tested solutions".


First, programmers have choices, and Python's lack of a good user experience on this front will turn off people that have better options with other languages which have found a way to deliver a better experience given the same constraints as python.

Second, there is the python foundation, which funds people to work on areas of python. This should be a priority, and can be done by paid contributors.


People in the Python ecosystem have come up with plenty of new things over the years - eggs, wheels, conda, poetry... Coming up with something new is the easy bit. Getting everyone to agree that a new thing is better and stop using/recommending other things is much harder - almost impossible, if those other things serve some use cases fairly well.

The fragmented ecosystem is one of the main complaints about Python packaging today. And that's really tricky, because the main way the open source world knows to solve problems is to create new tools, and that doesn't help when the problem is 'too many tools'.


And it (and by "it" I mean the Python community) absolutely did! The main problem is that it has come up with numerous different somethings that have been competing to become standards, across a wide range of related but separate areas (packaging, versioning, environments...), and only a few of them (like pip) managed to become partial (but not at all universal) de facto standards.


> relatively rare in web development

Not at all. From my previous experience with Ruby I was using gems with compiled bindings to libraries for: mysql, memcached, json, xml, regexp and http. And probably some more I don't recall right now.


Python does all of that and more. There are still occasional issues with the thorniest setups, but binary wheels absolutely took care of the average case. You even get a massive desktop toolkit like Qt with a simple “pip install pyqt”.

Getting there might not be the nicest experience, but most packages nowadays can and do get there.


Working in 3D graphics there are lots of different constraints unfortunately and it isn't an easy solution.

A lot of 3d graphics houses have to roll their own solution.

This is a famous one in the industry and a common approach: https://github.com/nerdvegas/rez

The issue is that the environment varies a lot and there's usually a lot of C++ code that needs to be compiled to interact with the python plugins based on different environments with 500+ libraries mixing & matching...


Node does it better as well. So does java with maven which has been the dominant way to do it in java for over a decade.


I would not use Node as an example of good packaging. It really hasn't been that long since left-pad. Then there's the fact that you still have multiple different tools to do the same thing (npm vs. yarn) and the end result node_modules/ is a fucking mess anyways.

    find node_modules/ -name "node_modules" | wc -l
         327


I think you can certainly use Node as an example of much better developer experience. Even the difference between npm and yarn is fairly minimal, and there is a solid community 'default' in Yarn.

I think part of the reason we ended up with so many trivial libraries and dependencies in the Node ecosystem was precisely because it is so easy to create and publish packages.

I love Python as a language, but working with the packaging even just to install dependencies is like pulling teeth every time you start a new project - nevermind publishing packages yourself.


No it's not... 99.9% of the time, installing dependencies is a breeze. Packaging is where the issue lies, but pulling those dependencies down is painless. You run one command and everything works. Unlike say npm/node/yarn where 50% of the time after coming back to work on a project your node_modules is broken and you have to either reinstall npm or node or node + npm.


The problem we're talking about isn't "installing a dependency", the problem is creating reproducible builds across people and between dev and production. The secondary problem is managing dependencies over the long run (updating, transitive dependencies, etc). Ruby/Bundler does this so much better than requirements.txt by a long shot.


Node's packaging story sucks in many ways, but IMO python's is worse. I spent quite some time working with python when there was no first-party dependency sandboxing (i.e. there was virtualenv but no venv) and even today there are tools that work with one but not the other.

That said, PEPs 517 and 518 sound like they're going in the right direction from the article.


ied ( http://gugel.io/ied/ ) had a great idea for reducing redundant installations in node_modules/, I wish the project got some traction. It worked as advertised when I used it 4 years ago.


That’s a different issue than the one we’re talking about here.


Maven is “so good” that people wrote another tool on top (Gradle) so they don’t have to touch the Mavenstrosity. More or less like Maven was built to get rid of that other nightmare, Ant.


Maven isn't as good as bundler or npm for sure, but it's better than the mess that is the python communities solution.


left-pad comes to mind when you mention Node.


Yes, Python’s packaging is partly bad because it’s based on C extensions, which are intrinsically bad, but it’s also just plain bad compared to its peers, like Ruby and NPM.


NPM also has support for C extensions, and they mostly work reliably (the only problems I've had are forgetting to `npm rebuild` when switching between node versions, or when switching between native mac builds and linux-in-docker builds).


What exactly is "intrinsically bad" about extensions written in C?


In my personal experience, extensions written in C don't necessarily compile at first attempt because the dependency management there is even worse. May be it fails with a missing openssl or readline library and I have to go find whatever OS specific thingy I need to do to recover.


C/C++ have no equivalent of npm/bundler/pip. Instead there are incompatible flavors of Makefiles for incompatible C compilers. It's a hard problem space to work in.


A lot of ruby is based on C extensions too.


It might be said that python solved a LOT of packaging woes by including so many batteries.

Compared to other languages, python has a pretty rich set of modular tools in the standard library.


I am an experienced Python programmer. I often have to collaborate with people who are not. There is no greater hell than talking them through installing a working Python environment. It’s impossible.

Meanwhile, the pip project is full of developers who tell you that your problem is not a problem and you’re discouraging people by pointing out that they’re blame shifting: https://github.com/pypa/pip/issues/4995

I honestly have zero faith that things will get better without a radical change in leadership. Pyproject.toml is a good step in the right direction, but it’s been around for years. They had time for adding the walrus operator, but not for replacing the setuptools, venv, pip Cerberus.


> There is no greater hell than talking them through installing a working Python environment. It’s impossible.

I give a lot of Python professional trainings, and I get to do that regularly, in very diverse situations. It's indeed full of gotchas.

Since it's not going to be solved quickly, here in the meantime is what works if you need to help people set up Python:

1 - Install Python correctly

The first Python download link for Windows is the 32-bit version. You want 64 bits, so you should actually not click on it. Also, make sure they use the latest minor release possible, as early ones can have weird bugs. Tell people to install from the app store if they are on Windows 10, or give them a link to the proper (non-web) installer.

Linux: Python versions may not be available for their Linux distro. Use EPEL for CentOS or deadsnakes for Ubuntu. Other Linux users chose something exotic and must be able to deal with it.

Mac: brew is fine. Official installer too.

Cause: Python support is crazy good. It supports 32 bits. 3.4 supports Windows XP. 2.7 supported Atari and Solaris! So there are a lot of installers, and a lot of versions. But also because the official Python website does a poor job of directing the user.

2 - Run Python correctly

This is the great lie of running Python: you cannot just use the "python" command, which is what every doc and tutorial tells you to do.

On Linux and Mac, tell them to use the suffixed pythonX.Y command, with X.Y being the version of Python they need. E.g.: python3.6.

On Windows, tell them to use the "py -X.Y" command. E.g.: py -3.6

Tell them that anytime they see a tutorial with "python" in it, they should replace it mentally with "pythonX.Y" or "py -X.Y" depending on who you have in front of you.

Cause: people often end up with several versions of Python installed on the same machine, so you can't tell them to just use the "python" command. Of course, Windows and Unix never agreed on a naming convention. What's more, the Windows situation can lead to a PATH problem, and "python" may not be found, while "py" always will be. There is talk of providing the "py" command everywhere.

3 - Install tools correctly

Introduce them to pip. Tell them to never, ever install stuff using admin rights with it, even if told to by documentation or tutorials. No "sudo". No "run console as admin".

If you need to install a tool, such as black, mypy, pylint, etc., outside of a venv, use "--user" to install it for the current user. This requires no admin rights.

Also, don't use the pip command directly. Use "-m" so that you always know from which python you are installing the command for. E.G:

    python3.6 -m pip install black --user # unix
    py -3.6 -m pip install black --user # windows
Then mention that it's only for tools, not libs. Only install libs in a venv. Some tools also should just not be installed at the system level, such as jupyter or pytest, because they depend on their env.

Cause: installing with admin rights can destroy your Python installation. Also, pip may not be in the PATH, or may be attached to the wrong version of Python. It's also because we are twisting a lib packaging tool into providing programs.
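A small, hypothetical illustration of why "-m" removes the ambiguity: whichever interpreter you invoke, "-m pip" runs the pip that belongs to that exact interpreter.

```python
import subprocess
import sys

# -m resolves "pip" against the invoked interpreter's own site-packages,
# so there is no ambiguity about which Python the install would target.
result = subprocess.run(
    [sys.executable, "-m", "pip", "--version"],
    capture_output=True,
    text=True,
)
# The version string names the interpreter this pip is attached to.
print(result.stdout.strip())
```

sys.executable stands in for the explicit "python3.6" or "py -3.6" you would type by hand; the point is that pip and the interpreter can never get out of sync this way.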

This is not specific to Python. Node had to introduce a whole new command, npx, to solve the same problem, and the ones below. It takes a command name, and if it doesn't exist locally but is registered on npm, downloads the package that seems to contain it, installs it in a temp folder with all its deps, then immediately runs it in isolation.

4 - Use tools correctly

It's another lie from docs and tutorials: if you install a tool, just calling the command may fail.

If you installed a command outside of a venv like above, then you should call it using "-m". E.G:

    python3.6 -m black # unix
    py -3.6 -m black # windows

Cause: again, you don't know how the PATH is set up, so there is no guarantee that the command will be available, or run from the proper Python version. People can have completely messed up machines and there is nothing you can do about it. So don't depend on the PATH.

5 - Use venv correctly

The solution to all those shenanigans is the venv. Once you are in a venv, you don't need to tell the version of Python, you don't need "py", "-m" or "--user". You can call commands directly. You can just use "python".

But first, people need venv installed. It's installed by default on Windows and Mac, but on Linux it's often a package to install, like python3-venv, python3-pip or python3-setuptools.

Do not "pip install virtualenv": people will get confused between the tools.

So, make them create a venv: "pythonX.Y -m venv name_of_the_venv" (or py -X.Y).

Show them how to use the python/pip from the venv WITHOUT activating it, so that they understand how it works, and show them that Unix has ./bin and Windows has ./Scripts.

Then show activation, and tell them to install pytest/jupyter and Co inside. Show them that it solves all the previous problems, so that they are motivated to use them.

Tell them to NOT put their code in the venv folder. And that they can't move or rename a venv. Show them "pip freeze > requirements.txt" + "pip install -r requirements.txt" as an alternative.

Cause: historically Python didn't have the venv module, and people used to pip install virtualenv, which means we still have docs about it, and the name stuck. Having a long history makes things complicated. And once again, the differences between Windows and Unix bite us, but that's the cost of being portable. It's not Python-specific. The fact you can't move a venv is an artefact of the venv design that the community never solved.
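As a sketch of the "use it without activating" point above, here is the same idea done from Python with the standard venv module (the demo_venv name and temp folder are just for illustration):

```python
import os
import shutil
import subprocess
import sys
import tempfile
import venv

# Create a throwaway venv; with_pip=False keeps creation fast.
tmp = tempfile.mkdtemp()
env_dir = os.path.join(tmp, "demo_venv")
venv.create(env_dir, with_pip=False)

# No activation needed: just call the interpreter inside the venv directly.
# Unix puts it in ./bin, Windows in ./Scripts.
bindir = "Scripts" if os.name == "nt" else "bin"
venv_python = os.path.join(env_dir, bindir, "python")

out = subprocess.run(
    [venv_python, "-c", "import sys; print(sys.prefix)"],
    capture_output=True,
    text=True,
)
# Inside the venv, sys.prefix points at the venv folder, not the system Python.
print(out.stdout.strip())

shutil.rmtree(tmp)  # clean up the throwaway venv
```

Activation only prepends that bin/Scripts directory to the PATH; calling the venv's python by its full path does the same thing explicitly, which is why it's a good teaching step.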

# Variant

People may want or need to use anaconda or the Python embedded in a system (blender, qgis, etc). Then it's a whole different cycle of problems and solutions, so I won't talk about it here, this post is long enough.

Again, Python's diversity is playing against it: you don't have one popular Python distribution like with nodejs or cRuby. You also have commercial Python distributions, and some are used a lot in the corporate world.

Diversity makes things more resilient and encourages innovation. But it makes things harder.

# Pyproject and setup.cfg

They have nothing to do with all that. They are useful if you want to _produce_ a package to share your code with other devs.


That's a pretty solid guide! It's even useful for me, someone who writes Python (non-exclusively) professionally.

My company settled on pipenv, and we regret it. It's slow and buggy and has crazy behaviour. It's not bad enough that it's worth updating all projects (or rather, there's no good enough alternative) but I regret about once a week having advocated for it.

Note: I think you made a mistake, you say that you can rename or move a venv but then you say you can't.


Thanks. Fixed.

Being stuck at home, I currently give training remotely. If you know people that need training in anything Python (beginner, advanced, scripting, architecture, web, data analysis, etc.), think about me :)


This is an awesome summary that should be placed on the python.org front page instead of all the fuzzy stuff that's there right now.


I've been tasked to make a PR for the pypa website with something like this. If it doesn't get merged, I'll still put a copy of it on some website as a single page. I bought usepyhton.info for this purpose, since pyformat.info did a great job.


Excellent guide, I just found one thing off here:

> If you need to install a tool, such as black, mypy, pylint, etc., outside of a venv, use "--user" to install it for the current user. This requires no admin rights

I think you should add that on Linux distros it's better to use the package manager to install libraries, because a beginner can, and will, hit conflicts with (system) packages relying on a particular version of a library. Only use pip if the package is not available. With the likes of Debian you will have old versions of libraries, but I think fast-paced distros like Fedora do a good job of providing up-to-date Python libraries with the package manager.


Wow, I am not a python developer and avoid python as much as I can because I hear about problems with versions everywhere. I have no idea which documentation is correct and which is old/wrong. With your manual I might dare to try python. Anyone who says (as seen in this thread) that python is not that complicated to install and use should take a look at this monster of instructions and then shut up.


>that python is not that complicated to install

This is not the instructions to install Python.

Those are the instructions to teach anybody in any OS with any configuration to install python and deal with 3rd party dependencies.

E.G: if you are a Windows 10 user, instructions to install python are 2 lines:

- install from the store

- run "py -X.Y" to start python

The complication comes when you have people with Windows, and Mac, and Linux, and they already have some Python installed in various ways, and then want to install stuff with pip, etc.

Which is true for Ruby, JS, PHP, etc.


this is an awesome summary; I use python occasionally, and have run into several of the issues you flag in your post, but would probably do it all over again next time I need to put something in production. So thank you for taking the time to summarize things so thoughtfully.


Well, with the covid 19 confinement, I'll give training remotely. So if you know companies that need their IT department trained in anything Python (intro, senior, sysadmin, API design, flask/django, numpy/pandas, etc.), think about me :)


Your comment was too long for me to read without my eyes glazing over. Maybe it's because I already know this stuff, but I suspect for someone who doesn't know it and just wants to program, it would be intimidating.


This is not the instructions to install Python.

Those are the instructions to teach anybody in any OS with any configuration to install python and deal with 3rd party dependencies.

E.G: if you are a Windows 10 user, instructions are 2 lines:

- install from the store

- run "py -X.Y" to start python

That's it.

The difficulty is to have general instructions for all cases, including packaging. Which is a problem with every scripting language.


> "I am an experienced Python programmer. I often have to collaborate with people who are not. There is no greater hell than talking them through installing a working Python environment. It’s impossible."

I too regularly have to do the same thing, but I have absolutely no issues getting newcomers onto the python ecosystem. You download the installer, you run it, and it works. Or you apt/yum install it with no issues.

Setting up a dev environment for them though is where some hassle comes. VSCode being a particularly bad one in that regard. But with Pycharm, newcomers are on boarded in a matter of minutes.


I see pip's behaviour being called "bizarre" and "baffling", an expectation that the "fix" should be easy (it never is: you "simply" add a flag, now some code paths bifurcate, now you need more tests, documentation should be updated and reviewed, etc.), no pull request, and the pip developer is still patiently explaining why pip's behaviour is not meeting with expectations. But really, it takes a lot of entitlement to expect really anything whatsoever out of somebody else's free time.


I see a weird, surprising, unhelpful design, and a lot of critical comments being deleted.


Weird, surprising and unhelpful are value judgements on your part. It could be a security feature to someone else!

The hidden (not deleted) comments are hidden not because they contain criticism, but because of an unnecessarily harsh tone, which is unhelpful to anyone wanting to read up on the discussion at hand, and definitely not helpful for the motivation of anybody currently doing the unthankful work of maintaining pip. They do a very difficult job, where they have to deal with 20+ years of legacy, many different operating systems, many different Python distributions, many different skill levels in their user base, and even having to fight OSS fatigue due to unconstructive criticism. Pip is as good as it is due to their hard work, and we should be thankful to them.


> The hidden (not deleted) comments are hidden not because they contain criticism, but because of an unnecessarily harsh tone, which is unhelpful to anyone wanting to read up on the discussion at hand, and definitely not helpful for the motivation of anybody currently doing the unthankful work of maintaining pip.

I reread the thread last night and asked, Was I too harsh? Am I discouraging the thankless pip team? Obviously, I am biased towards myself, but no, I don't think I am the one doing the discouraging. They are the ones doing the discouraging.

Go back through the comments and look at who is PyPA official and who's not: the non-members have ideas and members shoot them down. It's not that complicated technically--there should be a switch called "require hashes" instead of an implicit requirement for hashes once you use one. But the PyPA devs just showed zero interest in understanding the problem or helping. Their first comment was that error message was fine, nothing to fix!!!

I ask myself, would I want to contribute a patch to pip? I contribute patches occasionally to OSS. Not every day, but every couple of months when something comes up that affects me. I would never bother. Why? It's obvious from their tone that the patch would be stuck in a queue and never reviewed. (This happened to a coworker at a former job. We maintained a patched version of pip for years because they wouldn't take our change.)

In contrast, I just submitted a broken patch to Go (I broke tests and said so in my patch) and the Go team took it and fixed the tests. Why did I bother to submit a broken patch to Go? Because it was pretty clear from the tone of the discussion that if I got the ball rolling, they would push it the rest of the way over the hill and I want the issue fixed in the next version of Go, so I submitted what I had even without going through and fixing all the tests (which I didn't have the expertise to do).

Now, this is not the same. Go is supported by Google and has money and developers, yada yada. But the Python Foundation is an organization. It has a payroll. It has official members. They could ask the EU for €5m grant to work with N devs for Y years fixing Python packaging… Whatever. I'm not in charge of PyPA, so it's not my problem.

But I do object to the gaslighting in implying that we, the users, are discouraging the poor, unthanked PyPA devs. No, they don't want to be helped, so they put out the vibe to make it clear we should leave them alone.


Well, at your use of the term 'gaslighting' I kind of like to quietly step out of saying, OK, whatever, this discussion has become especially unproductive.


If you know of another term for "someone else making you think the real problem is with you when in fact you're making a perfectly reasonable request" I'd be happy to use it to avoid triggering snowflake cons.


As a comment to your first paragraph, I've found pyenv combined with its virtualenv plugin to be a decently consistent experience.


I've never understood the need for pyenv-virtualenv when virtualenvwrapper exists. To add more confusion there is also pyenv-virtualenvwrapper.


And because those are all very confusing, there's Pipenv, which can interact with both, either or neither.


Do you mean installing Python itself or the needed packages? I think Python itself is really painless (just use the OS Python if you have a recent Linux, or get it from python.org). For end users, just ship Python along with your app in the same package (Docker image, OS X bundle etc).


Using the system Python for development comes with its own problems. That Python is both an interpreter required by the system for numerous scripts and a runtime dependency for other packages. Using that gives people all sorts of problems, especially when "sudo pip install" is used. You can even break a system that way.

Gentoo is the only distro on which I feel comfortable using the system interpreters because the versions are slotted (you can have multiple versions) and system pip install is effectively disabled.

So on other distros I use pyenv. But trying to explain this to non-Linux geeks is hopeless.

I've taken to recommending the following: use system python/pip to install pipx (but with --user). Use that to install "global" tools. Use virtualenvwrapper to manage projects. Use pip-tools to manage frozen environments (ie. requirements.txt). I've looked into replacing the latter with pipenv (which is awful) and poetry (which is better) but I've so far been unable to generally recommend anything else.

I don't even want to think about other platforms and conda and all that.


I agree that there are potential problems with using system Python - the way I've had it work is to only support a recent Ubuntu distro and its Python version (latest LTS) and use the Python 3-included venv module for making a virtualenv. But yeah, Docker is easier and more widely compatible, and works on Mac/Windows too.


Poetry is a modern packaging system for Python. It's really good. I hope everyone converges on it.

http://python-poetry.org/


Poetry is...a little quirky. I've seen it go into infinite loops trying to resolve projects with, like, 10 dependencies. I'm not really sold on it being better than just making a virtualenv and pip installing.


How do other people replicate your venv? How does production replicate your venv?


You pin all your direct dependencies in requirements.txt, run "pip freeze >requirements-frozen.txt" to get a list of the versions of all dependencies (direct and transitive), and then you ship that.

Replicating is just "virtualenv --python=python3 venv && . ./venv/bin/activate && pip install -r requirements-frozen.txt". Am I missing something here? Yes, to be absolutely sure, you need to have the same versions of python, pip, setuptools, gcc (for C extensions), C library dependencies, etc. on both systems that do the build, but that's the case for pretty much any build system out there if you want repeatable builds.

And if you just don't feel like dealing with that, you can just use Docker (aka "put my machine in a tarball and ship it").


> You pin all your direct dependencies in requirements.txt, run "pip freeze >requirements-frozen.txt" to get a list of the versions of all dependencies (direct and transitive), and then you ship that.

What?! Seriously? That is not how it's supposed to work. requirements.txt is already meant to be the frozen dependencies (like the output of pip freeze). Why change that? I much prefer how pip-tools does it, with a new file, requirements.in, that contains the unpinned direct dependencies. I thought poetry did the same thing but with pyproject.toml?
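A minimal sketch of that split (the package names and pins here are purely illustrative):

```
# requirements.in — direct dependencies only, loosely pinned
flask>=1.1

# requirements.txt — generated by `pip-compile requirements.in`, fully pinned
click==7.1.1          # via flask
flask==1.1.2
itsdangerous==1.1.0   # via flask
jinja2==2.11.1        # via flask
markupsafe==1.1.1     # via jinja2
werkzeug==1.0.1       # via flask
```

You only ever edit requirements.in by hand; re-running pip-compile regenerates the pinned file.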


This is worse than how it’s done on most modern languages.


Different commands, but it's pretty much the same in similar languages: Ruby uses bundler configuration instead of virtualenv and Gemfile.lock instead of requirements.txt. JS uses npm/yarn and a lockfile. Go uses go.mod and makes a lot of assumptions instead of virtualenv (these days; it used to be worse).

Modern languages do pretty much the same, just have a nicer wrapper for it.


It's not "pretty much the same". Having a separation between Gemfile and Gemfile.lock is a huge improvement over just having requirements.txt. Having good, easy-to-use commands that manage this stuff for you in a way that works well for the vast majority of use cases (i.e. bundler) is far better than the ill-defined, ad-hoc mess that is requirements.txt management.


Gemfile/Gemfile.lock is setup.py/requirements.txt.

And yes, as I said, a nice wrapper for managing it is missing in Python. The main idea/approach is the same though.


It’s really not the same. Gemfile lists the direct dependencies of your project with variable specificity of explicit versions. Gemfile.lock lists all dependencies after resolving your direct dependencies and locks them to specific versions. requirements.txt is some bastardization of both of these combined, and setup.py is neither of these; it’s for distribution of modules/packages, which Ruby has as well, but it’s not your Gemfile.

edit: also, how do you manage dev dependencies with requirements.txt and setup.py?


You can put dev deps in extras in their own "dev" block, similar to how a Gemfile uses groups, or use requirements-dev.txt for them. requirements.txt is a full list of dependencies after resolution and their exact versions (as produced by freeze). The main difference is that requirements.txt is "as installed in the current configuration", while Gemfile.lock is "for any groups configuration".

Again, not as polished an experience, but the model matches.
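For example, with setuptools extras in a declarative setup.cfg (a sketch; the package names are illustrative):

```
[options]
install_requires =
    requests

[options.extras_require]
dev =
    pytest
    flake8
```

Then `pip install -e .[dev]` pulls in the dev group on top of the regular dependencies.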


But it’s all ad hoc which is problematic. You’re solving it yourself, the tool isn’t solving it. It’s like saying that C is the same as python, you can just write your own garbage collector in C!


That's why we use Poetry.


Yeah, that's what I tried to use for a while too. The problem is that so much of the ecosystem doesn't use it, and the Python community fails to converge on a solution here the way other language communities do.


Sorry, I was a little terse — obviously you're right that you're not going to ship the virtualenv itself. As other people have said, the intent was that you create a requirements file (and/or maybe a setup.py if you need that functionality). What I meant is that between fiddliness and sluggishness of Poetry, and the marginal hassle of maintaining a requirements file and creating your own virtualenvs, I'm not sure Poetry is the clear winner at the moment.


Thanks. My main point is that bundler, maven (less so), and npm are all better than either venv/pip/requirements.txt or poetry. Additionally, it's also better that those language communities have mostly converged on those solutions.


Create a setup.cfg (it is better than setup.py, since it is declarative) listing immediate dependencies, and use requirements.txt as a lock file (I prefer to use pip-compile from pip-tools).


Pipenv still works for me with everything version pinned.


That will probably be solved by the new pip resolver within the next few months.


Did that new resolver finally happen?

The pip SAT-based solver was supposed to land in 2017 or 2018 after a GSoC, and nothing came of it. Is it finally going to use a different heuristic than "#yolo" for picking which version of dependencies to install?



Unfortunately, you need to:

- discover it

- install it properly

- run it properly

- make it create a venv with the proper python version

- potentially import data from existing setup.py/cfg

- make sure you are always in the right dir when you use it

- not be on windows (poetry shell doesn't work there)

- IDE integration is not great

- people have to choose between this, pip, pip-tools, flit, pipenv, dephell, conda...

And you cannot use it if you are not "in a project". But not all venvs are project-related.


> - not be on windows (poetry shell doesn't work there)

Ok that's a bug, probably fixed soon. Otherwise, most of these are true of many popular packaging tools in other languages.


Well, not pip and venv. They're provided on Windows and Mac automatically. venv forces you to choose the version of Python. venv doesn't care if you are in project mode or not. IDE integration is great.


We've recently standardized on using Poetry at JP Morgan Chase for projects in our Organization. We're betting on it becoming a standardized tool everywhere.


I just started up a project using this. It’s pretty good so far. The documentation on common tasks is a little bit lacking. VS Code integration is sort of broken in a few random ways (for example, if I try to do a find-all-function-definitions it just hangs). I also wish there was an npm-style script section in the definitions. I do think it really just works though, and I think it’s a great solution compared to requirements.txt.


I echo the sentiments about documentation. It works great but they really need to verbosely document every command option and usage. They seem like they’re trying so hard to keep the website clean and minimal, but docs are not the place for that.


> I also wish there was a npm style script section in the definitions.

Is that like [tool.poetry.scripts] ?


I don't think so. From their docs: > This section describe the scripts or executable that will be installed when installing the package

I want something such that I can run `poetry run script_name` and it will run that command line in the poetry context. Right now I either rely on my bash history or use a bash script to mimic the behavior of npm / yarn scripts. I want to turn this `poetry run python project_name/main.py` into an alias like `poetry run main` or `poetry run dev`.


That's what tool.poetry.scripts does.
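A sketch of what that looks like (the module and function names here are hypothetical); note the target has to be a Python callable, not an arbitrary shell command:

```
[tool.poetry.scripts]
dev = "project_name.main:run"
```

With that in pyproject.toml, `poetry run dev` calls project_name.main:run() inside the project's environment.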


Oh, that's great. So it's just weirdly worded in the docs / I didn't understand. Thanks!


Where is that XKCD about new standards...

Ah yes: https://xkcd.com/927/



Anything with Easy or Simple in its name, isn’t.

SNMP


None of that is "necessary". Rust doesn't have a BDFL and they've gotten along fine.

But I agree that the current situation is a mess. The "official" solutions are suboptimal and none of the unofficial solutions, which are pretty good, have enough clout to meaningfully simplify the ecosystem (held hostage by the majority of users with just pip).

So the ecosystem is kind of frozen in place. PyPA doesn't seem to have the will/capacity to do anything more than very small, incremental changes.


It's all about capacity, not will. The PyPA as a whole knows what is lacking, but when you're all just a bunch of volunteers it's hard to tackle massive projects that take potentially years to complete and then even longer to see gain traction in the community when you are doing it all in your spare time. And that's if you don't burn out before you finish (e.g. calling projects "garbage" is not motivating to the few people volunteering their time to try and fix the situation while keeping what is there afloat and running).


Yes. I am not familiar with the Python development process, but at least do they have an announced roadmap to move to something better than the present suboptimal methods?


I teach Python to some middle school kids. pip vs pip3 and PYTHONPATH wasted the entirety of two class sessions.

By the way, Python should go beyond Go. It should be a single binary that someone can install without root privileges. The fact is that I have hundreds of Chromebooks at my school and I can’t use any of them - school IT, correctly, doesn’t want me mucking with advanced settings.


People would love to see that happen, but it requires time and effort from people to build such a solution. Some people are trying, e.g. https://pyoxidizer.readthedocs.io/ and https://briefcase.readthedocs.io/ as well as stalwarts like PyInstaller and such.

But as I said, it takes time and effort and there's only so much of that when it's being done by volunteers in their evenings and weekends.

And as an aside, I would suggest teaching them `python -m pip` over `pip3`: https://snarky.ca/why-you-should-use-python-m-pip/.


Thanks for the link. About the lack of resources problem though, I have a few points to make.

I think one of the unfortunate things that happened is indeed the arrival of new languages. Go, Rust, Swift, whatever is taking away a bunch of senior developers to work on those. Even though Python is still "most wanted" and "most popular", imho that does not reflect the reality among senior developers who are capable of taking on large tasks like this.

Sometimes the frustration expressed about package management has more to do with build dependencies than with the distribution itself, like a library that has a C dependency. In this context, the world of Go is equally messy when CGO is involved. Luckily in their case, most people do rewrite libraries in pure Go.


Thanks for the links - pyoxidizer looks promising, with some good docs and being built on Rust (but it likely won't work with Qt's PySide2 until the next major release[0]), while briefcase looks to have been tailored to "beeware" usage and is lacking any standalone usage information. Hoping both of these work out; PyInstaller works, but other options would be welcomed.

0: https://github.com/indygreg/PyOxidizer/issues/228#issuecomme...


I really hope the PSF can help the community move toward Poetry. It's a game changer relative to all the alternatives.


For that scenario (if you can’t use the Linux container on Chrome OS, which works even without flipping the devmode switch), repl.it is fantastic. It builds a fully working Python (or C, or Java, or even a polyglot one that’s just Linux) environment, automatically deals with dependencies (if you type import pandas and it’s not installed, it gets installed), has synchronous collaboration, and can even auto-grade work with unit tests (at least, it was in beta last I checked). It’s the holy grail of learn-to-code software and it kicked butt in the class I used it in.


You and your students can get access to a few online coding tools designed to help with this via the GitHub Student Developer Pack and the GitHub Teacher Toolbox (https://education.github.com), all for free.

Our online coding tool (and entire course library) is included in the Pack. https://next.tech/github-students and https://next.tech/github-teachers if you'd like to take a look.


Have you tried PythonAnywhere? [1] Should be a good fit for Chromebooks, assuming you've got decent internet in your class room.

[1] https://www.pythonanywhere.com/details/education


If you're working with new/inexperienced developers on disparate and various systems, please use a proper IDE like PyCharm that holds a developer's hand along the way. I'm not saying this because of preference (which I'll admit I have), but from a practical perspective. I've seen it first-hand countless times when someone new comes to Python. Especially "bootcamp" style developers that think they can do everything in VSCode. Don't mess around with custom, hand-rolled setups, you'll waste so much time as you've experienced.


You could consider teaching JavaScript if you want to use Chromebooks. They have built-in support...


One can easily install Python without root. You have to build from source and use the --prefix option to, say, install under $HOME. It's also helpful to avoid the system Python, which usually lags far behind on systems like RHEL.


> "easily"... "build from source"...

You think the school IT department won't let the GP install Python but will let them install a compiler toolchain? The benchmark for "easy" is you download one binary and run it in userspace. Add any additional steps and you can't use that word anymore.


Why not just repl.it web ide ?


It already has that: if you create a setup.cfg you can generate a wheel package, which can be installed with all its dependencies using a single command.
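A sketch of that flow, assuming a setuptools-based project (the wheel filename below is hypothetical):

```shell
# Build a wheel for the project in the current directory
pip wheel --no-deps -w dist .

# Installing the wheel also pulls in its declared dependencies in one command
pip install dist/yourpkg-1.0-py3-none-any.whl
```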


I haven't looked much into it myself, but I've heard good things about Poetry [0].

0: https://python-poetry.org/


Poetry is good, but you should be aware that people with legacy versions of pip (I think before 19, see PEP 517) won't be able to pip install from your repository directly. That means anyone using pip installed with their package manager (up to the latest version of Debian/Ubuntu in fact, which only provides v18). This doesn't affect installs from PyPI of course. So either you just accept this, or you also maintain a setup.py file as well. That's fine, but Poetry doesn't automatically bump version numbers in setup.py (as far as I know), so you need to do a couple of extra steps when you build.

On the other hand, setting up a deploy key to PyPI and then using poetry build, poetry publish is very easy compared to using twine which I found often did daft things like upload mixed versions to PyPI unless the build directory was clean. Poetry seems to handle that a lot better.


having just converted a large project/monorepo to poetry, it's pretty easy to generate setup.py/requirements from pyproject.toml https://github.com/cloud-custodian/cloud-custodian/blob/mast...

note most of this is just working around poetry for multiple packages in a single repo, with interdependencies. flip side, our pip/setuptools users building trunk from ci are unaffected.

that said there are open questions on poetry (governance/dev community), but the experience is overall pretty solid with some minor rough edges, and the fact that it has a usable api makes those pretty minor. in contrast the only api on pip is cli arguments for subprocess/fork.


This looks great. Can anyone who's used both comment on how poetry compares to yarn[0]?

[0] https://yarnpkg.com/


I think of Poetry as the Yarn of Python packaging for two main reasons (apart from package locking, which is the biggest similarity): the commands are more intuitive and the lockfile starts with a different letter from the dependency file. I don't know how many times I open package-lock.json instead of package.json because I tab-complete from my shell. Same with Pipfile and Pipfile.lock for anyone using Pipenv.

I wrote this post[0] that compares Poetry and Pipenv if you're interested and talks about some of their differences. Poetry has both add and install commands, whereas Pipenv only has an install command, for instance.

[0] https://johnfraney.ca/posts/2019/03/06/pipenv-poetry-benchma...


Thanks, that's helpful and illustrative! Is poetry as good as yarn?


Python is a nice scripting and introduction-to-programming language, but "so far ahead of anything out there for general purpose use" it is not.

Maybe when it has a JIT/AOT story that matches Common Lisp, Scheme, Julia, Smalltalk or JavaScript, or I am able to do refactorings with the ease that Smalltalk offers, I can re-consider my opinion.


What the OP probably means is that it is hard to beat Python on its knockout combo of pragmatism, readability, stdlib, and third-party support.

Does that mean it's the best at everything? Of course not. It is second best at most of the important things, however, which is a more unique position than it sounds.


That being the case, then Java and .NET come to mind.


Those are great when you need big or fast, but py is much better at the smaller, easier to write end.


Ah, so Python is for small and slow, doesn't seem to be the best in everything after all.

Which brings us back to "nice scripting and introduction to programming language".


Yup.

Clearly no one is going to be able to convince you otherwise, so is there much point in you continuing to read this thread?


Indeed.


When the overwhelming majority of computer users would probably struggle with something a bit more complex than a Hello World type project, a nice scripting and introduction-to-programming language is light-years ahead of everybody else.

This is no small feat.

Also, the same goes for the ecosystem of libraries.

You can't see that from your ivory tower but those things matter a lot to people on the ground.

Don't let your prejudice get in the way, it never helps :)


Basic did it first, including the structured programming variants like QBasic and Turbo Basic.

I see that from my historical experience.

Also, we are moving goalposts here; I was quite explicit that Python is a good language for teaching programming.

What I am against is portraying it as the one language above everything else.


I agree. This really seems like high impact too. A first-party, supported solution would solve this mess. Getting the Python dependency build story under control I would include in this category. In this department, I think Conda shows the way. They leverage Azure Pipelines to build much of the C dependencies ahead of time, and just send binaries down the pipe (at least, that's the high-level overview as I understand it).

I can't see a world where the Python Software Foundation can't get support for this, I think they have the contacts at the big companies to get it funded.

The lead Python has on machine learning, data science, and scientific computing won't last forever (though I think it's safe to say that, no matter what, it isn't going anywhere).


> I can't see a world where the Python Software Foundation can't get support for this, I think they have the contacts at the big companies to get it funded.

You might be surprised at how hard it is to get money donated for this sort of thing. Large companies often have their own build infrastructure and thus build things for themselves from scratch, to the point of building their own build toolchains (e.g. look at Google and bazel, Facebook and buck). So the way they handle packaging means how the overall packaging ecosystem functions does not impact them like it does individuals or smaller companies.

This is also why companies only started funding PyPI security work about two years ago; downloading source securely from PyPI is important to big companies as that's where they grab the source that they compile. But funding pip improvements only just happened in December, starting with the new pip resolver work getting money (https://pyfound.blogspot.com/2020/03/new-pip-resolver-to-rol...). And all that money came from Mozilla and the Chan Zuckerberg Initiative; two huge non-profits, not companies.

Add on to the fact that the PSF is projected to have a $627,000 USD loss this year due to PyCon US 2020 being cancelled and it makes finding funding really hard (https://pyfound.blogspot.com/2020/03/psfs-projected-2020-fin...).

Python overall very much runs on volunteers, but the packaging system especially. The core team of Python itself might generously have a total of 3 devs/week being paid to work on CPython spread across all contributors (IOW it is nowhere near as impactful as 3 actual full-time devs who aren't context-switching constantly between "work work" and "Python work"). But for Python packaging in general? If you don't count conda I think packaging has less than even 3.

But if anyone wants to donate to try and rectify this, please go to https://psfmember.org/civicrm/contribute/transact?reset=1&id... and donate to the packaging WG!


First of all, as an aside, thank you for all the work you do on Python! I actually got started pretty early on in my professional career with Python (3.5+ too, which I am thankful for in hindsight). I was using VS Code and the Python extension before MS even acquired its developer (is Don still there? I hope he is. Such a genuinely down-to-earth guy). Without the work you guys do, I don't know if I would have taken to programming the same way, or at the time in my life that I did.

I will say, I didn't realize the funding situation was _that_ dire, especially since on Talk Python I remember Michael and some others specifically talking about the massive influx of money into Python and its ecosystem in the last year and a half or so. It's not billions of dollars, but it sounded like the situation had substantially improved, so, in short, I was wrong!

Has the PSF ever considered having an actual business? Like say, consulting and/or having developers work on features that businesses pay to have implemented? (A loose example of this would be Igalia[0], which makes surprisingly good amounts of money to implement features in web browsers on behalf of the major browser vendors)

I've always thought this would be a pretty natural way to have a steady source of revenue and the profits could all feed back into the PSF.

[0] https://www.igalia.com/


> First of all, as an aside, thank you for all the work you do on Python!

Welcome!

> is Don still there?

Yep, working on the data science features of the extension.

> on Talk Python I remember specifically Michael and some others talking about the massive influx of money into Python and its ecosystem in the last year and half or so.

As I said there have been donations for PyPI and pip just got its big donation. The CZI donation was also much bigger than pip and covered a lot of the science ecosystem.

> Has the PSF ever considered having an actual business? Like say, consulting and/or having developers work on features that businesses pay to have implemented?

There isn't a need for the PSF to have a business subsidiary to accomplish that as people can donate money for such things if they truly wanted to. It's also really hard to manage due to tax laws in the US around non-profits and such. Plus there would be start-up capital costs, etc. just like any other business. I just don't think the risk of trying to get it set up would lead to any benefit when people have always had the option to donate directly to see if something could be done.

If people want to help out financially they can go to https://donate.pypi.org for packaging donations and https://www.python.org/psf/donations/python-dev/ for the language and CPython. And if someone wanted to fund a specific thing I'm sure the PSF is happy to discuss such an idea to see how feasible it would be.


That makes sense, on its face, as to why they don't have a business arm, at least currently.

One thing I would suggest, on its face, is that any time someone has to reach out to you (as in, the foundation, not you specifically), that introduces friction, and even the smallest amount of it can leave donations on the table. If there were even a simple page that said something like:

"Do you have a feature you need implemented in Python? Are there features you would like to fast track? For a simple donation, we can prioritize the work you need to grow your business."

This, with a nice little contact form, would eliminate that friction, rather than being in a situation where I might be reaching out to the wrong wing of the foundation, or not feeling comfortable sending a general inquiry, etc.

You could also put up a feature page that tracks how much money is needed to implement X, too, to solicit donations. I've seen this work in other communities. I think Django has gotten funding using this technique before.

Just a thought. I know you guys are doing everything you can and then some, and Python is unique in that it's one of the few language communities of its size without any formal corporate benefactor.


> For a simple donation, we can prioritize the work you need...

I'm not an expert, but I'm pretty sure a non-profit can't do things like this. If the person giving the money expects a tangible benefit in return, it's not a donation.


The specific restriction from my understanding is that a US non-profit that accepts money for a specific thing must spend every penny of that money only on that thing. So if the PSF was given $1,000,000 to solve a specific problem and while trying to do it they go bankrupt they still couldn't touch that money to stay afloat. So non-profits avoid that situation since they typically have enough to survive a year or so, but not enough to be able to restrict what they might need to do with their cash.


It would probably go a long way if they advertised a roadmap of things they’d like to work on given donations. Maybe they have this out there and I just haven’t seen it, but I imagine it would be easy enough for me to persuade my small employer to throw a few thousand USD their way each year until some sane packaging solution is developed.


There's a rough list at https://wiki.python.org/psf/Fundable%20Packaging%20Improveme.... There's no official roadmap as getting a group of volunteers to agree to a list of priorities would be near impossible. :)

If there's a specific pain point your employer would like to see fixed and is willing to donate money to help resolve that pain point then please go to https://donate.pypi.org/.


There has been a long debate about it recently between core devs, PyPA members, the conda CEO and a few key members of the community.

The conclusion has been that:

- the story for distributing libraries is handled OK, but the story for distributing programs is not

- python has some unique problems due to its popularity, diversity of user base and long history

- the py command should be available everywhere

- communication on best practices is lacking, and people mix up project deployment, dependency management, library packaging and program distribution

- the way python deals with imports is the root of all evils, not the packaging system, which actually is a consequence of it

The best course of action to take is still not clear though, as creating yet another layer on top of the mess is probably not the right thing to do.


I totally agree with this. Don't forget all the bespoke systems that companies end up implementing because they either aren't aware of those other projects or they started doing it before some of those other projects existed or because they aren't a fit.


FWIW, the issue you linked to was closed in favor of https://github.com/pypa/pipenv/issues/3369.

Looks like Canonical is paying him now.


I had some really bad experiences with pipenv. I don't think it is the way forward. It also seems like abandonware at the moment so I am glad there's at least work happening.


Yes, the Python environment space has exploded and now feels like JavaScript, with multiple competing solutions and no obvious go-to one.


> Python is otherwise the most incredible language of our times

This is a strong claim. Ruby could arguably make the same claim. So could Elixir.


Given none of them are used outside of their specialty, no.

For Python, it's the reverse: it's used everywhere, except for a few topics.

It's the only scripting language like that.


People have this bias that Ruby is only used with Rails. Yes, Rails made it popular in the first place, but it has outgrown it. Other than data science/machine learning (because of the lack of libraries/frameworks), Ruby is used all over the place, just like Python. Ruby basically ruled the DevOps space before Go and is still very relevant.

Yes, Python is winning the popularity battle, no doubt on that, but that has nothing to do with the language or the surrounding tooling itself. It's like saying VHS is an incredible format compared to Betamax (yes, I'm old :D).

I used Ruby (not Rails) and Python for many years by now. I think Ruby and its tooling are far better than Python's. But if anybody asked me which language should they learn as their first, it would be Python, just because there will be more jobs available.


How is Go the poster child for clean dependency management? Every release since they started doing modules, the defaults change and break some workflow. On top of that, it actively pushed vendoring forever.

I'm not super happy with package management under Python, but at least pip does not break every release.


Containers solve this for non-GUI apps, and pip still works for 90+% of use cases. Not sure if yet another solution would improve the situation, would just be more fragmentation and competition with more general (non-Python-specific) solutions.


Python distribution is so annoying I always wrap it in Docker and call it a day.

I can never trust pip for on-premise installations of Python code. Even the npm toolchain madness is somewhat more consolidated and organized.


Wonderful language, crippling ecosystem.


I personally don't think that comic is fair to hold against pip or the packaging ecosystem. I did a separate post on deconstructing it at https://snarky.ca/deconstructing-xkcd-com-1987/.


> Python is otherwise the most incredible language of our times

By what metric(s)?

It's based on statements, and not only on expressions. This is a fundamental issue that it has yet to grow out of.


Downvotes because of what I think about the language. Zealotry at its finest.


Welcome to HN. They aren't interested in your flippant opinions, man.


The opinions are serious on both sides. If I don't provide criticism, it's a strawman post. Every criticism is wrong here, except when the specific topic is criticism, and for that we get tons of noise about Python. At the same time, it's not an open forum when one side is persecuted.


To expand, you didn't give enough information to decide whether your comment made sense or not. That makes it non-useful in the end. Such comments are downvoted here. I had to learn this myself, as I like to be sarcastic most of the time.

If you don't have something substantial, cited, and/or supported to say, and it is also negative, better to not write anything.


I don’t want to be too negative, and I realize this isn’t an option for everyone ... but seriously, just use conda if you can.

Judging from comments here and in the past, we all know packaging is one of Python’s Achilles’ heels. I spent a couple years bouncing around with pyenv/pipenv, then poetry, then back. I lovingly followed all the flamewars here and on github.

Ultimately, I realize now, I laboured under the conceit that it was desirable to have a “native” solution that did not depend on a suite like conda. I now think I was totally wrong.

I’m so much happier and more productive now that I’ve switched. For any data science use case conda is hands-down better, and for everything else it’s still more useable than any of the alternatives. I hate to say this but it’s true. The only thing I can think of that was harder to work with a conda setup was the keychain in macOS, and I’m not even sure now that that’s an issue anymore.

conda really does just work.


I find the original virtualenv system works well. There's no need for virtualenvwrapper or pipenv or any of that nonsense. Just make a virtual environment wherever you like ("virtualenv -p python3 ~/envs/myenv" if you want to share between projects, or "virtualenv -p python3 path/to/project/env" if you don't). Use "pip install -r requirements.txt" if your project has one. Activating and deactivating is straightforward. Deleting a virtual environment is easiest of all - literally just delete the directory and you're done.
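The whole lifecycle is just a handful of commands. Here is a sketch using the stdlib venv module, which lays environments out the same way virtualenv does; paths assume Linux/macOS:

```shell
# Create an environment wherever you like ("virtualenv -p python3 ./env"
# gives the same layout as the stdlib module used here).
python3 -m venv ./env

# Activate it: python and pip now resolve inside ./env.
. ./env/bin/activate
python -c "import sys; print(sys.prefix)"   # prints the ./env path
deactivate

# Deleting an environment really is just deleting the directory.
rm -rf ./env
```

With the environment activated, "pip install -r requirements.txt" installs into ./env and nowhere else.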

The main problem that Conda was originally meant to solve was that basic Python packaging (virtualenv, pip) required you to build C extensions yourself which is a huge pain (mostly getting all the C dependencies installed). But with the advent of binary wheels and (eventually, in 2016) a binary wheel standard for Linux [1] you can now get binary builds of most packages on most OSs by just doing a "pip install packagename" - typically this is even more likely to work than Conda in recent years.

The only limitation of virtualenv is it requires you to have already installed Python yourself. On Linux I just use the system-installed version (the Python 3 one obviously), but on Windows I use Conda to create the environments and then pip to install the packages, which works fine.

[1] https://www.python.org/dev/peps/pep-0513/


I've had the opposite experience. I used conda for several years, primarily for data science, but regularly ran into problems with packages not working, updates for packages being severely delayed, and issues with conda itself. Switched to pyenv+virtualenv 18 months ago and haven't had a single issue.

I believe both of our stories. I just always wonder how this happens that two people have such opposite experiences.


Just to note: this kind of difference is part of why it's so difficult to improve packaging. There are lots of different tools, and they all work well for some people in some situations. There's no realistic prospect of persuading them all to use a single tool - but the fragmentation is a big part of what makes packaging so daunting.


Ha!! Likewise, I believe you 100%. It’s fascinating isn’t it? Just goes to show what a diverse (and frankly wonderful) platform Python is, despite the warts.

If you don’t mind my asking, what kind of stuff were you working on? Just curious!

I have definitely found it’s easy to bork a conda install with updates, enough to make it easier to just wipe it out and start over.

I mostly work on financial software and only conda seemed to play nicely with pandas and brokerage APIs. I should probably mention that a lot of this is cross-platform, and conda seemed to be the only thing that worked consistently on Windows, Mac and Linux without a lot of tweaking. But the Linux experience with pyenv and virtualenv (and pipenv too, for the most part) was great. If I was Linux-only I honestly might never have switched.


Because conda and other tools don't mix.

If you only use conda, it works.

If you start to pip install or use venv, it will break at some point.


Python's default venv puts reproducibility of environments and scientific computing results at risk by not creating a file that includes all the checksums, recursively. One needs to take extra steps to achieve that. It is possible, though.
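For the checksum part: pip does have a hash-checking mode, where every entry in the requirements file carries its digest(s) and `pip install --require-hashes -r requirements.txt` rejects anything that doesn't match. pip-tools can generate such a file with `pip-compile --generate-hashes`; the format looks like this (placeholder digest shown, not a real one):

```
six==1.15.0 \
    --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000
```

In hash-checking mode pip also requires every transitive dependency to be listed and hashed, which is exactly the "recursively" part.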


Were you using conda on Windows or not Windows? (Just curious)


Conda envs often get quite big and resolving deps takes time.

Add to that that you download binaries, which you have to trust.

Another issue is when there is no binary and it needs to compile stuff. On GNU/Linux that might work fine, but on Windows, as for some of my coworkers, you need Visual Studio installed to get the required compiler (MS bundles it that way and I could not find a way around installing the whole thing), and even then the compilation might not work.

Conda in the past also had issues with reproducible builds of envs. I hit a case where on one OS it worked, on the other it did not.

Also I had a case where deps of deps change and when you try to build the env, suddenly it does not work any longer. That's very bad for deployments.

Since then we abandoned Conda entirely and built Python with all requirements ourselves. Envs need checksums.


Conda has problems but generally it is pretty incredible. It was way more useful before python wheels were everywhere tho. It’s still amazing and you can install so many non python related things with it. Mad props to conda-forge too.


Is toml even supported in the standard library? This seems just bizarre. Python's really overtaking node for massive churn. I feel like an old guy when peers say, "don't use virtual environment use pipenv." then someone says no no don't even use that, use poetry. Etc.

I think about 50% of what I enjoy, in my soul, about Rust when I'm hacking away at personal goofs is that the dependency and project management system is all first class and pretty decent, and I sure hope they don't go changing it much.


Here's their justification for using it. It makes sense to me.

https://www.python.org/dev/peps/pep-0518/#overview-of-file-f...


It makes sense to use toml. It makes zero sense not to bring pytoml in the stdlib before doing so. The way this was handled is a little absurd.


It was on purpose because TOML has not reached 1.0 yet and breaking people using a module in the stdlib for parsing TOML because TOML itself changed would be an absolute mess. Plus not a single packaging project said it was an issue to either depend on or vendor a TOML parser.

Once TOML hits 1.0 I plan to work on getting a TOML parser into Python's stdlib.

(disclaimer: I am one of the co-authors of PEP 518 that apparently handled things in way that was "a little absurd" .)


> The pytoml TOML parser is ~300 lines of pure Python code, so being outside the standard library didn't count heavily against it.

From the link. Makes sense I think.


The PEP has provisional status. Maybe before it lands it will be brought in.


But it really goes against 'batteries included' to not have TOML support in the standard library.


Rust's management system is first class because the language was designed from scratch 10 years ago.

Python has been around since 1991. It came out before installing dependencies from the internet was a thing. It has had support for Windows XP, Atari, Solaris. You gotta be able to pip install stuff that is written in C (numpy) or in Fortran (scipy). It has been implemented on the JVM, on .NET, and in Python itself via PyPy.

All those years, what we knew about or needed from a packaging system changed, and the tooling followed. We had distutils, then easy_install and setuptools, then distutils2, then pip + setuptools again. We had virtualenv, then venv.

You cannot break from the past easily, especially in such a popular language, with half of its users not even being devs.


While you're not wrong that it's much easier to build systems with the benefit of hindsight, Rust also supported Windows XP, Solaris, and stuff in C or Fortran. That's not the issue.

It's really purely the "designed from scratch 10 years ago" (though, only six years for Cargo).


Python also doesn't support yaml in the standard library, interestingly.


Yaml is a big mess.

I do really like strict-yaml, however (last time I checked) it is just a hack to simplify the large PyYaml compiled package.


Thanks for the vote of confidence.

fwiw version 2.0 of strictyaml will feature its own specification and parser.


Great to hear. I've been manually deactivating almost all of the features of PyYaml for almost a decade now. Over and over. So quite happy someone is taking up the cause.


The thing i have yet to reliably figure out is this (i live in the science part of python):

without a setup.py, how do I:

  from setuptools import setup
  from Cython.Build import cythonize
  import numpy as np
  import versioneer

  setup(...,
       version=versioneer.get_version(),
       cmdclass=versioneer.get_cmdclass(),
       ext_modules=cythonize(extensions),
       include_dirs=[np.get_include()],
       ...)
I'm 100% on board with not having executable configuration files, except that we've been happily abusing them for years now and there are a lot of edge cases that i rely on now.


All of those setup options are now setup.cfg options:

https://docs.python.org/3/distutils/configfile.html

At least, that's supposed to be the case; I am not 100% certain that it always is. However, even so, AFAIK setup.cfg does not directly support using third-party libraries to compute the setup options, as your setup.py does. The only workaround I'm aware of for that if you want to get rid of setup.py is to have a separate script that generates setup.cfg using whatever libraries you need. However, I don't think that's the vision of the Python developers.
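As a sketch, the static parts of such a setup() call translate to setup.cfg like this (the names below are illustrative); the dynamic parts (versioneer, cythonize) are exactly the ones that don't fit:

```ini
# setup.cfg -- declarative equivalents of static setup() keywords
[metadata]
name = mypackage
version = 0.1.0

[options]
packages = find:
install_requires =
    numpy
```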


There are a lot of options that setup.cfg doesn't support. "use_2to3" is one.


I never needed it, but I see 2to3 options there [1], are they not working?

[1] https://setuptools.readthedocs.io/en/latest/setuptools.html#...


I get the idea -- but, how do I tell Cython "it's time to compile things now" from a file I can't execute?


> how do I tell Cython "it's time to compile things now" from a file I can't execute?

It looks like there are ways to specify which build tools your package requires; I would assume Cython would be one of the tools that can be specified since it's been around a while and gets a lot of usage, but I don't know for sure.
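Concretely, here is a sketch of that declaration: PEP 518's [build-system] table in pyproject.toml. Note it only makes the build tools available in an isolated build environment; something (e.g. a setup.py) still has to actually call cythonize:

```toml
# pyproject.toml
[build-system]
# Build-time requirements, installed before the build backend runs.
requires = ["setuptools", "wheel", "Cython", "numpy"]
build-backend = "setuptools.build_meta"
```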


pyproject.toml and setup.cfg can in some cases make a setup.py unnecessary.

That does not mean you should stop using setup.py if you have a good reason to keep it. The ones you listed are examples.

However, as explained in the article, you should still add a pyproject.toml.
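For reference, the minimal pyproject.toml for a setuptools-backed project is just the build-system table:

```toml
[build-system]
requires = ["setuptools >= 40.8.0", "wheel"]
build-backend = "setuptools.build_meta"
```

With that in place, pip builds the project in an isolated environment with those requirements installed first.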


Our experience using pyenv, poetry and a pyproject.toml for projects lately has been great. I highly recommend going this route over dealing with pip and pipenv.


Apart from packaging I also love the idea of consolidating configurations from different dependencies into a single place.

But when I first moved to pyproject.toml I didn't find its use as widespread as I'd hoped (flake8 being the tool I hear about most that still lacks support), so I started collecting a list of projects currently using it, and discussing its use, in an awesome-list format. Hopefully it will be useful for other people as well:

https://github.com/carlosperate/awesome-pyproject
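As an illustration of that consolidation, one pyproject.toml can carry several tools' settings in their [tool.*] tables (the values here are just examples; flake8, as noted, still needs its own file):

```toml
[tool.black]
line-length = 88

[tool.isort]
profile = "black"

[tool.pytest.ini_options]
testpaths = ["tests"]
```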


I have found this blog post[1] quite informative regarding Python projects packaging. It describes well how all the tools (pyenv, poetry, black, flake8, isort, pre-commit, pytest, coverage, tox) fit together.

Poetry seems the go-to tool for dependencies management and it centralizes all your package configurations alongside its dependencies in pyproject.toml.

[1] https://medium.com/georgian-impact-blog/python-tooling-makes...


It still has some critical bugs here and there[1], many about resolving dependencies. Don't get me wrong, I use it for my own current side project github.com/rmpr/atbswp, but imho a lot of things need improvement.

[1]: github.com/python-poetry/poetry/issues


Announcing a new Sponsorship Program for Python Packaging: http://pyfound.blogspot.com/2020/04/sponsoring-python-packag...

Discussion here: https://news.ycombinator.com/item?id=22772410


I quite like pyproject.toml; it + poetry + dephell is (relatively speaking) a breeze. Both poetry and dephell are quite good, and they even mesh with conda. Python packaging is awful but it's gotten way better; it's still maddening, but it's better.



