Poetry – Dependency Management for Python – v1.0 (github.com)
129 points by devj 77 days ago | 73 comments



We have moved our dependency management from Pipenv to Poetry and couldn't be happier working with and publishing Python modules.

Now we use pyenv with Poetry to run all our projects, keeping the venv inside the project instead of in the location chosen by Poetry's default settings. We changed that with:

  poetry config settings.virtualenvs.in-project true
After this, we added the .venv folder to .hgignore or .gitignore.
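With in-project environments enabled, the ignore entry is a single line; the same pattern works in a Git `.gitignore` or, for Mercurial, an `.hgignore` in glob syntax:

```
.venv/
```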

Now everything works the way we wanted. With Pipenv we often hit cases where a project worked with requirements.txt but not under Pipenv; we haven't had that issue since moving to Poetry.

Also, with Emacs support and automated deployment tools, Poetry's predictability has made it easy for our team to work on and publish Python application modules.


fwiw I believe

    poetry config settings.virtualenvs.in-project true
is now

    poetry config virtualenvs.in-project true


Nice. I do, however, much prefer keeping the venv out of the source tree, where it doesn't interfere with `tree`, `grep`, and whatnot. I'm curious what benefit you see in having the venv cluttering the workspace?


It took me a while to be convinced to switch to in-folder venv but the benefits are

* VS Code configuration is so much easier, as you can point to the correct env using a relative path in settings.json and check that in

* Predictable paths to execute tasks / build

* No venvs outside the development tree that you forget to clean up or forget to deactivate

* No different from node_modules, .git or similar when it comes to "polluting" the local project tree, as you can .gitignore it
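For illustration, a minimal `settings.json` along these lines; the setting was named `python.pythonPath` in Python-extension versions of this era, while newer releases use `python.defaultInterpreterPath`:

```json
{
    "python.pythonPath": "${workspaceFolder}/.venv/bin/python"
}
```

(On Windows the interpreter lives at `.venv\Scripts\python.exe` instead.)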


> as you can .gitignore it

As well as ignore config in codecov, black, flake8, mypy and so on...


We set up systemd to start gunicorn or uWSGI from the project venv. When venvs are created inside the project folder, automating deployment of Python software with systemd services or local scripts is easy.

Our developers use macOS, Windows, and Linux, and each ends up with a different venv location under Poetry's defaults, so starting gunicorn or uWSGI is error-prone. With a predictable venv location, everyone can work the same way from within the project directory.
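A sketch of such a unit file; the app name, user, and paths here are hypothetical placeholders:

```ini
# /etc/systemd/system/myapp.service -- illustrative only
[Unit]
Description=myapp served by gunicorn
After=network.target

[Service]
User=www-data
WorkingDirectory=/srv/myapp
# The in-project venv gives a predictable, absolute path to gunicorn
ExecStart=/srv/myapp/.venv/bin/gunicorn myapp.wsgi:application
Restart=on-failure

[Install]
WantedBy=multi-user.target
```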


If you use a grepping tool like ripgrep it will automatically ignore everything in your `.gitignore`, so at least for that specific case you're covered.


Your app is self-contained in a single folder (not two), and run scripts are simpler (they can use relative paths like ./venv/Scripts/foo; no need to “activate”)...


The recommended installation method still doesn't work without manual intervention. For reference, I use WSL with the latest Ubuntu LTS version.

- I tried using the recommended installation method of

    curl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python
Didn't work. I don't have python installed, just python3. Fair enough.

- Using the above method, but replacing `| python` with `| python3`: The installation works. But trying to invoke poetry gives this error:

    /usr/bin/env: ‘python’: No such file or directory
Turns out the poetry executable calls `python` in its shebang.

- Manually changed shebang in `$HOME/.poetry/bin/poetry` to `#!/usr/bin/env python3`: Now poetry runs successfully.
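That manual fix can be sketched as a one-liner, assuming the default installer location; it is a no-op if the launcher is absent, and (as noted below) likely needs re-running after every self-update:

```shell
POETRY_BIN="$HOME/.poetry/bin/poetry"
# Rewrite the shebang from `python` to `python3` if the launcher exists
if [ -f "$POETRY_BIN" ]; then
    sed -i '1s|^#!/usr/bin/env python$|#!/usr/bin/env python3|' "$POETRY_BIN"
fi
```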

I mean, it works, but it is fragile (shebang probably needs to be manually reset at every update), not beginner-friendly, and not ergonomic. Installing using pip3 has the same issue. In comparison, pipenv is much easier to install. I was hoping this issue would not be present for poetry v1.0, but unfortunately it is.


Curious what's stopping you from symlinking python3 to python? In fact, I'm surprised the package manager didn't already do that (I thought it did for Debian and its derivatives, including Ubuntu). Does WSL throw a wrench in that or something?


Because the `python` link/binary is Python 2, which you might also have installed.

Being able to rely on python being python 2 and python3 being python 3 is pretty useful


The problem is exactly with you having `python3` but not `python`. The convention is to use them both only if you have both Python versions 2 and 3; otherwise there should always be a `python` command on your system. If you insist on having only `python3` you will keep running into problems.

My recommendation is to either create an alias for `python3`, or even better use `pyenv`: https://github.com/pyenv/pyenv


I had the same experience. It tells me that poetry is on the right track but not ready for serious use yet.


Even putting aside its approach to dependency management, Poetry has a fond place in my heart for making it so much easier to publish to PyPI.

If Poetry did nothing but simplify the process of publishing and updating Python packages, it would still be an amazing tool.


I've read of another project for publishing to PyPI called flit. Does anyone know how they compare to each other?


Poetry tries to help you manage your whole project while flit is just for building your library (and optionally uploading; you can always use twine to do the upload). It's all-in-one versus one-job-one-tool.


Simpler PyPI publishing sounds very appealing.

I still don’t feel hungry for new dependency management solutions (requirements.txt + virtualenv works reliably for me) but I do like the idea of refactoring setup.cfg and setup.py for simpler PyPI publishing.


While I'm excited that Poetry hit 1.0, I'm actually more excited about the fact more devs were added to the maintenance team and that it transitioned to a project namespace. Sébastien was previously the sole dev with commit access so I'm hopeful this gives the project a much healthier future.


The project org has two (public) members, and the second-highest contributor has 10 commits with <600 lines. It would inspire more confidence if the project were transferred (invited) to PyPA some day.


I've been watching poetry for the last couple years, hoping that it is THE replacement to the virtualenv/virtualenvwrapper/pyenv workflow.

pipenv never felt quite right. poetry has felt really close.

I'd love to know if anyone has made this jump and their experience.


Same here. I got really excited by pipenv, but every single time I've tried it, it takes 10+ minutes to determine the dependencies and build a lock file (see issue 2284 [1]). It just makes it unusable for a lot of my use cases, e.g. setting up a clean environment for exploratory data analysis. Does anyone know if poetry improves in this area?

[1] https://github.com/pypa/pipenv/issues/2284


(See my other comment)

Yes, I haven't encountered any of the slowness in dependency resolution that Pipenv has with Poetry. It does still feel a bit slow compared to package managers from other languages, but not in a buggy way, so I would attribute it to not being that optimized yet.


Same here. Been using Poetry for a few months, and it's been a blast!

Before that I tried to use pipenv, but it was too buggy and unpredictably slow with installs. I've always had bad experiences with virtualenv etc., and mostly stayed away from Python development because of that. With Poetry that has changed immensely, as I now have the reproducible environments I'm used to from Rust/Ruby/Node.

All that said, I think there is still quite a way to go for Poetry. E.g. I had (hard to reproduce) bugs where, if the install was canceled, I had to purge the whole virtualenv, as it was unable to recover itself. I'm pretty optimistic about the project as a whole though!


Hopefully v1.0 now means it's pretty stable. Using Poetry so far has been painful: the gains from the improved dependency resolution were unfortunately somewhat offset by crazy annoying bugs, like `poetry shell` not actually activating the virtual environment [0]

[0] https://github.com/python-poetry/poetry/issues/571

edit: gah, still hitting baffling errors:

  $ poetry env use /usr/local/bin/python3
  [NoCompatiblePythonVersionFound]
  The specified Python version (3.7) is not
  supported by the project (^3.7).
  Please choose a compatible version or loosen the python constraint specified in the pyproject.toml file.
  $ poetry env use /usr/local/bin/python3
  [NoCompatiblePythonVersionFound]
  The specified Python version (3.7) is not
  supported by the project (>=3.7).
  Please choose a compatible version or loosen the python constraint specified in the pyproject.toml file.
v1.0.0 generally seems to work more smoothly though.


Does anyone know if the situation with using `poetry` with `pyenv` has gotten any better? I like poetry's approach, but I had a very difficult time getting it to play nice with my system last time I tried.


I use this workflow with poetry 1.0.0 just fine


For whatever reason, I've never understood the point or actual benefit of virtualenvs.

Having lived more in the JS ecosystem for the last several years, my ideal workflow would be a copy of how the Yarn package manager works:

- Top-level dependencies defined in an editable metadata file

- Transitive dependencies with hashes generated based on the calculated dependency tree

- All dependencies installed locally to the project, in the equivalent of a `node_modules` folder

- All package tarballs / wheels / etc cached locally and committed in an "offline mirror" folder for easy and consistent installation

- Attempting to reinstall packages when they already are installed in that folder should be an almost instantaneous no-op

PEP 582 (adding standard handling for a "__pypackages__" folder) appears to be the equivalent of `node_modules` that I've wanted, but tools don't seem to support it. I've looked through several Python packaging tools over the last year, and none of them did (including Poetry [0]).

The only tool I can find that really supports PEP 582 atm is `pythonloc` [1], which is really just a wrapper around `python` and `pip` that adds that folder to the path. Using that and `pip-tools` [2], I was able to mostly cobble together a workflow that mimics the one I want. I wrote a `requirements.in` file with my main deps, generated a `requirements.txt` with the pinned versions and hashes with `pip-compile`, was able to download and cache them using `pip`, and installed them locally with `piploc`.
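The path manipulation pythonloc performs can be sketched in a few lines of plain Python (a simplified illustration, not pythonloc's actual code), following the PEP's `<project>/__pypackages__/<major>.<minor>/lib` layout:

```python
import os
import sys

# Put a project-local __pypackages__ directory ahead of the normal
# module search path, so packages installed there win over site-packages.
pkg_dir = os.path.join(
    os.getcwd(),
    "__pypackages__",
    "{}.{}".format(sys.version_info.major, sys.version_info.minor),
    "lib",
)
if pkg_dir not in sys.path:
    sys.path.insert(0, pkg_dir)
```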

Admittedly, I've only tried this out once a few weeks ago on an experimental task, but it seemed to work out sufficiently, and I intend to implement that workflow on several of our Python services in the near future.

If anyone's got suggestions on better / alternate approaches, I'd be interested.

[0] https://github.com/python-poetry/poetry/issues/872

[1] https://github.com/cs01/pythonloc

[2] https://github.com/jazzband/pip-tools


> - All dependencies installed locally to the project, in the equivalent of a `node_modules` folder

That's basically what a virtualenv is, the python version of a node_modules directory. You just need to explicitly activate/deactivate to move in and out of that environment, instead of it being based on the directory tree. And the requirements file is equivalent to the "dependencies" section(s) of package.json
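A minimal sketch of that equivalence (`--without-pip` just keeps the example fast; drop it in real use):

```shell
# Create a project-local environment, the Python analogue of node_modules
python3 -m venv --without-pip .venv
# No activation needed when you call the interpreter by its path
.venv/bin/python -c "import sys; print(sys.prefix)"
```

Calling `.venv/bin/python` (or any script in that bin directory) directly is equivalent to activating first; activation mostly just prepends that bin directory to PATH.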


Node and npm, as JavaScript's dependency management tools, are no better. I had a project on Node 6.0 refuse to work with Node 8, and npm failed to upgrade it. No different from Python 2.7 to Python 3.

It's just that the browser and JavaScript silence the errors, so breakage may go unseen. That's not the case with Python: “Errors should never pass silently. Unless explicitly silenced.”


Totally agree; node_modules starts to break horribly when you use the wrong node/npm command to access it. This unfortunately causes a lot of misconceptions, because people usually judge tools by short-term experimentation. node_modules works very well in that scenario and gets the most love--until it doesn't, you tear your hair out over those nasty bugs, and sadly can't even convince people with your experience (“it works perfectly for me, I don't know what you're talking about.”)

I guess the node_modules approach is fine for Node, since the native-binding story is quite bleak and not very many people are deep into it anyway. But for Python this would be a complete no-go given how pervasive extension modules are. I don't know what went through Ian Bicking's mind when he first designed virtualenv, but so many aspects of it just reflect the tight cross-language integration Python is so good at (the bin/include/lib structure, copying/symlinking binaries into place, setting up env vars to trick foreign build systems into working, etc.). I wouldn't want it any other way even if I could go back in time and be involved in its original design. Well, maybe a few minor improvements, like offering a more beginner-friendly interface than activate scripts, or more heavily promoting running the binaries directly, but the design is just… correct.


What's the advantage of using Poetry compared to vendoring your dependencies and checking the .whl files (which can be manylinux binaries) into your version control system?


This is actually pretty straightforward:

  pip wheel -w ./wheelhouse -r requirements.txt
Then you can modify your script to insert every .whl into sys.path. Add the wheelhouse directory to version control and you get a repo that reproduces easily anywhere manylinux is supported.
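That sys.path insertion can be sketched like this (a minimal illustration; the wheelhouse path is assumed relative to the working directory):

```python
import glob
import os
import sys

# Wheels are zip archives, and Python can import pure-Python code straight
# from a zip on sys.path, so prepending each checked-in .whl makes its
# packages importable with no install step. (C-extension wheels must still
# be unpacked first: shared objects cannot be loaded from inside a zip.)
for whl in sorted(glob.glob(os.path.join("wheelhouse", "*.whl"))):
    path = os.path.abspath(whl)
    if path not in sys.path:
        sys.path.insert(0, path)
```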


Why would I use this instead of Anaconda?


They are pretty different tools. Poetry is a dependency manager and packaging tool, Anaconda is a pre-configured environment for scientific Python.


I guess the comparison would be to Conda which also allows for declaring dependencies and building environments.


They are referring to conda, which is a generic package manager, dependency resolution and build tool.


They might be referring to conda, the package manager used with Anaconda.


Can you define dependencies loosely and still have a lock file in Anaconda so when you come back to the project a year later you can install the exact same dependencies at the same versions? (Honest question, not familiar with Anaconda.)


If you have these requirements, why not vendor all your dependencies?


So you want to define your dependencies loosely or lock them?


You can do both! Your manifest file can define dependencies loosely: greater than this version, less than this version, skip this version, etc.

Then you “lock” and it builds a dependency tree with the newest version of everything that satisfies your loose requirements.

Then you can choose to install dependencies from your lockfile, which is exact frozen versions OR you can re-run the lock later, which will again attempt to update everything to the latest versions that satisfy your loose constraints
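In Poetry terms (the package names here are just examples), the loose constraints live in pyproject.toml, and `poetry lock` pins them into poetry.lock:

```toml
[tool.poetry.dependencies]
python = "^3.7"
requests = "^2.22"       # any 2.x at or above 2.22
click = ">=7.0,!=7.1.0"  # a range with one version skipped
```

`poetry install` then uses poetry.lock when it is present, while `poetry update` re-resolves against these constraints and rewrites the lock file.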


You can pip freeze


Why would I use Anaconda instead of Nix?


A bit of a shameless self-plug, but you can use both Poetry _and_ Nix! https://github.com/nix-community/poetry2nix


Is there a good way to build a portable package containing dependencies for installing on an airgapped system?


Take a look at Twitter PEX. I gave a short talk on it earlier this year (https://www.youtube.com/watch?v=abnwINA50DE).


Pex doesn't support PEP 517, but shiv does.


Nice, but ugh. The Python packaging world really sucks. In R I was able to do this very easily with miniCRAN.


Another one: https://github.com/facebookincubator/xar "XAR lets you package many files into a single self-contained executable file. This makes it easy to distribute and install... A popular example is Python application archives that include all Python source code files, as well as native shared libraries, configuration files, other data."


Have a repo for your code, vendor everything, and build a binary including the Python interpreter using a build system that emphasizes reproducibility (e.g. Bazel).

Alternatively you can build a docker image using Bazel (for reproducibility) and use tools like Droot to chroot into the image then run.


Do you depend on any native library? If not, you should be able to install everything in a virtualenv and move that whole directory.


Surprised no one has mentioned cx_Freeze yet:

https://cx-freeze.readthedocs.io/en/latest/

It's worked pretty well for us on both Linux and Windows environments.


Make a wheel or use shiv. Either is very easy.


Is it possible to build a wheel for a unix platform on a windows machine, when the package and some of its dependencies have compiled portions? I'm trying to build a wheel with deps for exchangelib and have been struggling.


For certain values of 'good', containers seem to enjoy popularity.


Shiv, as mentioned elsewhere; PyOxidizer is also worth a look.


I thought the billion ways to sort out dependencies and environments in Python were its messiest part. I've been using a custom little batch script on Windows to streamline that a bit, as I mainly use Python for small utilities as opposed to big projects - https://gist.github.com/leoncvlt/10e67d9415e61eff0f5010ef6fe... - but I'm interested in giving this a spin!


Big discussion about the project 4 months ago:

https://news.ycombinator.com/item?id=20672436


Since pipenv is pretty much dormant, this seems to be a nice alternative. Thanks for sharing!


Just to note, dev dependencies in Poetry are still rather flaky vis-à-vis projects forever in pre-release mode, e.g. Black. Consider sticking with Pipenv if you have heavy dev-dependency requirements.


I have been bitten by this. I love Black to death, but I don't understand why they're still tagging their project as beta when it's been so stable over the last year, at least.

Anyway, the solution is simple: put

    black = { version = "*", allows-prereleases = true }

in your pyproject.toml file. Or use --allow-prerelease as suggested by others.


It definitely was in the past, but I haven't had any troubles recently. Are you still experiencing issues?


Isn't that Black's fault more than Poetry's? They don't have a non-alpha release.


Nowadays it's just --allow-prerelease.


If I read the 1.0 changelog correctly, I think you don't need that flag anymore


Messing around with major/minor Python version compatibility and dependency conflicts has always been my least favorite part of Python. A tool like this is much appreciated!


This looks pretty nice! Definitely reminds me of the node ecosystem.


it has a nice comparison to pipenv


I don't think that was a nice comparison.

It just says that he doesn't like the pipenv cli, and some decisions made, but doesn't say why or how poetry is different.

The only thing it goes into detail on is a bug in Pipenv's dependency resolution, and how their resolver avoids it.


It's not clear from the current title but Poetry v1.0.0 was just released: https://github.com/python-poetry/poetry/releases/tag/1.0.0


Looks like Randall's comic needs an update!

https://github.com/dephell/dephell

https://xkcd.com/1987/


Big up to dephell (https://github.com/dephell/dephell) which makes it pretty straightforward to convert from one dependency management system to another, while the dust settles.


Needs updating to include pyenv also.

