
Poetry: Dependency Management for Python - jtanderson
https://poetry.eustace.io/
======
SmirkingRevenge
After years of python dev professionally, I just quit worrying about it, and
embraced:

\- vanilla virtualenv: I don't even bother with the wrapper most of the time.

\- vanilla setup.py/setup.cfg: It's really just not that bad. Forget about
pyproject.toml or whatever the next big thing is.

\- pip-tools: ditching pipenv and using this for pinning app requirements has
made my life so much simpler.

~~~
theptip
`pip-tools` has served me well, but one annoying limitation that pipenv/Poetry
resolve is that you can't install git links in non-editable mode, i.e.

    
    
       -e git+https://github.com/...
    

Works, but

    
    
       git+https://github.com/...
    

Doesn't.

Also, can you pin your dependencies to a specific hash? Last I checked you
could not. A major advantage of pipenv/Poetry is that the lockfile protects you
against a compromised package index; if someone swaps in a new binary for
yourdep==1.2.3, the lockfile's hash check will fail and you'll get an error.

~~~
taion
pip-compile added support for pinning hashes in 1.8.0 (in 2016) and support
for non-editable URL dependencies in 3.7.0 (in May).
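
For reference, a file produced by `pip-compile --generate-hashes` looks
roughly like the fragment below (the digests here are placeholders, not real
hashes). pip enforces hash-checking mode automatically when installing from a
file that carries hashes:

    
        requests==2.22.0 \
            --hash=sha256:0000000000000000000000000000000000000000000000000000000000000000 \
            --hash=sha256:1111111111111111111111111111111111111111111111111111111111111111
    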

------
1337shadow
If you like yarn I think you will like poetry.

Using --user here, with no per-project virtualenv. I used to have a virtualenv
in ~/.env, where I keep everything up to date and have many -e installs from
~/src. For production deployments: --user in containers.

This means all my projects have to work with the same versions of
dependencies.

This means all my dependencies have to work with the same versions of
dependencies.

All my dependencies have to be up to date: all versions must be latest, or
next to-be-released.

When one doesn't support new versions of other dependencies: I deploy a fork
and open a PR.

As such, I'm not spending time trying to make obsolete versions work; rather,
I'm spending time making new versions work together.

My perception is that this strategy offers better ROI than all others which I
have tried, not only for me, but for the whole ecosystem. That's also how I
made my Python 2 to 3 transition, and have been using 100% Python 3 for a
while (once you've ported one big software, porting others and their smaller
dependencies is easy).

I'm extremely happy with this strategy; my life has just been better since I
started working this way, and I highly recommend it.

For my own (countless) packages: I use setupmeta which simplifies auto-updates
(I think it succeeds what pbr tried to achieve).

~~~
dec0dedab0de
_This means all my projects have to work with the same versions of
dependencies.

This means all my dependencies have to work with the same versions of
dependencies._

Then why not just install everything globally?

~~~
1337shadow
To avoid messing with system packages, though I sometimes do install globally
in containers. I've just started switching to --user in containers so that my
users don't have to rebuild Python dependencies from scratch when they use the
container for development: the source is bind-mounted into /app, which is
declared as the home of the app user in my container. That way, .local
persists between builds on the developer checkout (like node_modules).
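
A sketch of that container setup (the base image, user, and path names here
are illustrative, not taken from the parent's actual Dockerfile):

    
      FROM python:3
      # the app user's home is /app, so --user installs land in /app/.local
      RUN useradd --create-home --home-dir /app app
      USER app
      WORKDIR /app
      COPY --chown=app requirements.txt .
      RUN pip install --user -r requirements.txt
      ENV PATH=/app/.local/bin:$PATH
    

When the source checkout is bind-mounted over /app during development,
installs into ~/.local land in the checkout and survive rebuilds, much like
node_modules.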

------
micimize
There are a few highly dismissive comments here. While the Python community
has (clearly) been getting by with requirements.txt/setup.cfg/setup.py, the
project/package management story is far less usable than that of more recently
developed ecosystems like npm and pub.

With poetry+pyproject.toml, I have one file that is flat, simple, and
readable. @sdispater has done incredible work on this project, and I hope they
get more resources for development.

~~~
1337shadow
Your comment doesn't explain why you think the package management story is
better in npm; it's a bit like saying "node-gyp is a pile of debt" without
going into details. What I can say is that I have no idea why node_modules
ends up so fat, or why my npm install eats so much bandwidth, time and disk
space... Is npm trying to work around incompatibilities by installing
different versions of the same package in the same node_modules? I hope not,
because that would be the perfect way to accumulate debt across the whole
ecosystem. Not to mention that I was just handed frontend source code where
building it requires node-gyp, which requires g++ and python2 :)

Quick question, since you seem to know npm very well: is there a better
solution than this to automate npm package publishing nowadays?

sed -i "s/GIT_TAG/${CI_COMMIT_REF_NAME/v/}/" package.json

(I don't have this issue with Python's setupmeta)

------
korijn
It took me a while to learn to love pipenv. It also bothers me how many
developers "blame pipenv" for any problem they can't easily explain.

Anyway, has Poetry caught up to the dependency resolution provided by the
pipenv lock command yet? Last time I tried it (~6 months ago), it couldn't
produce working environments for my requirements.

BTW, for anyone who wants to educate themselves, read the issue description
here and follow the links:
[https://github.com/pypa/pip/issues/988](https://github.com/pypa/pip/issues/988)

~~~
guitarbill
I do really like the workflow pipenv provides. But last time I checked, the
major downside of pipenv is that it doesn't and won't support multiple Python
versions [0]. For libraries, this can be a dealbreaker. For applications,
especially ones you're deploying to a controlled environment, this isn't an
issue.

I've been meaning to look into Poetry. Not a huge fan of TOML, but hopefully
all the tooling supports pyproject.toml now (I know black does, but I'm not
sure about flake8, isort, pylint, pytest, coverage). I know there are still
some hacks required for tox + poetry though.

[0]
[https://github.com/pypa/pipenv/issues/1050](https://github.com/pypa/pipenv/issues/1050)

~~~
meowface
What don't you like about TOML? It seems to be the best language for simple
configuration files. I think YAML is often overkill for basic configuration,
and JSON isn't ideal. pipenv's Pipfile is also in TOML.

What config language would you prefer to use instead?

~~~
guitarbill
To be honest, I don't know. All I can say is that YAML parsers already exist
and YAML works well enough for many applications, though I do understand the
issues with it. Sure, it isn't perfect, but there's no need to reinvent the
wheel for marginal improvements.

I roughly agree with the arguments laid out in PEP 518, even though the array
of tables thing is not at all "obvious" to me. And I actually do think an
"official"/semi-official YAML subset would be great for loads of use-cases -
yaml.safe_load already is that in practice.

------
angrygoat
The big win seems to be the lock file - like Cargo in Rust, or yarn in the JS
world. It's really, really hard to lock down dependencies reliably in Python,
especially when you are talking about the dependencies of the primary packages
you are installing (and their dependencies, and so on).

One solution at the moment is to run 'pip freeze' and put that in as your
requirements file, but that very much feels like an 'and now I have a
different problem' solution.
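
That route, as a sketch (nothing here beyond stock pip):

```shell
# Pin the entire current environment, transitive dependencies included:
python3 -m pip freeze > requirements.txt

# Reproduce it later in a fresh virtualenv:
python3 -m pip install -r requirements.txt
```

The "different problem" is that the freeze output flattens everything: direct
and transitive dependencies become indistinguishable, which is exactly the
distinction a manifest-plus-lockfile tool preserves.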

------
peterwwillis
For dependency management needs, 99% of what you need can be done easily with
vanilla virtualenv, vanilla pip, vanilla setup.py, and a Makefile. A small
shell wrapper that sources bin/activate and runs your Python app is all you
need to allow anyone to run your code without additional steps.

The biggest problem with Python packages isn't a dependency manager; it's that
Python developers don't reuse and extend existing projects. They just keep
churning out new similar projects with different features. All of the most
commonly used Python packages are basically one-offs that use a bunch of other
one-offs. This creates a cycle of reinventing the wheel and friction in
ramping up on existing technology, due to all the extra integration work and
split development, and the eventual lift-and-shift onto completely different
dependencies that implement the same thing. PyPI is awash with duplicate,
abandoned one-offs, to the point that just trying to find a useful module to
do what you want can take a long time.

~~~
bbmario
> done easily with vanilla virtualenv, vanilla pip, vanilla setup.py, and a
> Makefile

I wouldn't call that "easy", but OK.

~~~
peterwwillis
Would you consider this easy?

    
    
      # Makefile (recipe lines must be indented with tabs)
      venv:
      	python3 -m virtualenv --version 1>/dev/null 2>/dev/null || \
      		( echo "Please install virtualenv (python3 -m pip install virtualenv wheel setuptools)" && false )
      	[ -d venv.d ] || python3 -m virtualenv -p python3 venv.d
      	./venv.d/bin/pip install -r requirements.txt
      
      $ make venv
      
      # Running your project without setup.py
      $ ./venv.d/bin/python3 my-program.py
      
      # Running your project with setup.py
      $ ./venv.d/bin/pip install .
      $ ./venv.d/bin/my-program.py
    

The setup.py you can copy from another project and customize to your liking.

------
andyljones
I can't find a good comparison to Conda - is the main distinction simply that
Poetry uses the official repos?

For people working outside of scientific Python: conda is a package and env
manager maintained by a private company that's become the go-to because it's
really good at handling binary dependencies.

~~~
tduberne
Can you really build packages using solely Conda?

As far as my understanding goes, conda focuses on providing virtual
environments for interactive use, but to build and distribute a package on
conda forge, you still need to rely on setuptools/distutils.

My impression here is that Poetry aims more at _replacing_ the pip +
setuptools toolchain. Users of your package could still install it using
conda, if relevant. It seems a bit limited in what it can do at the build
step, unfortunately, so it is not a replacement yet.

Coming from the Java world, I spend my days dreaming of a "maven for Python",
and this project definitely goes in that direction. I will definitely keep it
on my radar.

~~~
flyingdeertown
Well, conda packages don't need to be Python-based, so technically yes, you
can build and package up whatever your heart desires.

conda build recipes are essentially just shell scripts that conda runs in a
sandbox, taking care of library paths and so on. You could, if you really
wanted to, develop a poetry based Python package and then have your conda
build script use/run poetry to build and package the python package, and then
add whatever lines are necessary to the build recipe to make _that_ the
package that gets installed by conda.

~~~
tduberne
That basically confirms the feeling I was trying to express, in that conda and
poetry seem to solve different problems: conda focuses on easing the
distribution to the end user, whereas poetry tries to simplify managing the
build on the developer side.

------
HunOL
I've worked a little with it and it looks really nice and promising, but my
biggest concern is the number of open issues on GitHub and that the vast
majority of commits are done by one person.

~~~
epage
From what I've gathered, the project grew faster than the original dev had
time to bring other people up to speed to also monitor PRs, and he has since
gotten very busy, slowing down the review process. I know I have several
straightforward outstanding PRs. I hope this can be resolved so we can have a
sustainable community.

~~~
StavrosK
That is also my worry. I've opened issues that were show-stoppers for many
people and we couldn't even get the author to give us an update on when the
fix would be released.

I wish he'd run the project a bit better. Other than that, it's stellar
software.

------
macawfish
I like how transparent poetry is about what's happening when you run it, and
how well presented that information is. I've come to loathe pipenv's progress
bar. Running it in verbose mode isn't much better.

I can't be too mad at pipenv, but all in all poetry is a better experience.

------
dissent
Can I use this to build a package that can be uploaded to a pypi repository
for other people to depend on?

~~~
mjs2600
Yep, it works really well for that.
[https://poetry.eustace.io/docs/cli/#publish](https://poetry.eustace.io/docs/cli/#publish)
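
The publish workflow is short; roughly (assuming your PyPI credentials are
already configured for Poetry):

    
      $ poetry build             # builds sdist and wheel into dist/
      $ poetry publish           # uploads them to PyPI
      # or both in one step:
      $ poetry publish --build
    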

------
bbmario
How do you start a new Python project in 2019? I mean, there's so much stuff.
Poetry, pip, virtualenv... oh, my. With PHP, I just composer init and that's
it.
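
For what it's worth, the boring baseline still works and needs nothing beyond
a stock Python 3; a minimal sketch (the `requests` dependency is just an
example):

```shell
# create an isolated environment (venv ships with Python 3)
python3 -m venv .venv
# use it for this shell session
. .venv/bin/activate
# install dependencies into it and record them
python -m pip install requests
python -m pip freeze > requirements.txt
```

Tools like Poetry and pipenv layer a manifest and a lockfile on top of this.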

------
zys5945
Never used poetry before. How does it compare against pipenv?

~~~
j88439h84
Much better

------
sametmax
Love poetry, I hope it gets to feature parity with pew regarding project
naming and jumping.

------
julienkervizic
Neat, definitely something I would want to give a try.

I see, though, that it only supports pure Python for building packages; does
that mean it can't build if you depend on compiled libraries?

Is there also a plan to add some of the functionality of bundling tools such
as webpack into the build phase? Like automated CSS optimization, image
compression... Could be handy for some Django/Flask projects.

------
j88439h84
Poetry is very good. I think projects should use it.

I hope the rest of the ecosystem can catch up quickly.

Tox and pip and pex need full support for PEP 517/518.

~~~
sametmax
They can't support reading the Poetry data because it's not generic. Nor are
most practical implementations of tools using pyproject.toml, because they all
use custom tool-namespaced fields in pyproject.toml: it's far from a
standardized format yet, despite the push for it from the people who made it.

I still wish poetry would support setup.cfg in parallel. It has been working
for 2 years, is compatible with pip, setuptools and the whole legacy stack,
and most of its fields are standardized and documented.

~~~
j88439h84
I should be more specific about what I meant.

I would like to see tox support `install_commands` ([https://github.com/tox-
dev/tox/issues/715](https://github.com/tox-dev/tox/issues/715))

I would like to see pex support the `build-system` feature.
([https://github.com/pantsbuild/pex/issues/660](https://github.com/pantsbuild/pex/issues/660))

For pip, I'm not sure what's involved in getting everything to work
generically. This is probably not simple. It has the consequence that when we
don't have control over the install command, we end up needing hacks like
using `extras` for ReadTheDocs (e.g.
[https://github.com/readthedocs/readthedocs.org/issues/4912#i...](https://github.com/readthedocs/readthedocs.org/issues/4912#issuecomment-483756581))

------
ilovecaching
Python is the worst thing to happen to SWE (specifically SWE, not ML,
Education, etc).

Python makes huge sacrifices in readability and efficiency for a hypothetical
win in writability. It's also fundamentally at odds with the multicore world
we're living in and will continue to live in, an issue that many people have
tried and failed to fix.

I can't count the number of times I've had to spend my entire night reading
through an older Python service without type annotations, desperately trying
to understand the types of inputs and outputs. It's just a mess, and the bias
of programmers to believe that they are writing readable code (when in reality
it's only readable to them) exacerbates the use of names that provide little
context to solve the problem.

Python is awful, and it's an uphill battle to fix it. Asyncio can solve some
performance issues, but it negates the benefit of Python's simplicity by
destroying the sequential mental model that programmers can easily grok. It
also requires a complete rewrite of libraries and a split between non-asyncio
land and asyncio land.

Type annotations can make Python more readable and thus more maintainable, but
gradual typing still leaves lots of unanswered questions when you're
interfacing with untyped code.

Python is really not good unless you are using it to prototype or build
something on your own. It has led to a world of slow, buggy software that is
impossible to rewrite. Its downsides are easy to measure, but its benefits are
difficult to quantify.

~~~
j88439h84
http://trio.rtfd.org replaces asyncio with a model that feels much closer to
sequential. And it's very good.

------
oweiler
Looks great! So, dependency management in Python is now a solved problem?

~~~
flixic
Seems to be solved only five or six times; not nearly enough.

------
markandrewj
I was excited by this project, and pipenv, but unfortunately I haven't had
consistent results cross platform with either. I have ended up sticking with
venv because of this.

------
pbreit
Could an effort like this ever be implemented as a (backwards-compatible)
extension to, or modification of, something existing with traction?

------
Areading314
This looks awesome, well done! Is there a TLDR about how it improves on pip or
other python dependency management?

~~~
LaundroMat
The GitHub readme explains how it improves on Pipenv.

I like Poetry a lot; I hope platforms like Heroku will start supporting it
soon.

~~~
nickserv
How is the execution time when resolving a non-trivial project? Like your
typical Django app, for example.

~~~
LaundroMat
Adding dependencies takes a (little) bit of time, because of the lockfile. But
deployment (i.e. installing dependencies) is as fast as with any dependency
manager (pip, pipenv, ...).

------
heyoni
Congrats on hitting version 1.0! I’ve been putting off completely switching
off of pipenv until this milestone.

~~~
LaundroMat
I'm not being snarky, it's probably just too early in the morning, but where
did you see the project hit v1.0?

~~~
heyoni
I am Le idiot. I saw a version 1 email update but it’s for the prerelease >_<

------
fouc
Does anyone know if there's a name for the "terminal" theme used on the poetry
site?

------
ausjke
what poetry can do that venv can not do? considering venv is the default tool
and is solid and fairly easy to use already?

~~~
dragonwriter
> what poetry can do that venv can not do?

Venv doesn't do dependency management, it just provides an isolated python
environment. By default, poetry uses venv for environment isolation, but also
does dependency management.
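
Concretely, the difference shows up in the workflow; a sketch (assuming Poetry
is installed, package name illustrative):

    
      $ poetry add requests
      # creates/reuses a virtualenv, records "requests" in pyproject.toml,
      # and pins the full resolved dependency tree (with hashes) in poetry.lock
    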

~~~
ausjke
what about pip freeze to requirements.txt for manual but solid dependency
management under venv? pip takes care of dependencies, be it venv or poetry;
requirements.txt records what's needed so you can make your project
reproducible and portable. what else does poetry bring to the table? If there
is a strong selling point I would like to try poetry again.

yarn came out great, but these days I switched back to npm for the sake of
simplicity in my workflow, especially since npm absorbed yarn's good features.

the same thing happened with parcel/webpack: the neat new tool beats the old
players, but then the old dominant one catches up by quickly taking in the
good stuff from its smaller competitors.

