We moved our dependency management from Pipenv to Poetry and have never been happier working with and publishing Python modules.
Now we use pyenv with Poetry to run all our Python projects with the venv inside the project, instead of the location decided by Poetry's default settings. We changed the setting with:
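    # make Poetry create the venv inside the project (./.venv):
    poetry config virtualenvs.in-project true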
After this we added the .venv folder to .hgignore or .gitignore.
Now everything is the way we wanted. With Pipenv we often hit issues where a project worked with requirements.txt but not with Pipenv; we haven't had that issue since moving to Poetry.
Also, with Emacs support and automated deployment tools, Poetry's predictability made it easy for our team to work with and publish Python application modules.
Nice. I do however much prefer keeping venvs out of the source tree, where they don't interfere with `tree` and `grep` and whatnot. I'm curious what benefit you see in having the venv cluttering the workspace?
We set up systemd to start gunicorn or uwsgi in the project venv. When venvs are created inside the project folder, automated deployment of Python software with systemd services or local scripts is easy.
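For example, the unit file can point straight at the in-project venv; a minimal sketch (paths, the service name, and the `myapp.wsgi` module are hypothetical):

    [Unit]
    Description=myapp (gunicorn)
    After=network.target

    [Service]
    WorkingDirectory=/srv/myapp
    # no "activate" step needed: invoking the venv's gunicorn binary is enough
    ExecStart=/srv/myapp/.venv/bin/gunicorn myapp.wsgi:application
    Restart=on-failure

    [Install]
    WantedBy=multi-user.target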
Our developers use macOS, Windows, and Linux for development, and each platform puts the venv in a different location with Poetry's defaults; if we start gunicorn or uwsgi from the wrong one, mistakes happen. With a predictable venv location everyone can work the same way from within the project directory.
If you use a grepping tool like ripgrep it will automatically ignore everything in your `.gitignore`, so at least for that specific case you're covered.
Your app is self-contained in a single folder (not two), and run scripts are simpler (they can use relative paths like `./venv/Scripts/foo`; no need to “activate”)...
Didn't work. I don't have python installed, just python3. Fair enough.
- Using the above method, but replacing `| python` with `| python3`: The installation works. But trying to invoke poetry gives this error:
    /usr/bin/env: ‘python’: No such file or directory
Turns out the poetry executable calls `python` in its shebang.
- Manually changed shebang in `$HOME/.poetry/bin/poetry` to `#!/usr/bin/env python3`: Now poetry runs successfully.
I mean, it works, but it is fragile (the shebang probably needs to be manually reset at every update), not beginner-friendly, and not ergonomic. Installing using pip3 has the same issue. In comparison, pipenv is much easier to install. I was hoping this issue would not be present for poetry v1.0, but unfortunately it is.
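FWIW, the manual reset can at least be scripted; something like this should re-apply the fix after an update (untested sketch):

    # rewrite the launcher's first line (the shebang) to use python3
    sed -i '1s|^#!.*|#!/usr/bin/env python3|' "$HOME/.poetry/bin/poetry"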
Curious what's stopping you from symlinking python3 to python? In fact, I'm surprised the package manager didn't already do that (I thought it did for Debian and its derivatives, including Ubuntu). Does WSL throw a wrench in that or something?
The problem is exactly that you have `python3` but not `python`. The convention is to have both names only if you have both Python 2 and 3 installed; otherwise there should always be a `python` command on your system. If you insist on having only `python3` you will keep running into problems.
My recommendation is to either create an alias for `python3`, or even better use `pyenv`: https://github.com/pyenv/pyenv
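Roughly, either of these (a sketch; version numbers are just examples):

    # option 1: a shell alias, e.g. in ~/.bashrc
    # (helps interactively, but won't fix `#!/usr/bin/env python` shebangs)
    alias python=python3

    # option 2: pyenv, whose shims do provide a `python` command
    pyenv install 3.8.1
    pyenv global 3.8.1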
Poetry tries to help you manage your whole project while flit is just for building your library (and optionally uploading; you can always use twine to do the upload). It's all-in-one versus one-job-one-tool.
I still don’t feel hungry for new dependency management solutions (requirements.txt + virtualenv works reliably for me) but I do like the idea of refactoring setup.cfg and setup.py for simpler PyPI publishing.
While I'm excited that Poetry hit 1.0, I'm actually more excited about the fact more devs were added to the maintenance team and that it transitioned to a project namespace. Sébastien was previously the sole dev with commit access so I'm hopeful this gives the project a much healthier future.
The project org has two (public) members, and the second-highest contributor has 10 commits with <600 lines. It would inspire more confidence if the project were transferred (invited) to pypa some day.
Same here. I got really excited by pipenv, but every single time I've tried it, it takes 10+ minutes to determine the dependencies and build a lock file (see issue 2284 [1]). That makes it unusable for a lot of my use-cases, e.g. setting up a clean environment for exploratory data analysis. Does anyone know if poetry improves in this area?
Yes, with Poetry I haven't encountered any of the slowness in dependency resolution that Pipenv has. It does still feel a bit slow compared to package managers for other languages, but not in a buggy way, so I would attribute it to not being that optimized yet.
Same here. Been using Poetry for a few months, and it's been a blast!
Before that I tried to use pipenv, but it was too buggy and unpredictably slow with installs. I've always had bad experiences with virtualenv etc., and mostly stayed away from Python development because of that. With Poetry that has changed immensely, as I now have the reproducible environments I'm used to from Rust/Ruby/Node.
All that said, I think there is still quite a way to go for Poetry. E.g. I had (hard to reproduce) bugs where, if an install was canceled, I had to purge the whole virtualenv, as it was unable to recover. I'm pretty optimistic about the project as a whole though!
hopefully v1.0 now means it's pretty stable. using poetry so far has been painful; the gains from the improved dependency resolution were unfortunately somewhat offset by crazy annoying bugs, like `poetry shell` not actually activating the virtual environment [0]
    $ poetry env use /usr/local/bin/python3

    [NoCompatiblePythonVersionFound]
    The specified Python version (3.7) is not supported by the project (^3.7).
    Please choose a compatible version or loosen the python constraint specified in the pyproject.toml file.
    $ poetry env use /usr/local/bin/python3

    [NoCompatiblePythonVersionFound]
    The specified Python version (3.7) is not supported by the project (>=3.7).
    Please choose a compatible version or loosen the python constraint specified in the pyproject.toml file.
For whatever reason, I've never understood the point or actual benefit of virtualenvs.
Having lived more in the JS ecosystem for the last several years, my ideal workflow would be a copy of how the Yarn package manager works:
- Top-level dependencies defined in an editable metadata file
- Transitive dependencies with hashes generated based on the calculated dependency tree
- All dependencies installed locally to the project, in the equivalent of a `node_modules` folder
- All package tarballs / wheels / etc cached locally and committed in an "offline mirror" folder for easy and consistent installation
- Attempting to reinstall packages when they already are installed in that folder should be an almost instantaneous no-op
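For reference, the Yarn side of that workflow looks roughly like this (from memory, so details may be off):

    # .yarnrc: commit downloaded tarballs for offline, consistent installs
    yarn-offline-mirror "./npm-packages-offline-cache"

    # top-level deps go in package.json, the resolved tree in yarn.lock
    yarn add lodash
    # on a clean machine, install purely from the committed mirror
    yarn install --offline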
PEP-582 (adding standard handling for a `__pypackages__` folder) appears to be the equivalent of `node_modules` that I've wanted, but tools don't seem to support it yet. I looked through several Python packaging tools over the last year, and none of them did (including Poetry [0]).
The only tool that I can find that really supports PEP-582 atm is `pythonloc` [1], which is really just a wrapper around `python` and `pip` that adds that folder to the path. Using that and `pip-tools`, I was able to mostly cobble together a workflow that mimics the one I want: I wrote a `requirements.in` file with my main deps, generated a `requirements.txt` with the pinned versions and hashes with `pip-compile`, downloaded and cached the packages using `pip`, and installed them locally with `piploc`.
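Roughly, the steps looked like this (flags from memory, so treat it as a sketch):

    # pin exact versions + hashes from the loose requirements.in
    pip-compile --generate-hashes requirements.in
    # download wheels into a local cache folder for offline installs
    pip download -r requirements.txt -d ./wheel-cache
    # piploc (from pythonloc) installs into ./__pypackages__ instead of site-packages
    piploc install --no-index --find-links ./wheel-cache -r requirements.txt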
Admittedly, I've only tried this out once a few weeks ago on an experimental task, but it seemed to work out sufficiently, and I intend to implement that workflow on several of our Python services in the near future.
If anyone's got suggestions on better / alternate approaches, I'd be interested.
> - All dependencies installed locally to the project, in the equivalent of a `node_modules` folder
That's basically what a virtualenv is: the Python version of a node_modules directory. You just need to explicitly activate/deactivate it to move in and out of that environment, instead of it being based on the directory tree. And the requirements file is equivalent to the "dependencies" section(s) of package.json.
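Concretely, the day-to-day analogy is something like:

    python3 -m venv .venv            # ~ creating a project-local node_modules
    source .venv/bin/activate        # "enter" the environment
    pip install -r requirements.txt  # ~ yarn/npm install from package.json
    deactivate                       # leave the environment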
Node and npm are no better at dependency management. I had a project on Node 6.0 refuse to work with Node 8, and npm failed to upgrade it. No different from Python 2.7 to Python 3.
It's just that the browser and JavaScript silence errors, which can hide problems from view. That is not the case with Python: "Errors should never pass silently. Unless explicitly silenced."
Totally agree; node_modules starts to break horribly when you use the wrong node/npm command to access it. This unfortunately causes a lot of misconceptions, because people usually judge tools by short-term experimentation. node_modules works very well in that scenario and gets the most love, until it doesn't: you tear your hair out over those nasty bugs, and sadly can't even convince people with your experience ("it works perfectly for me, I don't know what you're talking about").
I guess the node_modules approach is fine for Node, since the native-binding story is quite bleak and not many people are deep into it anyway. But for Python this would be a complete no-go given how pervasive extension modules are. I don't know what went through Ian Bicking's mind when he first designed virtualenv, but so many aspects of it just reflect the tight cross-language integration Python is so good at (the bin/include/lib structure, copying/symlinking binaries into place, setting up env vars to trick foreign build systems into working, etc.). I wouldn't want it any other way even if I could go back in time and get involved in its original design. Well, maybe a few minor improvements, like offering a more beginner-friendly interface than the activate scripts, or promoting running the binaries directly more heavily, but the design is just… correct.
Does anyone know if the situation with using `poetry` with `pyenv` has gotten any better? I like poetry's approach, but I had a very difficult time getting it to play nice with my system last time I tried.
What's the advantage of using poetry compared to vendoring your dependencies and checking the whl files (which can be manylinux binaries) into your version control system?
Then you can modify your script to insert every whl into sys.path. Add the wheelhouse directory to version control and you get a repo that can easily be reproduced anywhere manylinux is supported.
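A minimal sketch of that sys.path trick (works for pure-Python wheels, since they are just zip archives that zipimport can load from; compiled extensions generally need unpacking first):

    import sys
    from pathlib import Path

    # put every vendored wheel on sys.path before importing anything from them
    for whl in Path("wheelhouse").glob("*.whl"):
        sys.path.insert(0, str(whl))

    import requests  # hypothetical vendored dependency, now importable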
Can you define dependencies loosely and still have a lock file in Anaconda so when you come back to the project a year later you can install the exact same dependencies at the same versions? (Honest question, not familiar with Anaconda.)
You can do both!
Your general file can loosely define dependencies: greater than this version, less than this version, skip this version, etc.
Then you “lock” and it builds a dependency tree with the newest version of everything that satisfies your loose requirements.
Then you can choose to install dependencies from your lockfile, which gives exact frozen versions, OR you can re-run the lock later, which will again attempt to update everything to the latest versions that satisfy your loose constraints.
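e.g. the loose side might look like this in an environment file (illustrative, not from any real project):

    # environment.yml
    name: myproject
    dependencies:
      - python >=3.7,<3.9
      - numpy >=1.17,!=1.17.3   # skip a known-bad release
      - pandas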
Another one: https://github.com/facebookincubator/xar "XAR lets you package many files into a single self-contained executable file. This makes it easy to distribute and install... A popular example is Python application archives that include all Python source code files, as well as native shared libraries, configuration files, other data."
Have a repo for your code, vendor everything, and build a binary including the Python interpreter using a build system that emphasizes reproducibility (e.g. Bazel).
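e.g. with Bazel's py_binary (a sketch; the targets are hypothetical):

    # BUILD
    py_binary(
        name = "app",
        srcs = ["app.py"],
        deps = ["//third_party:requests"],  # vendored dependency target
    )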
Alternatively, you can build a Docker image using Bazel (for reproducibility) and use tools like Droot to chroot into the image and then run it.
Is it possible to build a wheel for a Unix platform on a Windows machine, when the package and some of its dependencies have compiled portions? I'm trying to build a wheel with deps for exchangelib and have been struggling.
I thought the billion ways to sort out dependencies and environments in Python were its messiest part. I've been using a custom little batch script on Windows to streamline that a bit, as I mainly use Python for small utilities as opposed to big projects - https://gist.github.com/leoncvlt/10e67d9415e61eff0f5010ef6fe... - but I'm interested in giving this a spin!
Just to note, dev dependencies in Poetry are still rather flaky vis-à-vis projects forever in pre-release mode, e.g. Black. Consider sticking with Pipenv if you have heavy dev-dependency requirements.
I have been bitten by this. I love Black to death, but I don't understand why they're still tagging their project as beta when it's been so stable over the last year, at least.
Anyway, the solution is simple: put

    black = { version = "*", allows-prereleases = true }

in your pyproject.toml file. Or use `--allow-prerelease` as suggested by others.
Messing around with major/minor Python version compatibility and dependency conflicts has always been my least favorite part of Python. A tool like this is much appreciated!
Big up to dephell (https://github.com/dephell/dephell) which makes it pretty straightforward to convert from one dependency management system to another, while the dust settles.