It also manages the lock file, and resolves dependencies better than pip.
Lastly, it lets you put all metadata in one file, including dev dependencies, then build a wheel.
You can do that with regular setuptools and setup.cfg, though, and it's compatible with pip.
Poetry also updates the files properly at each install, so you don't have to do it yourself. And it can't install something outside of a venv by mistake.
But yes, a requirements file is OK in many cases. I often use them myself.
Thanks for the clarification!
Sure - it depends on what you're doing. Requirements.txt won't work if you're writing a library; you'll need to move (or duplicate) your dependencies into setup.py.
However, the benefit of moving to setup.cfg (not setup.py anymore, which should be almost empty) goes beyond making a lib. It makes deployment easier, and dev as well. Hell, you can even pip install from a git repo with it.
I made a summary of what to use for what, when and how in this comment: https://news.ycombinator.com/item?id=20132303
Meanwhile, the former CTO of NPM released a new (alpha) package management system last week at JSConfEU (the conference where NodeJS was first announced); see https://github.com/entropic-dev/entropic/blob/master/docs/RE...
From the dev user or packager point of view, no.
First, there is no namespacing in JS. Secondly, the import statement standard is not implemented everywhere, so we need a bundler. Thirdly, the network makes everything harder. And finally, JS has virtually no stdlib.
Add to that that we are stuck in npm land, with acute leftpad-itis, packages breaking their public API every Sunday, and a mad hatter packager like webpack as our lord and savior, and you have a recipe for terrible times.
Here is what you need to know: https://news.ycombinator.com/item?id=20132303
Honestly, it's not much. It's enough to live a happy life as a Python dev.
I keep seeing pipenv and poetry and wonder how all these came about? Is it because people come from other languages and want to do this the way they did it in those?
Here is everything you need to know about it in a 7 minute read.
- it uses only pyproject.toml. Despite the current hype around this format, it's not stable, it's an incomplete standard, and it's not well supported by the ecosystem. setup.cfg is a much better alternative in the meantime. In fact, the auto-include features alone make it better.
- there is no nice way to install it. "pip install poetry" is the usual one, and it has many gotchas for beginners, who are the ones who would benefit from poetry the most.
No poetry necessary, but I point out when I use it anyway.
I am stunned when I do composer update, because it basically updates every dependency.
When I asked one of the seniors, he said yeah, I have to check every single part of the website and make sure it doesn't break.
I don't know anymore.
That's what unit/integration/e2e tests are for.
How do I package my Python program for small scale distribution to non-technical people?
The end target is usually Windows so I suggest PyInstaller but it doesn't feel exactly elegant although it is better than py2exe.
Maybe there is some alternative on the horizon?
- skips the explanation of the py command on Windows, the version suffixes on unix, -m, and why you need to install pip on Linux but not on the other OSes.
- doesn't address the various sys.path issues of pip and poetry. Because at some point you need to install poetry.
- ignores the existence of the excellent and simple setup.cfg.
- ignores the consequences of using poetry on IDE setup, tox or CI.
Python packaging is not hard anymore. But the information you get out there is incomplete and assumes some kind of basic sysadmin experience.
Would you tell us which is "the right way to do it" nowadays? Possibly, in a maintainable, kind-of-officially supported way that doesn't change or disappear in a few months?
Please note: I've been using Python professionally since 2005, I've been involved a lot in Python packaging for production apps (including giving some talks on the bad state of Python packaging at EuroPython around 2010), and I closely followed the then-failed distutils2 effort. And I still don't know the "right and easy way to do it".
Packaging, the easy way
Because I'm not on a blog, I can't go into too much detail, and I'm sorry about that. It would be better to take more time on each point, but use them as starting points. I'll assume you know what virtualenv and pip are. If you don't, check a tutorial on them first; it's important.
But I'm going to go beyond packaging, because it will make your life much easier. If you want to skip context, just go to the short setup.cfg section.
1 - Calling Python
Lots of tutorials tell you to use the "python" command. But in reality, several versions of Python are often installed, or worse, the "python" command is not available.
WINDOWS:

If the python command is not available, uninstall Python and install it again (using the official installer), but this time make sure that the "Add Python to PATH" box is ticked. Or add the directory containing "python.exe", and its sibling "Scripts" directory, to the OS system PATH manually (check a tutorial on that). Then restart the console.
Also, unrelated, but use a better console. cmd.exe sucks. cmder (https://cmder.net/) is a nice alternative.
Then, don't use the Python command on Windows. Use the "py -x.y" command. It will let you choose which version of Python you call. So "py -2.7" calls python 2.7 (if installed) and "py -3.6" calls Python 3.6. Every time you see a tutorial on Python telling you to do "python this", replace it mentally with "py -x.y".
UNIXES (mac, linux, etc):
Python is suffixed. Don't just call "python". Call pythonX.Y. E.g.: python2.7 to run Python 2.7 and python3.6 to run Python 3.6. Every time you see a tutorial on Python telling you to do "python this", replace it mentally with "pythonX.Y". Not "pythonX". Not "python2" or "python3". Insist on being precise: python2.7 or python3.6.
pip and virtualenv are often NOT installed with Python, because of packaging policies. Install them with your package manager for each version of Python. E.g.: "yum install python3.6-pip" or "apt install python3.6-venv".
FINALLY, FOR ANY OS:
Use "-m". Don't call "pip", but "python -m pip". Don't call "venv", but "python -m venv". Don't call "poetry", but "python -m poetry". Which, if you follow the previous advice, leads to things like "python3.6 -m pip" or "py -3.6 -m pip". Replace it mentally in tutorials, including this one.
This solves all PATH problems (no .bashrc or Windows PATH fiddling :)) and forces you to state which Python version you're using. It's a good thing.
In any case, __use a virtualenv as soon as you can__. Use virtualenv for everything. One per project. One for testing. One for fun. They are cheap. Abuse them.
In the virtualenv you can discard all the above advice: you can call "python" without any "py -x.y" or suffixes, and you can call "pip" or "poetry" without "-m". Because the PATH is set correctly, and the default version of Python is the one you want.
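To make this concrete, here is a minimal unix session (assuming a "python3" command is available; on Windows, substitute "py -3.6" and ".venv\Scripts\activate"; the ".venv" name is just a convention):

```shell
# Create a venv for the project, then work inside it.
python3 -m venv .venv      # outside the venv: be explicit about the interpreter
. .venv/bin/activate       # on Windows: .venv\Scripts\activate
python --version           # inside: plain "python" is now safe to use
python -m pip --version    # and pip is the venv's own pip
```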
But there are some tools you will first install outside of a venv, such as pew, poetry, etc. For those, use "-m" AND "--user". E.g.:
"python -m pip install poetry --user"
"python -m poetry init"
2 - Using requirements.txt
You know the "pip install stuff", "pip freeze > requirements.txt", "pip install -r requirements.txt" workflow?
It's fine. Don't be ashamed of it. It works, it's easy.
I still use it when I want to make a quick prototype, or just a script.
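As a sketch, the whole workflow looks like this (unix shell assumed; run the last, commented command on the machine where you want to reproduce the environment):

```shell
# Pin the current environment and reproduce it elsewhere.
python3 -m venv .venv && . .venv/bin/activate
python -m pip freeze > requirements.txt    # write exact pinned versions
# On another machine (or in CI), recreate the same environment with:
# python -m pip install -r requirements.txt
```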
As a bonus, you can bundle a script and all its dependencies with a tool named "pex" (https://github.com/pantsbuild/pex):
pex . -r requirements.txt -o resulting_bundle.pex --python pythonX.Y -c your_script.py -f dist --disable-cache
3 - Using Setup.cfg
At some point you may want to package your script, and distribute it to the world. Or maybe just make it pip installable from your git repo.
Let's say you have this layout for your project:

    your_project_dir
    ├── setup.py
    ├── setup.cfg
    └── your_package
        └── __init__.py

Your setup.py stays a one-liner:

    from setuptools import setup; setup()

All the metadata and options go in setup.cfg:

    [metadata]
    name = your_package
    version = attr: your_package.__version__
    description = What does it do ?
    long_description = file: README.md
    long_description_content_type = text/markdown
    author = You
    author_email = email@example.com
    url = https://stuff.com
    # classifiers are not mandatory, but the full list is here:
    # https://pypi.org/pypi?%3Aaction=list_classifiers
    classifiers =
        Intended Audience :: Developers
        License :: OSI Approved :: MIT License
        Programming Language :: Python :: 3.5
        Programming Language :: Python :: 3.6
        Programming Language :: Python :: 3.7
        Topic :: Software Development :: Libraries

    [options]
    packages = your_package
    # or whatever you depend on
    install_requires =
        requests>=0.13

    # stuff you use for dev
    [options.extras_require]
    dev = pytest; jupyter

    # non python files you want to include
    [options.package_data]
    * = *.txt, *.rst, *.jpg
    hello = *.msg
Setup.cfg has been supported for 2 years now. It's supported by pip, tox, and all the legacy infrastructure.
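As a bonus, setup.cfg can also declare a command-line entry point via setuptools' entry_points support. The names below are illustrative: it assumes your_package/cli.py exposes a main() function, and "your_command" becomes available on the PATH after install:

```ini
[options.entry_points]
console_scripts =
    your_command = your_package.cli:main
```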
Now, during dev you can do "pip install -e root_project_dir". This will install your package, but the "-e" option makes it work in "dev mode", which allows you to import it and see the modifications you make to the code without reinstalling it every time. "setup.py develop" works too.
If you publish it on github, you can now install your package doing:
pip install git+https://github.com/path/to/git/repo.git
You can also build a wheel:

    python setup.py bdist_wheel
Anybody can then "pip install your_package.whl" to install it. Mail it, upload it on an ftp, slack it...
If you want to upload it on pypi, create an account on the site, then "pip install twine" so you can do:
twine upload dist/*
You could use "python setup.py bdist_wheel upload" instead of twine. It will work, but it's deprecated.
4 - Using pew
Pew (https://github.com/berdario/pew#usage) is an alternative to venv, poetry, virtualenvwrapper and pipenv.
It does very little.
"pew new env_name --python python3.X"
"pew workon env_name"
That's all. It's just a way to make managing virtualenv easier. Use pip as usual.
You can know where your virtualenv has been created by looking up the $VIRTUAL_ENV var.
This is especially useful for configuring your IDE, although I tend to just type "which python" on unix, and "where python" on Windows.
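For instance, a quick way to see it (unix shell assumed; ".venv" is just the conventional name):

```shell
# The activate script exports VIRTUAL_ENV.
python3 -m venv .venv
. .venv/bin/activate
echo "$VIRTUAL_ENV"    # absolute path to the venv root
which python           # points inside $VIRTUAL_ENV/bin
```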
5 - Using poetry
Now, if you need more reliability, poetry enters the game. Poetry will manage the virtualenv for you, will install packages in it automatically, will check all dependencies in a fast and reliable way (better than pip), will create a lock file AND update your package metadata file.
I'm not going to go into details on how this works; it's a great tool, with good docs: https://github.com/sdispater/poetry
You don't need to start with poetry. You can always migrate to it later. Most of my projects do the "requirements.txt" => "setup.cfg" migration at some point. Some of them move to poetry if I need the extra professionalism it provides.
The problem with poetry is that it's only compatible with poetry. It uses the pyproject.toml format, which is supposedly standard now, but is unfinished. Because of this, any tool using it, including poetry, actually stores most data in custom proprietary fields in the file :( It's also not compatible with setuptools, which many infrastructures and tutorials assume. So you'll have to adapt to it.
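To illustrate, a poetry-managed pyproject.toml of that era keeps everything under poetry's own [tool.poetry] tables rather than standard fields (the names and version constraints below are just examples):

```toml
[tool.poetry]
name = "your_package"
version = "0.1.0"
description = "What does it do?"
authors = ["You <email@example.com>"]

[tool.poetry.dependencies]
python = "^3.6"
requests = "^2.22"

[tool.poetry.dev-dependencies]
pytest = "^4.6"

[build-system]
requires = ["poetry>=0.12"]
build-backend = "poetry.masonry.api"
```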
That being said, it's a serious and robust tool.
6 - If you want to compile Python to an exe, use nuitka (http://nuitka.net/), not py2exe, cx_freeze and co.
Nevertheless, if you want to have one, just add this to setup.cfg
I will have to open an English one, I guess.
The way they forced a half-baked pyproject.toml format to be a standard and refused to hear any complaints, all while an open format had existed and worked for 2 years, was not very professional.
The author states that poetry is more popular than pipenv, is that true?
I did a quick comparison:
https://github.com/sdispater/poetry watchers: 77 stars: 4,690 forks: 318
https://github.com/pypa/pipenv watchers: 348 stars: 17,207 forks: 1,268
I don't know if the author's statement 'Most people seem to prefer Poetry.' is true.
I wouldn't say more popular, but it's certainly gaining traction in a relatively short time, and I prefer it. I haven't used it in anger, but it certainly seems faster than Pipenv.
 - https://vorpus.org/blog/why-im-not-collaborating-with-kennet...
 - https://www.reddit.com/r/Python/comments/8kjv8x/a_letter_to_...
edit: sorry I misunderstood, the creator of pipenv doesn't hate pipenv
The consistent problem with all these systems is that when (not if) an install fails, figuring out what to do is very difficult.