
Python's New Package Landscape - teajunky
http://andrewsforge.com/article/python-new-package-landscape/
======
whitehouse3
While pipenv has garnered a lot of attention and praise for ease of use, it
falls over whenever I integrate it with any serious work. Pipenv lock can take
20-30 minutes on a small flask app (~18 dependencies). And it often mixes up
virtualenvs, enabling the wrong one with seemingly no remedy. I see the
problems on Windows, MacOS and Ubuntu. 2018 is not the year of pipenv, for me.
I'm sticking with regular virtualenvs and the manual-hell of requirements.txt.
I hope it gets better eventually.

~~~
prepend
Am I weird that I use anaconda instead of virtualenvs? I guess it’s overkill
if you aren’t using the other conda features.

~~~
Insanity
I tend to only use anaconda for "data science work" and not for my small side
projects.

I "feel" like it is overkill to use anaconda for things unrelated to 'data
science' and the likes, but I'm not sure why I feel that way.

It kind of makes sense to use it for other projects as well, since you don't
have to pull in everything conda offers.

~~~
azag0
For one, as a package developer, publishing a source distribution of a package
on PyPI is almost trivial. Publishing on Anaconda Cloud requires you to build
the binary packages for all the OSes you want to support (and for all the
Python versions you want to support), which most people delegate to some CI.
So there is a whole new level of complexity involved.

~~~
xapata
You can use pip install from within a conda environment. Conda isolates better
than virtualenv does, IMHO.
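
A rough sketch of that workflow (the environment and package names here are
just examples):

    conda create -n myproject python=3.6   # isolated env, including the interpreter itself
    conda activate myproject
    conda install numpy                    # binary-heavy deps from conda channels
    pip install some-pure-python-lib       # anything not packaged for conda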

------
cik
We just went through this cycle - ultimately we build packages (debs) and
Docker images, for deployment within VMs. Our build process - depending on the
component - either pushes the deb to repos or uses the deb in the Docker image.

After trying to replace pip with Pipenv, we had to stop. The dependency
resolution time for 20 declared dependencies (which in turn pull down > 100
components) takes well over 5 minutes. With Poetry, it takes less than 33
seconds on a clean system. The times are consistent for both Ubuntu 16.04 and
Mac OS X.

Our only goal was to get to the point we're at now: tracking dependencies, and
separating dev requirements (like ipython and pdbpp) from our other
requirements. Poetry made it fast and simple, and made me an addict.
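
For reference, a minimal pyproject.toml along the lines Poetry expects, with
the dev/runtime split mentioned above (names and version ranges are
illustrative):

    [tool.poetry]
    name = "myapp"
    version = "0.1.0"
    description = ""
    authors = ["Your Name <you@example.com>"]

    [tool.poetry.dependencies]
    python = "^3.6"
    flask = "^1.0"

    [tool.poetry.dev-dependencies]
    ipython = "*"
    pdbpp = "*"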

Over two days, I moved our entire codebase and every single (active) personal
project I had to poetry. I don't regret it :)

~~~
abalaji
I can second this... Poetry is a seriously good project and follows the PEP
standard for pyproject.toml; you will definitely see more projects jumping
onto that standard soon.

~~~
ihumanable
Sébastien Eustace is a really awesome developer. I've used Pendulum and Orator
as well as Poetry for projects; the documentation is beautiful and complete,
the APIs are well thought out, and if you find an issue, contributing back to
the project is simple and straightforward (I'm happy to have had a few PRs
accepted into Orator).

There's a certain joy in working with tools when it's clear that the person
making them actually cares about the developer and about making things work
well.

------
scrollaway
Published May 11th, 2018, but it's interesting that it's popping up again.
It's a good explanation of the landscape as of 2018, though Pipenv has since
gone in a weird direction. There are a lot of recommendations for it, but I
sometimes get the feeling people don't understand what they're recommending,
such as replacing things that work (setup.cfg) with things that don't do the
same thing (Pipfile).

Man, the Python packaging ecosystem is one of those things that really brings
me down regarding the state of Python, because there is such an extremely high
barrier to breaking backwards compatibility _and_ nothing really works.

The JS ecosystem is far better in this regard. Pipenv was most promising
because it followed in Yarn's footsteps, but it didn't go all the way in
replacing pip (which it really should have). So now there's still a bunch of
stuff handled by pip, which pipenv does not / cannot know about, and this
isn't really _fixable_.

The end result is that instead of telling people about pip + virtualenv, we
now have pip, virtualenv and pipenv to talk about. And people who don't
understand the full stack, and the exact role of each tool, can't really
work out how to properly do the tasks we recommend delegating to each of them.

There are three separate-but-related use cases (rough commands for each are
sketched after the list):

\- "Installing a library" (npm install; pip install).

\- "Publishing a library" (setup.py. Or Twine if you're using a tool. Both use
setuptools.).

\- "Deploying a Project", local dev or production (pipenv. Well, if it's
configured with a pipfile, otherwise virtualenv, and who knows where your
dependencies are, maybe requirements.txt. Pipenv does create a virtualenv
anyway, so you can use that. Anyway you should be in docker, probably. Make
sure you have pip installed systemwide. Yes I know it comes with python, but
some distributions remove it from Python. Stop asking why, it's simple. What
do you mean this uses Python 3.6 but there's only Python 3.5 available on
Debian? Wait, no, don't install pyenv, that's not a good idea! COME BACK!)
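
Very roughly, the commands behind each of those three cases, as of 2018
(project names and entry points are illustrative):

    pip install requests                         # installing a library
    python setup.py sdist bdist_wheel            # publishing a library, step 1: build
    twine upload dist/*                          #                      step 2: upload
    pipenv install && pipenv run python app.py   # deploying/running a project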

The JS ecosystem manages to have two tools, both of which can do all of this.
I don't know how we keep messing up when we have good prior work to look at.

~~~
acdha
> \- "Deploying a Project", local dev or production (pipenv. Well, if it's
> configured with a pipfile, otherwise virtualenv, and who knows where your
> dependencies are, maybe requirements.txt. Pipenv does create a virtualenv
> anyway, so you can use that. Anyway you should be in docker, probably. Make
> sure you have pip installed systemwide. Yes I know it comes with python, but
> some distributions remove it from Python. Stop asking why, it's simple. What
> do you mean this uses Python 3.6 but there's only Python 3.5 available on
> Debian? Wait, no, don't install pyenv, that's not a good idea! COME BACK!)

This makes the situation sound a lot more complex than it actually is by
conflating separate layers: the system distribution issue is exactly the same
for both Python and JS (if Debian ships an old v8 you need to install a new
one, perhaps using Docker to make that easy and isolated). Similarly, the
question of whether you install the app using pip or pipenv is a different
layer from whether you're using Docker or not, just as Docker is unrelated to
the question of whether you use npm or yarn.

For a new project in 2018, you can simply say “Use pipenv. Deploy in Docker,
using pipenv.” and it works as well as the JS world. People sometimes choose
to make their projects too complicated or to manage things at the wrong level
but that's a social problem which is hard to solve with tooling.
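
As a sketch, assuming a Pipfile and Pipfile.lock at the project root (the
entry point is illustrative), that Docker deployment can be as small as:

    FROM python:3.6-slim
    RUN pip install pipenv
    WORKDIR /app
    COPY Pipfile Pipfile.lock ./
    RUN pipenv install --system --deploy   # install locked deps into the image's Python
    COPY . .
    CMD ["python", "app.py"]               # entry point is illustrative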

~~~
erichurkman
One difference: large swaths of Python developers grew up using the system-
provided version of Python.

Most Node developers grew up with

    curl -o- https://raw.githubusercontent.com/creationix/nvm/v0.33.11/install.sh | bash

or

    brew install node

or using one of the dozen other ways to install node. Distinct versions and
per-project packages were the norm from day one. That was not true with
Python.

------
andybak
In case this scares any new users, I've used nothing more than pip and
virtualenv for several years with no issues of note.

~~~
philosopherlawr
There are lots of errors when it comes to reproducing the build on other
machines. pip install -r requirements.txt does not guarantee that you will
install the same versions of packages on a new machine, and in fact you
typically will not.
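
The difference in practice, with illustrative version numbers; the first form
can resolve to different versions over time, the second cannot:

    # unpinned requirements.txt -- whatever is newest at install time
    flask
    requests

    # pinned requirements.txt -- reproducible installs
    flask==1.0.2
    requests==2.19.1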

~~~
jabwork
I've never had problems with this when requirements.txt contains package
versions.

Have you? Or are you not using explicit versions, e.g. as supplied by pip
freeze?

------
sambe
Whenever talk in Python-world goes towards packaging, I feel like I have been
transported to Javascript-world: it's never clear to me what concrete problems
are being solved by the new tools/libraries.

This article seems well-written and well-intentioned. Despite reading it, I
don't know why I would not have loose dependencies in setup.py and concrete,
pinned dependencies in requirements.txt. It's never felt hard to manage or to
sync up - the hard part is wading through all the different tools and
recommendations.
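
For the record, the setup being described looks roughly like this (names and
versions are illustrative):

    # setup.py -- loose ranges for anyone installing the package
    from setuptools import setup, find_packages

    setup(
        name="myproject",
        version="0.1.0",
        packages=find_packages(),
        install_requires=["flask>=1.0", "requests"],
    )

with the exact pins (flask==1.0.2, requests==2.19.1, and all their transitive
dependencies) living in requirements.txt as the output of pip freeze.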

~~~
lmm
> loose dependencies in setup.py

How does that work? How would someone else coming to work on your project use
them?

> concrete, pinned dependencies in requirements.txt

How do you maintain that requirements.txt? And while that might work for
applications, what do you do for libraries?

~~~
sambe
I assume that someone working on the project would do:

    pip install -e .

in a virtual environment. I thought this was quite well-established. Is there
a problem with it that I'm not aware of?

    pip freeze > requirements.txt

for requirements.txt generation. For libraries just omit this? I'm not sure I
understand the question. The article also mentions that several of the new
tools aren't appropriate for libraries anyway.

~~~
lmm
> I assume that someone working on the project would do: pip install -e . in a
> virtual environment. I thought this was quite well-established. Is there a
> problem with it that I'm not aware of?

So ignoring your requirements.txt, and potentially working with different
versions of dependencies from the ones you were working with and encountering
different bugs?

(Also managing your virtual environments "by hand" is tedious and error-prone
when you're working on multiple projects).

> pip freeze > requirements.txt for requirements.txt generation.

The problem with this is that it's not reproducible - if two people try to run
it they might get different results, and it's not at all obvious who should
"win" when the time comes to merge. If you mess up the merge and re-run then
maybe you get a different result again, and have to do all your testing etc.
over again.

> For libraries just omit this?

Maybe, but then you'll face a lot of bug reports from people who end up
running your library against different versions of upstream libraries from the
ones that you tested against.

~~~
sambe
People working on your project have the choice of using the requirements.txt
or not. I would think core developers use the loose dependencies, with the aim
of testing the latest and fixing the bugs. Someone has to move dependencies
forward at some point, and doing this locally for knowledgeable people seems
reasonable. CI should definitely - and part time contributors should probably
- just use the pinned dependencies.

This is why I would not worry about pip freeze being non-reproducible. It is a
manual step: upgrade our dependencies. Testing should happen all the time. If
you are happy with the result of testing after upgrading dependencies, commit
requirements.txt. I don't see new tools easing the burden of co-ordinating and
testing dependency upgrades. Did I misunderstand them in this context?

I don't understand the concern for the library case. Pipenv doesn't address
libraries. It seems to be an explicit goal of many people not to pin library
dependencies. I'm asking what the new tools are solving - and again I can't
see that they are solving this. Nothing is preventing you from pinning your
library dependencies if you want (using old tools) but you'll probably get
people complaining about being incompatible with other projects.

~~~
lmm
> I would think core developers use the loose dependencies, with the aim of
> testing the latest and fixing the bugs. Someone has to move dependencies
> forward at some point, and doing this locally for knowledgeable people seems
> reasonable.

Agreed that developers should be moving the dependencies forward, but you want
to do that as a deliberate action rather than by accident. E.g. if you want to
consult another developer about a bug you're experiencing, you want them to be
on the same versions of dependencies as you.

> This is why I would not worry about pip freeze being non-reproducible. It is
> a manual step: upgrade our dependencies. Testing should happen all the time.

It's a manual step, but you still want to be able to reproduce it. E.g. if a
project is in maintenance mode, you want to be able to do an upgrade of one
specific dependency without having to move onto new versions of everything
else.

I don't work in Python any more so I don't know what the new tools do or don't
do, I was just starting from your "I don't know why I would not have loose
dependencies in setup.py and concrete, pinned dependencies in
requirements.txt." and I know that workflow gave me a number of problems that
I simply don't have when working in other languages. So I'm hoping that Python
has caught up with the things that are known-working elsewhere, but maybe not.

------
Bogdanp
Having used pure pip + virtualenv{,wrapper}, pip-tools + virtualenv, poetry
and Pipenv for medium to large applications, I'm going to be sticking to pip-
tools for the time being for apps. Poetry is fine, but pip-tools is faster and
there's less to learn. Pipenv is unbearably slow for large applications and
often buggy.
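
For anyone unfamiliar, the pip-tools workflow is roughly: declare top-level
dependencies in requirements.in, compile them to exact pins, then sync the
virtualenv (package name illustrative):

    echo "flask" >> requirements.in
    pip-compile requirements.in             # writes a fully pinned requirements.txt
    pip-sync requirements.txt               # make the virtualenv match it exactly
    pip-compile --upgrade-package flask     # later: re-pin just one dependency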

For libraries, I've been using Poetry for molten[1] and pure setuptools for
dramatiq[2] and, at least for my needs, pure setuptools seems to be the way to
go.

[1]: [https://github.com/Bogdanp/molten](https://github.com/Bogdanp/molten)

[2]:
[https://github.com/Bogdanp/dramatiq](https://github.com/Bogdanp/dramatiq)

~~~
epage
The problem I ran into with pip-tools (and I assume pipenv has this too) is
that the lock files are platform- and Python-version-specific (they evaluate
all of the conditionals for the given platform), whereas I do a lot of stuff
cross-platform.

------
phren0logy
I'm not a programmer by trade, but I dabble, and these issues make it much
less fun.

In my limited experience, Clojure's Leiningen is a far more pleasant way to
solve these problems. I'm sure there are many other examples in other
languages, but in the few I've used, nothing comes close. Each project has
versioned dependencies, and they stay in their own little playground. A REPL
started from within the project finds everything. Switch directories to a
different project, and that all works as expected, too. It's a dream.

[https://leiningen.org/](https://leiningen.org/)

------
CMCDragonkai
I've tried a lot of solutions, but nix-shell is hands down the best I've used.
I wrote a little gist detailing how to develop in Python using Nix:
https://gist.github.com/CMCDragonkai/b2337658ff40294d251cc79d12b34224

------
samwillis
Not really packaging, but related: my favourite new tool is Pyenv
([https://github.com/pyenv/pyenv](https://github.com/pyenv/pyenv)); it made
getting a new laptop set up with various versions of Python so much quicker.

I haven't used Pipenv yet, but it works with pyenv to create virtual envs with
a specified Python version as well as all the correct packages.
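
Typical pyenv usage, for reference (version numbers are illustrative):

    pyenv install 3.6.6      # build and install that interpreter under ~/.pyenv
    pyenv install 2.7.15
    pyenv local 3.6.6        # pin this directory to it (writes .python-version)
    pipenv --python 3.6      # pipenv can then pick up the pyenv-provided interpreter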

~~~
rhizome31
Polyglots might also be interested in asdf
([https://github.com/asdf-vm/asdf](https://github.com/asdf-vm/asdf)). It's
like Pyenv but supports various languages via a plugin system (eg.
[https://github.com/danhper/asdf-python](https://github.com/danhper/asdf-python)).
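
asdf usage looks much the same (version numbers are illustrative):

    asdf plugin-add python
    asdf install python 3.6.6
    asdf local python 3.6.6    # writes a .tool-versions file for the project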

~~~
Bogdanp
For those wondering (like I did): it doesn't seem like this has any relation
to Common Lisp's ASDF.

------
patagonia
Imagine I’m just getting started with Python, and I see this article. I think
to myself, “Awesome, a primer!”

Then I start reading (these comments)... mayyybe I should try Julia... or
anything else, at least while I’m still getting started.

~~~
Serow225
Does anyone know why the Python community has seemed to struggle with package
mgmt fragmentation/churn so much over the years, compared to other languages?
Did Guido just not really care about package mgmt?

~~~
xapata
Guido wrote Python in 1990. Did you use a package manager back then? I didn't.

Part of the issue is how well Python integrates with non-Python dependencies.
Before conda, when I wanted to upgrade some Python projects, I'd get errors
complaining about my Fortran compiler. These days, most of the major projects
upload precompiled binaries for major platforms to PyPI, but when it was just
source code...

~~~
Serow225
Sure, but couldn't it have been given some BDFL/high-level leadership
importance in the last five years to rein in the craziness?

~~~
jonafato
The PyPA team has done a lot over the past five years. The changelog for pip
([https://pip.pypa.io/en/stable/news/](https://pip.pypa.io/en/stable/news/))
contains quite a bit, PyPI was migrated to Warehouse, and there have been
several PEPs focused on improving the packaging situation. A lot of these
ideas come from various people in the community and get formalized as official
recommendations or tools, but these things take time, especially accounting
for backward compatibility in an ecosystem as large and mature as Python's.

The short answer to "why isn't this solved?" is "it's hard, and there's a lot
to do". Development practices change over time, and the tooling continues to
evolve with them. It's easy to see a broad survey like this and think that
there's too much going on, but taken at a high level, the space is definitely
trending in the right direction.

(Note: I'm not part of the PyPA, but I'm interested in this area and try to
follow along from the outside.)

~~~
Serow225
Understood. I guess I'm wondering why it hasn't been possible to cull more of
the less-successful attempts, or at least make it obvious to newer users what
is legacy. As an outsider/newer person to Python, the number of package mgmt
options to consider is vast and confusing; it would be helpful if there were
one (or a few) more "blessed" solutions :)

~~~
toyg
_> the number of package mgmt options to consider is vast and confusing_

Part of the issue is due to the success of Python in very different niches.
The likes of Rails or Node can concentrate on specific ecosystems, which
account for the bulk of their users and have a limited set of scenarios they
have to support; whereas Python users range from sysadmin to data-crunching to
web to desktop development to games to...

So each packaging tool comes with certain ideas, usually a result of the
author's experience; maybe they work very well in this or that scenario, but
then they break badly on others and sizeable chunks of the community revolt.
So a new tool comes around and the cycle starts again, but now people also
want compatibility with the old tool.

I suspect part of the solution will require splits between niches. It already
happened with Anaconda, which has basically become the standard in a
particular group of users (academia / data science). Since that came around,
lamentations about building C libraries have substantially decreased (to be
fair, the arrival of precompiled wheels on PyPI also helped). Some similarly-
specialized tool might eventually emerge as standard for other niches.

Python developers are cats and they are pretty hard to herd at the best of
times, which is unsurprising -- who would stick around a language that is
almost 30 years old and was never promoted by any major vendor? Only hard-
headed fools like myself.

------
breatheoften
I’m looking forward to a future where I no longer have to use languages that
require different mechanisms to reference functionality from library code than
the ones used to reference your own source ... all the incidental complexity
around custom compilation processes is, in reality, just an enormously
non-productive relic of the past.

In the future — you have a set of entry points to your program, these are
crawled by the language aware tool chain to identify and assemble all the
requirements for the program (including 3rd party functionality). There’s no
need for separate tools to manage packages, caches, and virtual environments —
let’s just put all this logic into the compiler(s) — where necessary let the
application describe the necessary state of the external world and empower
language toolchains to ensure that it’s so ... let’s live in the future
already ...

------
epage
The biggest problem I have with Python packaging tools is how to start using
them. I'd rather not install all of them in my global site-packages. Do I need
to create a virtualenv just to get a tool to manage my virtualenvs?

I have seen that Poetry is working on its bootstrapping story. I could not get
their current solution to work on Ubuntu. Maybe what they are developing will
work.

[https://github.com/sdispater/poetry/issues/342](https://github.com/sdispater/poetry/issues/342)

~~~
blattimwind
pip install --user
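
i.e., something along the lines of (assuming ~/.local/bin is on your PATH,
which is where user-installed scripts land on Linux):

    pip install --user pipenv
    export PATH="$HOME/.local/bin:$PATH"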

~~~
epage
Doesn't that just install into a global-to-the-user path? Isn't one of the
things we're trying to avoid conflicts between these tools? For example, pip
is now more freely breaking compatibility, so I need to ensure that my
different packaging tools are compatible with the version of pip installed in
my user location.

~~~
mixmastamyk
Yes, and that's fine for most folks. The one exception I make is for a huge
app I develop at work; I use a virtualenv for that to keep it separate.

------
hultner
I've migrated to pipenv in most of my projects. It's simple and great for
application development, but I still write everything to work with pure pip as
well, so the Pipfile basically lists my application as a dependency and I
mainly use it for the lock files.
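
That setup, roughly: the Pipfile just points at the local package, and the
real dependency list stays in setup.py (the package name is illustrative):

    [[source]]
    url = "https://pypi.org/simple"
    verify_ssl = true
    name = "pypi"

    [packages]
    myapp = {path = ".", editable = true}

    [dev-packages]
    pytest = "*"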

For library development I target pure pip/setuptools but still use pipenv
during the development phase. There have been a few cases where pipenv had
problems and I had to either remove my virtualenv and reinitialize it or even
remove my Pipfile/lockfile, but since I still have my setup.py it's not a big
deal for me.

As for uploading etc., I use twine, but I wrap everything in a makefile to
make handling easier.

A problem I noticed recently was a case where one of my developers used a tool
that was implicitly installed in the testing environment (it was a
subdependency of a testing tool) but was not installed into the production
image. This resulted in "faulty" code passing CI/CD and getting automatically
deployed to the live development environment, where it broke (so it never
reached staging). It caused a little bit of a headache before I found the
cause.

------
antpls
No mention of containers? I haven't written Python code in a while now, but it
would have been nice to have a comparison with container technologies, which
weren't available when PyPI and pip were created. Containers solve both of the
problems in the article, isolation and repeatability, for any language. Are
virtualenv tools still needed in the container era?

~~~
bovermyer
Containers would seem to solve the isolation part of the problem, but
dependency management is not something containers can deal with effectively.

~~~
antpls
Big monolithic Python projects will face dependency issues for sure. However,
software structured into simpler, smaller components, using the right
languages for the right tasks, will probably have simpler dependencies for
each module.

That's what Go and tools like Bazel allow for: static builds, which force you
to modularize the project into smaller independent components.

In case of static builds, the protocol between components is the C ABI, or an
RPC protocol, but it could be a mesh of microservices too.

What is currently happening with the explosion of tools in Python is the
result (take it with a grain of salt, it's only my opinion) of people _only_
working with Python and not exploring enough outside of it.

~~~
aprdm
Can you give practical examples of what you are mentioning? Like, how to
achieve "static builds" with Python? Got me interested!

~~~
xapata
Dropbox does this for their Windows client, I think.

------
xapata
No mention of Anaconda?! How strange. I recommend using `conda` instead of
virtualenv, and instead of pip where possible.

A Python project does not only depend on Python modules, but on non-Python
dependencies as well. Beyond Python, conda helps manage those other
dependencies, like your database. I use Miniconda instead of Anaconda to avoid
the initial mega-download.
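
A typical environment.yml capturing both kinds of dependencies (names are
illustrative):

    name: myproject
    channels:
      - defaults
    dependencies:
      - python=3.6
      - numpy
      - postgresql        # a non-Python dependency managed by conda
      - pip
      - pip:
        - some-pypi-only-package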

------
binalpatel
I find conda far and away the best tool to manage Python packages and
dependencies. Being able to concisely contain all Python and binary
dependencies together is invaluable.

I recently wrote a blog post about it; using conda within containers has
solved almost every pain point we had with Python packaging and with getting
things into production reliably.

~~~
orf
Counterpoint: it's terrible. The lack of a lockfile is a killer, plus it
doesn't really fit well with the general ecosystem (it's not really a _Python_
dependency manager). It's an all-or-nothing tool, which sucks, to be honest.

Now that most projects have wheels, pip is pretty damn good.

The conda CLI is also just terrible. It's good for ad-hoc research, but for
big deployments? No thanks, I've had enough pain using it.

------
azag0
For pure-python library projects, I found Poetry the best option these days
(haven't tried Hatch). But it is still heavily under development, so it's not
necessarily a black-box solution.

The biggest pain point of Pipenv for me is that it cannot as yet selectively
update a single dependency without updating the whole environment.
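
(Poetry, for what it's worth, handles this with a per-package update; the
package name is illustrative:)

    poetry update requests   # re-resolves and re-locks only this package and its deps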

------
erikb
If you only think in Python, then packaging really might be a pain point. But
honestly, if we look at other languages, it's not so bad. I'm very
specifically calling out Golang here, because it claimed exactly this topic as
an initial design goal and to this day has basically failed at delivering it.

There are even languages like C++ where the community as a (w)hole has given
up on the topic and instead opts for building every tool by manually building
up the underlying libraries first.

Considering all this, who can actually beat Python at this point? Java maybe?
Is Ruby still competing? How is NodeJS doing?

Currently with what I see around me (mostly Go and C++) I don't feel too bad
about setuptools+pip+virtualenv anymore.

~~~
johnlinvc
Ruby's bundler is pretty neat. It's one of the first systems that introduced
lock files.

------
dlitvakb
I have been using pure setuptools for all of our open source Python libraries
at Contentful, but lately I've been getting deprecation warnings from PyPI
telling me not to use `setup.py upload` anymore.

What should the alternative be now?

Edit: I'm reading about twine right now, but I cannot begin to comprehend why
it's not bundled directly if this is what they are intending for us to use to
upload packages.

~~~
di
Hello, I'm the person who deprecated `setup.py upload`. The warnings should be
telling you that `twine` is the preferred tool for uploading.

The reason for this is that right now, that command comes from `distutils`,
which is part of the standard library. There is a huge disadvantage to
bundling this functionality with your Python distribution, namely that it can
only get upgraded when you upgrade your Python distribution. A lot of folks
are still running versions of Python from several years ago, which is fine,
but it means that they are missing out on anything new that's been added in
the meantime.

For example, earlier this year, we released a new package metadata version
which allows people to specify their package descriptions with Markdown. This
required a new metadata field, which old versions of `distutils` know nothing
about.

Upgrading `distutils` to support it would require that these changes go
through the long process of making it into a Python release, and even then
they would only be available to folks using the latest release.

Moving this functionality from `distutils` to a tool like `twine` means that
new features can be made available nearly immediately (just have to make a
release to PyPI) and that they're available to users on any Python
distribution (just have to upgrade from PyPI).

The `distutils` standard library module comes from a time when we didn't have
PyPI and thus, didn't have a better way to distribute this code to users. We
have PyPI now though, so bundling `distutils` with Python is becoming less and
less useful.
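
The twine-based flow, for reference:

    pip install --upgrade twine
    python setup.py sdist bdist_wheel   # build the distributions
    twine upload dist/*                 # upload them to PyPI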

~~~
1wd
Why not bundle twine like pip? In fact, why not merge the twine functionality
into pip?

~~~
di
> Why not bundle twine like pip?

The `pip` package is not actually bundled with your Python distribution;
instead, the standard library has `ensurepip`, which provides a means of
bootstrapping a `pip` installation without `pip` itself. See [0].

> In fact, why not merge the twine functionality into pip?

This has been considered and still might happen, see [1], specifically the
comment at [2].

[0]
[https://docs.python.org/3/library/ensurepip.html](https://docs.python.org/3/library/ensurepip.html)

[1] [https://github.com/pypa/packaging-problems/issues/60](https://github.com/pypa/packaging-problems/issues/60)

[2] [https://github.com/pypa/packaging-problems/issues/60#issuecomment-369238296](https://github.com/pypa/packaging-problems/issues/60#issuecomment-369238296)

~~~
1wd
> The `pip` package is not actually bundled with your Python distribution

It is bundled, as mentioned in the link [0] you posted: "pip is an independent
project with its own release cycle, and the latest available stable version is
bundled with maintenance and feature releases of the CPython reference
interpreter."

> the standard library has `ensurepip`

Ensurepip is for Python distributions, which are supposed to use it
automatically to provide the bundled pip. See [3]: "Ensurepip is the mechanism
that Python uses to bundle pip with Python." Basically it's the installer of
the bundled pip. At least that's how I understand it.

> This has been considered and still might happen, see [1]

Note that while the users there all basically say the same thing (twine should
be merged into pip as "pip publish"), the (two out of three) PyPA devs say it
"would be a major mistake" and that they are "against adding pip publish".
(Before starting off-topic rants against poetry...) I somehow doubt this will
improve soon.

[3] [https://mail.python.org/mm3/archives/list/distutils-sig@python.org/thread/5L63NSCWKPPGVL7HKBAZGYPV2VDHYA5P/](https://mail.python.org/mm3/archives/list/distutils-sig@python.org/thread/5L63NSCWKPPGVL7HKBAZGYPV2VDHYA5P/)

------
jimbo1qaz
Pipenv is unusable for me, since launching your app only works when your
current working directory is the Pipfile directory. If you want to launch an
app via a shell script from another directory, you have to first cd to the
Pipfile dir and then pipenv shell (maybe you can pass in a second shell script
as an argument).
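
i.e., a launcher script ends up looking something like this (the path and
entry point are illustrative):

    #!/bin/sh
    cd /path/to/project || exit 1   # pipenv needs to find the Pipfile from the cwd
    exec pipenv run python app.py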

The article mentions Pipsi is designed to make command-line apps globally
accessible, and I'll try it out.

Additionally, a layout like git/src/package/module.py may be fine when you're
using an IDE, but when browsing in a file manager you must navigate 3
directories deep to even see any source files, which seems to be trending
towards the inconvenience and pain of Java projects.

------
ericcholis
I gave conda a shot and found it to be better than pip + virtualenv, but still
not amazing.

~~~
fifnir
I did the same and found pip + venv to be much superior

------
EamonnMR
We tried Pipenv last year and ran into a number of bugs, this one being the
most irritating:

[https://github.com/pypa/pipenv/issues/786](https://github.com/pypa/pipenv/issues/786)

------
reilly3000
Why is it that Python is geared towards installing packages at the
site-packages level by default, while npm, composer, et al. tend towards
including packages in the project's folder? Is it a convention from a time
when disk space was less plentiful?

~~~
gary_bernhardt
"Why" is hard, but Python's packaging system was created when global
installation was just how packaging worked. There may have been exceptions,
but the first local package installation tool I knew of was workingenv (2006).
It was the predecessor of virtualenv, which I think led directly to pipenv.

------
luord
I rarely even use requirements.txt and never use it in my personal projects.

I just pin the project's direct dependencies in the setup.py file and install
the folder directly. I know it might cause bugs with different developers (or
the CI) using different versions of the upstream dependencies but I guess I
trust the developers who create each library I'm using. The moment I directly
import something from what used to be an upstream dependency, I pin it too.

So far this approach hasn't given me trouble, but I'll still take a look at
poetry based on what I read in the comments here.

------
Gaelan
Huh, OpenDNS blocks this due to "a security threat that was discovered by the
Cisco Umbrella security researchers."

------
mattdeboard
Interestingly this URL gets blocked by my work's security thing. Never saw
that before.

edit: I requested an exemption but corp IT staff came back and said there's
definitely been malware identified on that site. So... be careful with your
clicks.

edit2: Well who knows where the malware alert is coming from, might be an ad
or something.

------
qwerty456127
Sadly there are things you can't always install reliably from PyPI. E.g. you
may have to install things like scipy and keras from the OS or a third-party
(like brew or conda) repository, because pip install would fail during the
build process.

------
TheOtherHobbes
I was trying to set up pipenv on a Mac earlier in the week.

Being able to select P2 or P3 environments is great.

Unfortunately it decided all my packages were in /var/mail.

No patience to debug it, so I gave up on it.

------
agumonkey
(incf confusion)

------
liveoneggs
this incredible complexity is what docker is really good at simplifying

~~~
emptysea
Maybe for deployment, but this problem has been solved well by both Yarn and
Cargo.

~~~
liveoneggs
pretty sure this is about python

------
datavirtue
Who curates the packages to prevent security issues?

~~~
cpburns2009
This is a glaring issue with Python. Does any other language package
repository implement any security? (I honestly don't know the state with other
languages.)

