
DepHell – Project Management for Python - BerislavLopac
https://dephell.org
======
sonofgod
My thoughts on the website:

I don't expect to have to click through to Github to get a summary of what
your project's about. Consider putting some of the text from your readme onto
the main project page; it's good copy and I think it wants to be front and
centre.

I went to the docs first to try to find this, but bounced off the terse three
word descriptions and lists of verbs.

Good luck with the project!

~~~
fastball
Disagree. The GitHub page and the docs do a great job of explaining the
project. No need to maintain more copy than necessary. The homepage is a
jumping-off point. I wish more projects did it like this, because it means
there is less potential for stuff to become outdated.

~~~
IanCal
The copy is on the website though, it's just behind the docs link. I get that
it's a jumping off point, but I'd like even just a sentence explaining what
I'm jumping off _to_.

------
kstrauser
Reposting this from the "is Pipenv dead?" thread:

We love DepHell. I migrated some work repos from Pipenv (and plain ol' pip) to
Poetry. However, we didn't want to have a flag day where we updated our build
tooling to be 100% Poetry, so I made a Makefile target that builds
requirements.txt and setup.py from pyproject.toml. Now developers can work
with pleasant tooling, but the build system can use the old stuff it already
knows.

We're close to having everything migrated to Poetry. When that day comes, we
can throw out all the compatibility stuff, update the build server, and be
happy. Until that day, DepHell gives us an easy compatibility layer so that we
don't have to do the migration all at once. It's awesome.

~~~
BerislavLopac
I'm curious -- are you using Poetry for applications, besides just the
libraries? If so, what is your workflow? I'm currently setting up some
workflows with Poetry so I'm keen to learn of others' experiences.

~~~
kstrauser
We are. pyproject.toml is now our official record of dependencies, command-
line scripts, etc. for all recently-touched repos. We've updated most build
tooling to use `poetry install` instead of `pip install -r requirements.txt`.
In places where we haven't yet, we have Makefile targets like:

    
    
      # Generate setup.py and requirements.txt until we can get off that treadmill.
      # This *must* be done every time you update dependencies, as pyproject.toml is
      # now the official source of project configuration and packages.
      python_oldfiles:
              dephell deps convert; dephell deps convert --env requirements
    

and pyproject.toml blocks:

    
    
      [tool.dephell.main]
      from = {format = "poetry", path = "pyproject.toml"}
      to = {format = "setuppy", path = "setup.py"}
      
      [tool.dephell.requirements]
      from = {format = "poetry", path = "pyproject.toml"}
      to = {format = "pip", path = "requirements.txt"}
    

Now when a dev does `poetry add foo`, they can run `make python_oldfiles` to
autogenerate updated "compatibility" files.

------
armitron
The disaster zone that's Python packaging aside, this has to be called out
since it's a gaping security hole:

    curl -L dephell.org/install | python3

    * Trying 185.154.12.127:80...
    * TCP_NODELAY set
    * Connected to dephell.org (185.154.12.127) port 80 (#0)
    > GET / HTTP/1.1
    > Host: dephell.org
    > User-Agent: curl/7.65.1
    > Accept: */*
    > Referer:
    >
    * Mark bundle as not supporting multiuse
    < HTTP/1.1 301 Moved Permanently

~~~
y4mi
yeah, that -L is the absolute cherry on top

you're begging to get owned at that point

------
milin
Now all we need is another project management tool to manage DepHell.

------
gnulinux
> curl -L dephell.org/install | python3

JFC, I twitch every time I see this. "Download my script from my website and
pipe it to python/perl/ruby". I want to "rm -rf /" people's computers just to
stop this trend.

Please never do this! Always read the scripts you're going to run on your
computer.
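
One cheap safeguard alongside actually reading the script: verify the
download against a checksum published somewhere other than the download host
before running it. A minimal sketch (the installer bytes here are a
stand-in, not DepHell's actual script):

```python
import hashlib

def sha256_ok(payload: bytes, expected_hex: str) -> bool:
    """True only if the downloaded bytes match the published checksum."""
    return hashlib.sha256(payload).hexdigest() == expected_hex

installer = b"print('pretend installer')\n"
# In practice this hex string would come from docs or release notes,
# fetched over a separate, trusted channel:
published = hashlib.sha256(installer).hexdigest()

print(sha256_ok(installer, published))         # untampered download
print(sha256_ok(installer + b"#", published))  # a single changed byte fails
```

This doesn't tell you the author's script is benign, but it does defeat the
plain-HTTP tampering scenario shown above.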

~~~
lainproliant
It always icks me out when people recommend this. Like with `get-pip.py`...
It's fine to download that and run it to install pip, but for the love of GNU
please at least take a cursory glance at what you're running!

~~~
glofish
When you do a

    
    
      pip install this-or-that 
    

do you actually first download the source and look at what takes place?
Probably not.

In that sense curling is better, since you can see the code beforehand: just
pipe it into a pager.

~~~
theamk
If "pip install" tries to install a system-wide package, it will break so many
things that the author will get many bug reports. Unless this is an abandoned
project, this will get fixed. And "pip install" does not need root access in
most cases.

If a "curl | sh" script does this, it is treated as normal; those scripts can
do anything. And root access is required in a lot of cases.

~~~
weberc2
Root access is orthogonal to the particular incantation you use to invoke the
Python file you downloaded from the Internet.

~~~
theamk
In theory, yes.

In practice, no. It is customary for "pip install" to not require root, and
for "curl | sh" to require it.

~~~
Godel_unicode
That's simply not true unless they're actually doing different things.

~~~
theamk
They are doing different things, and that is exactly the point! Here is a
random example:

"meteor.js" was the first relevant hit for a curl|sh query on Google. It is a
JavaScript app platform. You are supposed to use "curl | sh" to install it (
[https://install.meteor.com/](https://install.meteor.com/) ). This file:

- Hardcodes install location to ~/.meteor (and removes previous location of
it).

- Uses "sudo" to write to /usr/local/bin/meteor

Compare it with "scipy", which can be called a scientific app platform. It
tells you to install via pip to user dir (
[https://www.scipy.org/install.html](https://www.scipy.org/install.html) ). It
installs itself only to this dir, and nowhere else (I know because I use it
at work a lot, and we do pre-packaged virtualenvs here). It also uses
standard mechanisms -- if you want many versions side by side, it is
trivial.

Can you write a |sh script so it minds its own business and only writes to a
single directory? Yes. Do people do this? Not very often.
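
As a sketch of what "minds its own business" could look like, with
hypothetical names and paths, everything lands under one prefix, no sudo, no
dotfile edits:

```python
import os
import stat
import tempfile

def install(prefix: str) -> str:
    """Install a fake tool entirely under `prefix`; touch nothing else."""
    bindir = os.path.join(prefix, "bin")
    os.makedirs(bindir, exist_ok=True)
    launcher = os.path.join(bindir, "mytool")
    with open(launcher, "w") as f:
        f.write("#!/bin/sh\necho mytool 1.0\n")
    os.chmod(launcher, os.stat(launcher).st_mode | stat.S_IXUSR)
    # Deliberately no writes to /usr/local and no edits to ~/.bashrc: the
    # user decides whether and how to put prefix/bin on PATH.
    return launcher

launcher = install(os.path.join(tempfile.mkdtemp(), "mytool-home"))
print(os.path.isfile(launcher))  # True
```

Uninstalling is then a single directory removal, and multiple versions can
coexist under different prefixes.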

-----

When I wrote this, I thought: maybe I am biased against the curl|sh method
because I don't know of a well-behaved one? So I went to the HN front page,
looked at the last 210 entries, chose every one that looked like software
installable on a PC, and tried to evaluate the system impact:

- Dephell: curl | python, no side effects other than forced install location.

- neo.mjs: installed via node, presumably no side effects outside of
node/project dir

- huginn: manual steps, all manual -- or docker. No surprises either way.

- ponylang: docker or PPA. No surprises.

- Poetry: installed via "curl|python", modifies my .bashrc to set PATH (to be
fair, it told me about this afterwards...)

- Uni the unicode database: use "go get", no side effects

- Qt 5.1 -- commercial, "installer app".

- Virtualbox 6.1 -- installed via .deb file (system-wide but expected)

- event-driven-shell -- run from repo, optional "make install"

We've had 2 "curl |" apps. One of them was modifying my ~/.bashrc.

We've had 4 "traditional" apps, which used the "checkout repo and run
command" method. None of them wrote anything outside their checkout dir. Some
of them explicitly recommended that users change their .bashrc themselves.

We've also had some docker apps and .deb-installed apps. Of them, Virtualbox
and Qt could write all over the place -- but they are much more "heavy
weight" compared to the other ones.

----

This is not a very big sample, but I think it is pretty representative. Once
people start piping things into shell, it seems they cannot help but install
stuff all over the place. It is just like "make install" was -- except this
time it is not optional.

------
dang
Related current threads:

[https://news.ycombinator.com/item?id=21779191](https://news.ycombinator.com/item?id=21779191)

[https://news.ycombinator.com/item?id=21781421](https://news.ycombinator.com/item?id=21781421)

------
PaulHoule
This: "So, if you can’t install by the recommend way and have some conflicts
with your global modules, install without it"

Maybe that's why it is called DepHell instead of DepHeaven.

If you don't have access to clean Python interpreters that don't have "global
modules", including in the user's own "site" directory, eventually you will be
praying that your builds work.

For a system like "dephell" to have a chance of solving the problem that it
tries to solve, it has to have a "place to stand from where to move the Earth"
-- without it, it's just a faster way to trash your Python install and have
to reinstall it.

------
ablekh
Could someone briefly compare (pros/cons) this project with Poetry?

------
GuyOnMySpace
Python packaging is definitely in need of something, but my sense is that "yet
another tool trying to be a catch-all package/environment management
interface" is not it.

