
Musings about Debian and Python - plessthanpt05
http://notes.pault.ag/debian-python/
======
fingerprinter
I didn't realize there was a fight going on.

It seems like this is where everyone settled, and it is a good solution:

Virtualenv + pip for development

System-wide (apt, dpkg) Python, etc., for system tools and packages.

This makes sense to me. Keep your personal code sandboxed from your system
tools.
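That split can be done entirely with the standard library; a minimal sketch using the stdlib `venv` module (Python 3.4+ for `with_pip`), where the package name in the usage note is just an example:

```python
# Sketch: create a throwaway virtualenv with the stdlib `venv` module
# and pip-install a project's dependencies into it, leaving the system
# (apt-managed) Python untouched.
import subprocess
import sys
import venv
from pathlib import Path

def make_project_env(env_dir, requirements=(), with_pip=True):
    """Create an isolated environment and install into it only."""
    venv.create(env_dir, with_pip=with_pip)
    # The env's own interpreter lives in bin/ (Scripts/ on Windows)
    bindir = "Scripts" if sys.platform == "win32" else "bin"
    python = Path(env_dir) / bindir / "python"
    for req in requirements:
        subprocess.check_call([str(python), "-m", "pip", "install", req])
    return python

# Usage (hypothetical package name):
# python = make_project_env("env", ["requests"])
```

Anything installed this way stays out of `/usr/lib/python*`, so apt and pip never fight over the same files.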

As an aside, I know there are quite a few tools and programs in both Debian
and Ubuntu written in Python. I wonder if we'll look back in 5-10 years and
realize/think this was a mistake. Particularly for UI code. I love me some
Python, but I cringe every time I have to interact with a Python GUI
application.

~~~
nandhp
> I cringe every time I have to interact with a Python GUI application.

How are Python GUI applications worse than C GUI applications?

~~~
fingerprinter
Performance

~~~
smnrchrds
I never noticed any performance problem with Python-based GUI programs. Well,
calibre's start-up time is a little too long, but once it's running I don't
notice any slowness. If you have a special program in mind, would you explain
a bit about it?

~~~
fingerprinter
I use Ubuntu, so I don't know if these are in Debian as well, but Jockey and
Ubuntu Software Center come to mind.

My main issue with them is the common symptom where both "hang" or "freeze"
for a few seconds. Visually this can look like the window gets grayed out for
a bit (common in Jockey), or it is just stuck for a few seconds with no
clear indication anything is happening (more common in Software Center).

EDIT: I should note that I've previously written a few PyGTK apps and have
recently started writing Ubuntu Touch apps via their Ubuntu SDK (Qt/QML). I've
found that Qt/QML/Ubuntu SDK is a much better experience in both writing the
app and the responsiveness of the resulting application. And I'm by no means
proficient in QML, I'm just starting out.
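For what it's worth, the grey-out described above is the classic symptom of long-running work blocking the toolkit's main loop rather than anything Python-specific. A toolkit-agnostic sketch of the usual fix (a worker thread plus a polled queue; in PyGTK the polling would live in a GLib timeout callback instead of the loop below):

```python
# Sketch: do slow work on a worker thread and let the main loop poll a
# queue, so the window keeps repainting instead of greying out.
import queue
import threading
import time

def slow_task(results):
    time.sleep(0.2)            # stand-in for a package-list refresh, etc.
    results.put("done")

results = queue.Queue()
threading.Thread(target=slow_task, args=(results,), daemon=True).start()

while True:
    try:
        status = results.get(timeout=0.05)  # in a GUI: get_nowait() per tick
        break
    except queue.Empty:
        pass                    # the main loop stays free to redraw here
```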

------
Steltek
I have a slight tangent to this topic: Why does every language need their own
package manager? I can understand avoiding a specific distro's PM but do we
need to reinvent the wheel for every single language? Do we need to poorly
reinvent Make just so the build syntax matches the source?

It seems like a better solution would be to declare a common interface, both
to users and to tools, for each language's package system. At least that way
there's some guarantee (or at least a strong hint to the author!) that any
given tool would support some nice-to-have extra functionality.

A use case for me is that I see a really neat thing Foo written in an
unfamiliar language. Install instructions go something like: "Just run
'simple_make foo' to try it out!" or perhaps "Just run 'curl [http://get-foo-
bar.baz/rootme](http://get-foo-bar.baz/rootme) | sh -'". Okay, now perhaps
it's installed. It might even be installed where I want but it probably won't
be. But what happens if I need to update it? How can I keep all of these pet
projects up to date? I have to go and learn easy_install, gem, leiningen, sbt,
maven, npm, go, quicklisp, cabal, etc.

I mean, I'm not too lazy to read up on each tool, but it adds up quickly and
not every tool supports an Autoconf-like --prefix or apt-get {update,upgrade}.
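As a thought experiment, the "common interface" asked for here could be little more than a verb-translation table in front of each native tool. A purely hypothetical sketch (the command mappings are illustrative, not an authoritative survey of each manager's flags):

```python
# Hypothetical shim: map one uniform verb set onto each language
# package manager's own commands. Entries are examples only.
COMMANDS = {
    "pip":   {"install": ["pip", "install"], "upgrade": ["pip", "install", "--upgrade"]},
    "gem":   {"install": ["gem", "install"], "upgrade": ["gem", "update"]},
    "npm":   {"install": ["npm", "install", "-g"], "upgrade": ["npm", "update", "-g"]},
    "cabal": {"install": ["cabal", "install"], "upgrade": ["cabal", "install"]},
}

def shim(manager, verb, package):
    """Translate a uniform (verb, package) request into a native command line."""
    try:
        return COMMANDS[manager][verb] + [package]
    except KeyError:
        raise ValueError(f"no mapping for {manager!r}/{verb!r}")
```

A real version would also need a common story for prefixes and listing outdated packages, which is exactly where the native tools diverge most.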

~~~
est
Because everyone hates centralized package-info managers; see, for instance,
the Windows Registry.

------
pedrocr
Where I like to draw the line between apt and rubygems/pip is between "things
that are base system" and "things that are app deployment". So in a typical
rails stack that would be apt installs the system-wide stack (apache, ruby,
passenger, rubygems) and then capistrano is used to deploy my webapps with
their gems from my dev machine to their app-specific directory in the
production server.

This way I can have my sysadmin hat on while setting up the server and depend
on debian/ubuntu to handle security upgrades and generally create a consistent
system. Then I can put my devops hat on and use capistrano and bundler to
manage the security/dependencies of my own code.

But I see where this breaks down. If the base stack is moving at a much faster
pace than the distributions (e.g., right now the version of passenger in
Ubuntu LTS is incredibly old) it's attractive to just ignore the system
packages and install everything from original sources (e.g., install ruby from
source with rbenv). But doing that is just throwing away the integration work
the distribution has done. I'd much rather include some extra repositories to
get updated versions that integrate through apt for the few things that I care
to upgrade faster than Ubuntu LTS allows me. Right now that's puppet,
passenger and a few more.

~~~
rlpb
I agree with you. The pain occurs when this breaks down. Take Vagrant, for
example. Brilliant tool. Depends on gems that weren't distro packaged for
years.

This made Vagrant unusable for anyone not deploying a ruby-based application
stack, unless sysadmins accepted that they were going to have to deal with the
extra work associated with multiple vendors and update mechanisms for each
language on the system. So those of us using other stacks could use Vagrant,
but only if we adopted Ruby's app-level packaging system. See
[https://news.ycombinator.com/item?id=2059964](https://news.ycombinator.com/item?id=2059964)
as an example of life being made difficult for distro packagers by the Ruby
community. Rubyists tend to insist that distro packages are bad, so this is
how it tends to stay.

Thankfully the packaging of Vagrant and dependencies has now happened
(finally; after years), and I can "apt-get install vagrant" now. But I hope
this illustrates the problem.

In the common use case, there is a dichotomy between distro and app-level
dependencies, packaging and development. But in the grand scheme of things we
all want to share our work across communities, and this dichotomy is a barrier
to this ideal. Dependencies don't just go in one direction with language
module repositories like pypi and rubygems at the top. Sometimes we want parts
of the system to depend on them, too.

~~~
pedrocr
I think the issue is one of development speed. Currently things like passenger
and puppet are behind on Ubuntu LTS packaging because their development cycle
is much faster than LTS releases. Apache on the other hand is well past that
point, while nginx is probably somewhere in between. Right now I install
passenger through gems, but I'll be moving that to the deb repository the
Phusion guys have just released, and eventually I expect development to quiet
down enough that Ubuntu's packaging will keep up. So basically I expect
software packaging to evolve in three steps:

1. When in fast development, get it from gems (my case with passenger now)

2. When it starts getting mature (or as soon as you have the resources to do
enough QA), build a deb repository (puppetlabs does this for puppet)

3. When it becomes stable and boring, just use what's in the system (apache
and thousands of other things)

Moving from step 1 to 2 is probably not natural for a lot of ruby (and python
to a lesser extent) developers because they see their software in the context
of apps that are "things you deploy to a host" not "things that are part of
the base system".

------
ekr
I don't want to start any argument, but there is a reason why Stallman
recommends writing GNU packages in C (so that users don't need to acquire and
install all kinds of dependencies, among other reasons). I certainly think again
about installing something when I see that the software is written in Java or
Python, mostly because I don't like to bloat my system.

~~~
pekk
What actual concrete problem do you have with 'bloat' due to languages such as
perl?

------
don_draper
"I prefer virtualenv based setups for development"

I'm a _casual_ python user and I find virtualenv to be very helpful.

------
zipfle
The design of this blog is so good that I almost didn't notice how good it is.

~~~
paultag
(author here)

D'aw, thanks so much. I really appreciate that. I identify as an OS-level
hacker, very focused on _code_, not _design_, so it means a lot to hear such
kind words when I put a bit of effort into it.

Thank you!

~~~
pyre
It was really confusing trying to see how you did the large letters at the
start of the paragraphs, because Firefox's inspector didn't show the CSS rules
for ::first-letter since the DOM element selected was the <p>.

------
TheSwordsman
I think I agree with the blog post in its entirety, as a fellow Pythonista /
Debianite. The Debian packages that are released always seem to be top-notch,
stable, and fit for production. (Let's forget the OpenSSL incident, shall we?)
I don't fear that the new version is going to break my currently installed
software, and I'm not worried about dependencies somehow having been missed
causing havoc. It just works.

(And when it doesn't, it's because I did something deep within dpkg/apt I
should not have.)

I can't say I've always had the same experience with other distributions,
which is the reason I moved to Debian in the first place. Truthfully, I can't
rattle off any specific scenarios from the top of my head.

Any time I've had an issue with a Debian software package, the bug threads
have always been constructive with proponents for both sides explaining why it
should be one way over the other. Eventually, the best decision is made (even
if I personally disagree).

However, the sacrifice for this stability is that packages can become a bit
'stale' when it comes to new versions. I don't mind sacrifices like this.
And if you need newer stuff, that's why backports exist.

If you want the latest and greatest, use pip. But for the love of all things,
couple your pip usage with a virtual environment. Hell, even if you aren't
using pip, get in the habit of using a venv.

My only bad experience with virtualenvs was a recent Python security update
(related to /dev/random). The system libraries changed, the Python executable
in the venv did not, and sadness ensued. Even then, once I figured out the
issue, it was a quick fix: re-initializing the virtualenv sorted it, with no
need to move code around.
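That stale-venv failure can at least be detected cheaply: compare the version reported by the venv's (possibly copied) interpreter against the system interpreter's. A sketch; the path in the usage note is illustrative:

```python
# Sketch: flag a virtualenv whose copied interpreter has fallen behind
# the system Python (e.g., after a security update), which is the
# failure mode described above.
import subprocess
import sys

def interpreter_version(python):
    """Ask an interpreter binary which version it actually is."""
    out = subprocess.check_output(
        [str(python), "-c", "import sys; print(sys.version.split()[0])"],
        text=True)
    return out.strip()

def venv_is_stale(venv_python, system_python=sys.executable):
    return interpreter_version(venv_python) != interpreter_version(system_python)

# Usage (illustrative path):
# if venv_is_stale("env/bin/python"):
#     ...  # re-create, e.g. with: python -m venv --clear env
```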

In short, if I want a stable system I don't need to babysit, I go with Debian.
I trust the people maintaining the packages. I use virtualenvs for 90% of my
Python development, and use pip inside of those venvs.

Hasn't failed me, yet. Probably shouldn't jinx myself...

------
hcarvalhoalves
In my opinion package management is a non-solution. It fails to solve several
obvious problems (all your software is tied to release cycles and to shared
libraries, and you can't have multiple versions of the same package
installed), plus it operates under assumptions that don't even make sense
anymore (we're no longer distributing 3rd-party software bundled on floppy
disks, which is what made tying dependencies together necessary).

Installing applications as containerized packages (e.g., GoboLinux, OS X .app
bundles) should be the norm. Both apt and pip get it wrong because neither
solves any of the above problems by itself (though you can use pip+virtualenv
to approximate containerization, and more recently, LXC).

~~~
dec0dedab0de
So your point is that there should be no shared dependencies, or centralized
collections of trusted software?

~~~
hcarvalhoalves
> So your point is that there should be no shared dependencies

You can have both, where it makes sense, instead of everything being shared by
default (e.g., you can provide all your software a base OpenSSL version). See
how FreeBSD solves that with jails and sharedfs.

> or centralized collections of trusted software?

All you need for that is app signing (e.g., App Store). You can still update
apps independently, there's no dependency management.

------
emillon
The background of this post is a thread on the debian-python mailing list
about a new PEP:

[http://lists.debian.org/debian-
python/2013/09/msg00049.html](http://lists.debian.org/debian-
python/2013/09/msg00049.html)

------
st3fan
Or do what I did: give up on Python packaging and use Go or a JVM-based
language that produces single-artifact deployables.

I have been in this Python mess far too long. Now I just make a .war file or
tell Go to compile for my target platform. Both result in single files that
can be deployed without post-processing, installing/compiling dependencies or
whatever ... it simplifies things for everyone. Developers, CI, packagers,
ops, security folks.

~~~
the_mitsuhiko
> or tell Go to compile for my target platform

JFTR: exactly the same is possible in Python.

~~~
mattip
what is the py2exe equivalent for linux/OS X?

~~~
oinksoft
py2app for the Mac. It appears there is a py2deb and py2rpm for Linux, and I
imagine users of Arch, Slack, etc. would be expected to download a .tgz and
`setup.py install` it.

~~~
st3fan
Yeah py2deb or py2rpm are great. Now do the same for all your dependencies.
And their dependencies. And their dependencies. And their ...

Single artifact does not mean 'your code'. It means 'the whole damn project
including all dependencies recursively'.

Think myapp-1.0.war that contains all dependent JAR files. Drop it in a
container and it runs. Or your Go compiled tool that is one single static
executable that has no reference to external code or libraries. Run it and it
works.
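Python did later grow a stdlib analogue of the .war workflow for pure-Python code: `zipapp` (added in Python 3.5). A sketch of building one runnable artifact; C-extension dependencies still need the heavier routes discussed above:

```python
# Sketch: bundle an app (plus any pure-Python deps vendored into the
# build dir) into one runnable .pyz file with the stdlib `zipapp`
# module -- roughly the .war of the Python world.
import pathlib
import subprocess
import sys
import tempfile
import zipapp

build = pathlib.Path(tempfile.mkdtemp())
(build / "__main__.py").write_text('print("hello from a single artifact")\n')
# A real project would vendor its dependencies into `build` first, e.g.:
# subprocess.check_call([sys.executable, "-m", "pip", "install",
#                        "--target", str(build), "somepackage"])

target = build.with_suffix(".pyz")
zipapp.create_archive(build, target, interpreter="/usr/bin/env python3")

# The resulting file runs as-is under any compatible interpreter.
out = subprocess.check_output([sys.executable, str(target)], text=True)
```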

I know how I can package my own Python code. I am not too worried about that.
I am however really tired of packaging the whole thing every time. And then
dealing with stupid things like python packages that need a C compiler to
build. Or specific patches to dependent C libraries. Or specific versions of
things that some distro might not have. Or dependencies for which I need a
newer version than the OS provides. Or .. sigh .. there are so many stupid
gotchas that are complete time wasters to deal with.

~~~
Goladus
The main thing you need to understand is that "source code" vs. "single
deployable artifact" are impossible to evaluate without considering platform
requirements.

Source code is highly portable across many architectures and operating
systems, but needs compilation, configuration, and dependencies. Binaries
work out of the box, but only on the single platform they were built for.

The downside is that if you want any feature not provided by the JVM's
abstraction layer, at the very least you are in the exact same boat as a
python developer.

------
lmm
Debian set a bad precedent with their tightly-integrated approach to packaging
perl. Users and developers (not unreasonably) expected python to work the same
way, with pypi/apt interoperating just like cpan/apt.

------
zdw
More background on Python's packaging methods and history:
[http://aosabook.org/en/packaging.html](http://aosabook.org/en/packaging.html)

I'd also add that native packages tend to be far more repeatable when you have
to deploy and update a lot of systems in any environment.

Note that with fpm or other similar quick-and-dirty package-generation tools,
you generally don't have to stick to the upstream system's development
guidelines, but you still get the benefits of using the native package tools.

------
focusaurus
My post on this topic, which is in agreement with this post.
[http://peterlyons.com/problog/2012/09/managing-per-
project-i...](http://peterlyons.com/problog/2012/09/managing-per-project-
interpreters-and-the-path)

------
jamtan
I like to get the best of both pip and the distro's native package manager ...

[https://github.com/jordansissel/fpm](https://github.com/jordansissel/fpm)

------
auvrw
well-written (and -typeset) article, although i think there is probably less
controversy around this than the tone of the article suggests?

anyway, side note: switched from nm-applet to wicd-client and haven't looked
back.

~~~
paultag
(author here)

Thanks! Pairing the fonts took a little while, and fiddling with it took a bit
of time - so thank you! :)

Yeah. The goal here isn't to stir the pot in the middle of a massive flame-
war, but it's the sort of thing where I see a _part_ of one of these
viewpoints now and again.

The hope here is that this can be something that will be given out when
someone expresses a viewpoint that starts to stir up this same-old flame
again, as well as make people think a bit more seriously about issues
regarding integration between language (and native) package managers (on both
sides).

------
Siecje
virtualenv and pip for your own projects, and the system package manager to
satisfy the requirements of installed programs.

------
jrochkind1
interesting that debian and python have come to the same impasse as debian
and ruby.

