

RubyGem is from Mars, AptGet is from Venus - pelle
http://stakeventures.com/articles/2008/12/04/rubygem-is-from-mars-aptget-is-from-venus

======
e1ven
Coincidentally, I'm sitting here staring at a terminal for the last few hours
because of RubyGems.

As a systems administrator, I'm absolutely not a fan of Ruby gems for exactly
the reason the article mentions-

They're designed to be convenient for the developer, but they make an end-run
around all of the methods that we admins set up to make things easy to deploy,
and safe to update.

For instance- At Darkenedsky, we have a fleet of Ubuntu 6.06 webservers that
are configured to run Apache, Postgresql, etc.

One of our coders has recently made a change which means that I need to push a
library out to all 50+ machines..

Normally, this is a pretty trivial affair.

    
    
        for i in `cat hostlist`; do ssh $i apt-get install library; done
    

Unfortunately, this particular library is a Ruby gem, so things aren't
nearly as simple ;(

The first step is that I need to ssh into each machine, and manually install
Rubygems. Rubygems aren't packaged for Ubuntu 6.06, so I need to do a source
install and configure.

Once it's installed, I need to update gems to have the most recent package
definitions, and then I can start to install the package I actually want.

So far, it's annoying in that there are several extra steps, but they're
scriptable, so it's liveable.
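Those bootstrap steps could be scripted along these lines; a rough sketch,
where the tarball URL and version are placeholders, not real release details:

```shell
# Sketch: scripting the RubyGems source install across the fleet.
# The tarball URL and version below are placeholders.
for i in $(cat hostlist); do
  ssh "$i" 'cd /tmp &&
    wget -q http://files.internal/rubygems-x.y.z.tgz &&
    tar xzf rubygems-x.y.z.tgz &&
    cd rubygems-x.y.z &&
    ruby setup.rb &&
    gem update --system'
done
```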

Unfortunately, installing the packages isn't very scriptable at all. When I
install, I'm presented with a menu-

    
    
        Select which gem to install for your platform (i486-linux)
        1. fastthread 1.0.1 (i386-mswin32)
        2. fastthread 1.0.1 (ruby)
        3. fastthread 1.0.1 (mswin32)
        4. fastthread 1.0 (ruby)
        5. fastthread 1.0 (mswin32)
        6. Skip this gem
        7. Cancel installation
    

If this were a non-gems library, and/or packaged as a normal Ubuntu release, I
wouldn't need to manually choose one of these. I could set my choice in
debconf (write it to a file), and then push it out everywhere.
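With debconf, that answer can be recorded once and pushed everywhere; a
sketch, where the package name and its debconf question are hypothetical:

```shell
# Sketch: preseeding debconf so installs run without interaction.
# "some-library" and its question name are hypothetical examples.
cat > library.preseed <<'EOF'
some-library some-library/variant select ruby
EOF

for i in $(cat hostlist); do
  scp library.preseed "$i":/tmp/library.preseed
  ssh "$i" 'debconf-set-selections /tmp/library.preseed &&
            DEBIAN_FRONTEND=noninteractive apt-get -y install some-library'
done
```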

If we're dealing with only one machine, like most developers do, this isn't a
big deal. It downloads and installs the gem, and you're good to go. But when
you're dealing with a fleet of machines, anything that requires manual
interaction is a sin against man and god.

Debconf allows me to choose whatever level of interaction I want-

    
    
        "Ask me all details, even the trivial ones"
        "Only ask me the important stuff"
        "Use the choices provided in this file"
    

Why should I need to choose between 1.0.1 and 1.0 after I start the install?

On Ubuntu, if the packages were really so important that both needed to be
available, I could choose between

    
    
        apt-get install ruby-fastthread-1.0
    

or

    
    
        apt-get install ruby-fastthread-1.0.1
    

Again, we're down to one command. Nice and simple, no interaction needed.

Further compounding the pain is that this isn't a choice I should even have to
make- Why would I want the win32 version of a library on my Linux box?

If I'm installing a packaged version, it's already been tested in this
configuration, and it's designed to work with everything else on the system.

Too many servers give me an error on install-

    
    
        Building native extensions.  This could take a while...
        ERROR:  While executing gem ...      
        (Gem::Installer::ExtensionBuildError)
        ERROR: Failed to build gem native extension.
    

While that's certainly fixable, it means that I need to babysit the install
far more than I do for something that's properly packaged.

After I choose what I want, I get to wait while the package downloads across
the public internet.

When dealing with Ubuntu, all our servers are pointed at an in-house
repository for packages- We keep a copy of everything we need on-site, and I
can update it with one command.

This means that when installing software, it only has to travel from a server
a few racks over, not out to the internet at large. With gems, this system is
broken- Each one reaches out to the public internet.
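Pointing a host at the in-house mirror is a one-line sources.list change;
roughly (the mirror hostname here is a placeholder):

```shell
# Sketch: point apt at an in-house mirror rather than the public internet.
# "mirror.internal" is a placeholder for the local repository host.
echo 'deb http://mirror.internal/ubuntu dapper main universe' \
  > /etc/apt/sources.list
apt-get update   # package lists now come from a few racks over
```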

Not only is this _much_ slower, but I can't trust that a package is 100%
identical.

Having one machine be different, even by a point release, can be a huge
problem in trying to debug intermittent errors.

Finally, upgrading the packages is far more difficult than it has to be.

If a bug is discovered in a library, such as the zlib bug that came out a few
years ago, I can update it trivially and easily across all the machines.

Ubuntu has built-in commands for applying security updates, and it separates
those from feature updates.. That way, I don't need to move to the next
version of a package, possibly breaking the code, just to resolve a security
bug.
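One way (of several) to apply only security fixes is to run the upgrade
against a sources list that contains nothing but the -security pocket; a
sketch:

```shell
# Sketch: apply security fixes only, by upgrading against a sources
# list that holds just the -security pocket.
cat > /etc/apt/security-only.list <<'EOF'
deb http://security.ubuntu.com/ubuntu dapper-security main universe
EOF

apt-get -o Dir::Etc::SourceList=/etc/apt/security-only.list update
apt-get -o Dir::Etc::SourceList=/etc/apt/security-only.list -y upgrade
```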

Now, that's not to say that issues with Ruby gems couldn't be fixed.. I'm sure
that one by one, they could be addressed- Scripting support could be beefed
up, local repositories could be added. Upgrading could be improved, and
configuration could be rewritten..

But even if they were to fix EVERYTHING on the list, it would still be
layering one packaging system atop another, which means that we need to have
two ways to do everything.

Two sets of config files. Two repositories of packages. Two places for bugs.

As the author suggests, the problem comes in that OS packages and Ruby gems
are focused on different audiences-

Developers love ruby gems because for the one machine they use to write the
code, it's easy, fast, and convenient.

Admins hate gems because when you're trying to deal with a fleet of machines,
it's none of the above.

~~~
Locke
Wow. I'm not going to claim that rubygems is the best package management
system, or that it's better than debian's packaging system. But, some of your
frustration comes from the fact that you _know_ apt very well and you don't
_seem_ to know rubygems very well.

For example, you can set up your own local gem server. You can load it with
only the gems you want to install on your fleet of machines and use 'gem
install --source _your-local-server_ fastthread', for example. You might even
get around your "pick the right gem" problem by installing only the fastthread
gem you need on your local server.
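A sketch of that setup, assuming a placeholder hostname for the in-house gem
host:

```shell
# Sketch: an in-house gem repository served with the stock gem server.
# "gems.internal" is a placeholder hostname.

# On the repository host, after installing only the approved gems:
gem server --port=8808 &

# On each machine in the fleet, install from the local source only:
gem install --source http://gems.internal:8808 fastthread
```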

But, I think the real problem you have with rubygems is that _you don't want
to learn another package management system_. I really don't think the rubygems
devs can do anything to remedy _that problem_.

~~~
e1ven
Great point, Locke, thanks for replying.

I'll certainly admit that I don't know the gems packaging system as well as I
know dpkg and aptitude, but I think that criticism misses the larger issue.

You're certainly right that the gems system could be made to do everything
that the Debian/Ubuntu system does, particularly if given time to mature. But
as I pointed out in my post, this wouldn't fix the problem.

Apache, for example, has hundreds of modules, each of which has complex
dependencies.. They're released on their own schedule, and they're released
across lots of platforms. Firefox has its own set of extensions, and updates-
You see the same thing with CPAN, and Python modules.

But the more disparate systems you add, the more sets of configurations you
need, the more repositories you need, and the more ways to update everything
you need.

So for our servers, I could set up a Gems repo, an Apache repo, a System
Software repo, a CPAN repo. I could have separate upgrade commands for every
software package that comes along.

But the complexity is increased with each addition to this system-

* If I want to get a list of everything that's installed under this system, I need to run X commands.

* During installations, I can't chain everything together; I need to run X separate installations.

* I need to review X different release chains, to see if there are packages I need to test before deploying to multiple machine

* And yes, I need to learn X different systems for installing software, rather than one system for installing system wide.

There are a lot of good reasons to have the gems system, and I think it's a
great tool for individual systems, but for anything that's going farm-wide,
it's _substantially_ easier to have everything go through one tool.

~~~
Locke
I don't know that I particularly disagree with you. The one advantage that a
language scoped packaging system has over a system-wide system is developer
adoption.

Debian, Gentoo, Redhat, FreeBSD, etc all depend heavily on volunteers to
create the packages, manage dependencies, issue security announcements, feed
bug reports and patches upstream to developers, etc. I've used Gentoo for a
long time and I've watched it struggle in recent years as it loses volunteers.
New releases take longer to reach portage, some software never finds its way
to portage at all, and I find more and more broken packages. The only Ruby
package I install from portage is Ruby itself.

The thing is, _every_ Ruby developer uses RubyGems. There are no volunteers
because the developers themselves create the packages for each release. I
_think_ this is more sustainable, even if I _don't_ think RubyGems is as good
as the packaging tools used by the major distributions.

~~~
e1ven
That's a really great point, and one I hadn't considered.

Of course, in a fantasy ideal world, the way things would work is that the
Ruby developers would create their own repository for .debs..

This would mean that if I want the latest and greatest files, I can add their
repository to my existing package system, and everything integrates.

When I did an "apt-get upgrade" it would check the Ruby repository, and
upgrade all my ruby packages, too..

Then, when a package had proven itself to be stable enough, it could be copied
back into the standard Ubuntu/Debian repositories, so that everyone could have
it, even if they didn't add the repositories.

Of course, in the real world, that would be far too much work for the Ruby
developers, and would require that they customize for one specific OS.. But a
man can dream, can't he? ;)

------
old-gregg
One of my Rails projects runs _only_ on aptitude-provided gems and it has been
a breeze. Yes, I don't get the latest and "hottest" versions of popular gems,
but everything _works_ and is very reliable.

With another project I installed gems and Ruby from source and have been
handling dependencies with RubyGems. Not nearly as smooth: maintaining this
mess without Google would have been close to impossible. The internet is full
of blog posts and articles about "installing X on Leopard or upgrading Y on
Ubuntu without apt-get". And guess what: those posts make absolutely no sense
in an aptitude-controlled environment, where everything just works: the
entire system can be re-created with a single apt-get command and a copy of
your /etc files.

And the author's position of _fuck you all_ , or as he puts it: _"I'm all
about application, systems management is not my job"_ is precisely why Windows
is such a piece of shit: system management is nobody's job there, everyone is
all about "application". This is why after you install a few creations of
these "application guys" the whole thing becomes an unusable mess and you
start paying attention to "I'm a Mac" commercials.

~~~
pelle
Hmm, I don't remember writing "fuck you all" in my post. I'm just talking
about there being different domains in play here. Different domains have
different needs.

System management, thanks in many ways to the great work of Debian but also
to virtualization, has become more or less a commodity. Over the last 30+
years the best tools for the job have become more or less standardized. Which
is great.

The application development space is a lot more dynamic though. You cannot
expect all Unix sysadmins to know all about every programming language's
packages. There is just too much for there to be a single gatekeeper now.

The world is changing and for the better. Deal with it.

~~~
old-gregg
No, it's not about Linux sysadmins but rather about writing robust software.
Aptitude is an application, just like your Rails app, written by very, very
smart programmers. And those developers are moving at exactly the same
"tremendous speed" as you are, and their "core DNA" is not a bit slower than
yours, and the results of their work are as much of a "commodity" as yours.

I suggest you re-read the articles you linked in your post, because they
explain why RubyGems has issues and needs fixing, something you haven't even
addressed.

Your post basically says _"YOUR work is commodity, YOU are moving slower than
us, while MY development is fast and bleeding edge, YOU should get over your
priorities and ME should continue as I please"_. You even went as far as
calling someone out for "spreading FUD" while in reality Gunnar is absolutely
right: RubyGems is dangerous on production systems, and I dare you to run
sudo gem update --system on a server that your paycheck depends on. Meanwhile
sudo aptitude safe-upgrade runs pretty much automatically on thousands of
mission-critical Debian-based deployments. That's not FUD, pelle, that's
called high-quality engineering.

~~~
pelle
Hold your horses, Gregg. Maybe my language skills are not what they should be.
Because that is not what I am attempting to say. I do not in any way mean to
offend anyone. All I am attempting to say is that different people have
different needs.

What I am saying is that there is a difference between the requirements of an
OS and of a Web application. Aptitude is strictly speaking an application,
but it is also system software.

My work depends on the Debian and OpenBSD guys being conservative and creating
solid defaults. I want them to be so, as I don't want to deal with that.

I am fanatical about OpenBSD myself, which is way more conservative and
opinionated than Debian. I have spent years working with servers, being
forced to do low-level work. My main dev machine was Gentoo for many years. I
understand the issues.

Complex systems are based on layers. The Internet for example is based on
layers. The lower the layer the more conservative it should be. The higher
layers are more diverse and allow greater flexibility.

After all, I didn't choose Ubuntu to run several of my servers for the sake
of running Ubuntu. I chose it because I need a stable server OS that I can
deploy my applications on.

For someone who is a core Debian developer that might be different. Maybe he
does choose Debian for the sake of running Debian.

All I want them to do is not break what I and 90% of other Ruby developers
are doing. Also, this is not just Rails. Almost all Ruby developers depend on
Gems.

FUD is when you say something is dangerous without giving further reasons. So
far in this debate, I have seen lots of people say RubyGems is flawed, but
the only argument I have seen is that it doesn't follow the FHS.

I can live without Debian unbreaking their RubyGems package. I am not scared
to install from source. But I still don't understand the reasoning behind it.

If they don't want to support real RubyGems, it would be better for Debian
users to remove the package altogether, as opposed to the trojan-horse
version of RubyGems that it is.

------
rbanffy
With Python, which has the easy_install utility for installing its own
packages, I use the virtualenv mechanism to create a "shadow" environment
connected to the machine's main Python but with its own library/package
folders.

With that, I am free to install anything I want in the secondary Python, I
still have access to distro updates for libraries that I chose not to
override in my environment, and I can rest easy knowing I am not hosing my
main Python install.

In the unlikely event there is nothing similar to it for Ruby, it would be
trivial to develop it.
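One existing approximation in Ruby uses the GEM_HOME and GEM_PATH environment
variables to keep a project's gems out of the system tree; a sketch:

```shell
# Sketch: isolating a project's gems from the system install,
# roughly analogous to virtualenv.
export GEM_HOME="$HOME/projectgems"   # where new gems get installed
export GEM_PATH="$GEM_HOME"           # where gems are looked up
export PATH="$GEM_HOME/bin:$PATH"

gem install fastthread   # lands under ~/projectgems, not the system tree
```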

------
gleb
I don't believe Debian ever managed to deal well with 3rd-party package
management tools. E.g. XEmacs has its own package management and apt-get
equivalent, and it never worked right in Debian. It's a fundamentally hard
problem to support different 3rd-party release cycles while fulfilling the
traditional Debian goals of stability and correctness.

Having said that, the situation is particularly poor in Debian ruby gem land.
There are 2 kinds of people -- the ones that understand the Unix dynamic
library versioning scheme and the ones that do not; the ones that understand
why non-distribution software needs to go into /usr/local and the ones that
don't; the ones that understand why Debian needs all the complexity in
dependency management and the ones that don't; the ones that understand the
Unix rule of silence and the ones that don't, etc. The Debian and Ruby
communities are on opposite sides of this divide.

------
lsb
Why can't Central Command for RubyGems just upload any gem it gets as a Debian
and RedHat package?

~~~
drnic
I'll have a crack at auto-converting all gems into debian packages if someone
can point me to the "this is what debian packages need to look like" spec. Or
the CPAN module->debian package converter that I recently heard about @ OSDC
conference. Email drnicwilliams at gmail.com

~~~
e1ven
Sounds like a great project. You might want to ping the debian team, and see
where things currently stand;

If they don't want to accept your auto-packaged debs, you could set up your
own repository, and auto-push them there.. That way people could pull in the
Ruby Gems from there on a regular basis, as soon as they are auto-updated,
without waiting for them to be released officially.

Debian team policy: <http://pkg-ruby-extras.alioth.debian.org/rubygems.html>

How to create packages:

[http://www.ibm.com/developerworks/linux/library/l-debpkg.htm...](http://www.ibm.com/developerworks/linux/library/l-debpkg.html)

[http://tldp.org/HOWTO/html_single/Debian-Binary-Package-Buil...](http://tldp.org/HOWTO/html_single/Debian-Binary-Package-Building-HOWTO/)

[http://quietsche-entchen.de/cgi-bin/wiki.cgi/CreatingDebianP...](http://quietsche-entchen.de/cgi-bin/wiki.cgi/CreatingDebianPackages)

They're basically just tarballs with pre- and post-install scripts, questions
via debconf, and the dependencies mapped out in a manifest.
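A minimal binary package can be assembled along those lines with dpkg-deb; a
sketch, where the package name and file paths are illustrative:

```shell
# Sketch: the minimal skeleton of a binary .deb, built with dpkg-deb.
# The package name and paths are illustrative.
mkdir -p pkg/DEBIAN pkg/usr/lib/ruby/1.8
cat > pkg/DEBIAN/control <<'EOF'
Package: ruby-somegem
Version: 1.0-1
Architecture: all
Maintainer: Someone <someone@example.com>
Depends: ruby
Description: somegem, repackaged as a .deb
EOF
cp somegem.rb pkg/usr/lib/ruby/1.8/
dpkg-deb --build pkg ruby-somegem_1.0-1_all.deb
```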

~~~
drnic
Thx for the links. I've pinged the pkg-ruby-extras mailing list for help too.

~~~
e1ven
Just for the sake of anyone in the future who is reading this thread, two more
links-

<http://pkg-ruby-extras.alioth.debian.org>

<http://github.com/fidothe/dpkg-tools/tree/master>

------
gcv
I deployed one project by creating a completely separate deployment area,
/opt/projectname, which contains a separate tree, /opt/projectname/sw, where I
manually compiled all the dependencies: ./configure
--prefix=/opt/projectname/sw. I installed ruby and gems in there, as well as
the specific version of Apache I wanted to use. On the downside, it means
that managing upgrades becomes problematic, as I have only isolated the
bin/lib/etc/share mess into a separate tree, not eliminated it.

To make life easier, though, I'd like to use an alternative source-based
package manager. I'm not sure how to do this, but I'd like to run the
'./configure --prefix=/usr && make && make install' invocation in an
environment which tracks the files copied by the install process, and knows
which files in the system my packages affected. At this point, it would
(ideally) update the .deb database on the system. It could (alternatively)
keep track of these files in its own database. Maybe a chroot trick could help
implement this.
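Two existing tools come close to this: checkinstall, which wraps `make
install` and registers a real .deb, and a plain DESTDIR staging tree whose
file list you record yourself. A sketch of both:

```shell
# Sketch: two ways to keep track of what a source install touched.

# 1. checkinstall wraps "make install" and registers the result
#    as a .deb in the system package database:
./configure --prefix=/usr && make
sudo checkinstall --pkgname=myapp --pkgversion=1.0

# 2. Or stage the install with DESTDIR and record the file list
#    yourself (works for any Makefile that honours DESTDIR):
make install DESTDIR=/tmp/stage
find /tmp/stage -type f | sed 's|^/tmp/stage||' > myapp.filelist
```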

~~~
inklesspen
<http://www.gnu.org/software/stow/> is meant to provide some of what you want
there. Unfortunately, it seems to be abandoned, but it looks like
<http://xstow.sourceforge.net/> might be picking up the torch.

------
davidw
He doesn't address this point at all, calling security concerns 'FUD':

With apt-get, you have one thing to keep up to date in order to keep your
system secure. It's so easy that you can automate it if needs be. It's also
easy to keep track of security updates. With 'gem', now you have two things to
keep track of.

I currently use apt-get alongside the gem stuff to maintain my servers, but
it's something I do worry about. I have no hesitation about trusting my
servers to Debian or Ubuntu, but gem isn't quite so much a known quantity, and
furthermore, I'm not sure the update system distinguishes between security
updates and "hey, there's a new version of Rails (or whatever) that might
break a bunch of stuff".

It's not an easy problem, that goes without saying, but I wouldn't be so quick
to dismiss either side.

~~~
FooBarWidget
Maybe that's because the "security concerns" don't really explain what the
actual concerns are, but only claim that they exist? If "RubyGems is a
security hazard, I won't tell you why but it is so and if you disagree then
you're an idiot" isn't FUD then what is?

~~~
davidw
It's a security hazard because it's an entirely different update channel for
security updates, as well as a second system that runs with root privileges
and installs things from other people. It is not an immediate threat, but it
is worrisome.

------
justindz
I'm currently using gems because I haven't gotten to the level of complexity
where it's made any difference. Also, I tend to operate on Windows and Linux
interchangeably and using the same approach in each platform is easy. I
suppose I might feel differently if I were not doing personal projects and in
single user mode.

This article did bring up a question I had with Firefox on apt-based systems.
Why would I install a Firefox plugin through apt rather than just through
Firefox plugins? Is the argument basically the same--that there are
fewer/older plugins in apt but they have been proven stable by someone you can
decide to trust so you get to choose what you care about most?

~~~
graywh
For something like Firefox I'd stick with its built-in add-on manager. Mostly
b/c there's not likely to even be many of them available through apt.

------
stedwick
Here's what I don't like about apt-get install on Debian. There is NO
DOCUMENTATION. At least, not as far as I can tell.

For example, "apt-get install phpmyadmin"... and then what? It took me HOURS
to figure out what to do next. There was no message saying, "Congrats,
phpmyadmin has been installed in the following directory. Navigate to the
following URL to start using it." Nothing. Or what about "apt-get install
rails"? I have absolutely NO IDEA what that does. I would never build an
application on that rails installation.

Rubygems, on the other hand, is unbelievably easy. "gem install whatever" and
then in your application "require whatever". What could be simpler?

~~~
TBBle
And this is exactly what the FHS is for. Documentation for "blah" lives in
/usr/share/doc/blah/. Simple and easy.
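For instance, on a Debian/Ubuntu box:

```shell
# Sketch: finding a package's documentation after apt-get install.
ls /usr/share/doc/phpmyadmin/
less /usr/share/doc/phpmyadmin/README.Debian   # sometimes gzipped; use zless
dpkg -L phpmyadmin   # list every file the package installed
```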

As mentioned, Debian extends this by recommending a README.Debian to explain
what differences from upstream documentation or defaults exist, and the
rationale.

For example, enjoy the simplicity of /usr/share/doc/phpmyadmin/README.Debian.
It answers all your questions, and also makes mention of security concerns
you may not have realised existed had you, for example, just extracted a copy
of phpmyadmin into your web site.

And now you have enough information to know what to do next with anything
that you install using apt-get, or yum, or rpm, or portage, or (I'm
speculating here) autopackage for that matter.

------
dwright
I'm very late to this party. (not sure if this is still a hot issue)

Just thought I'd mention, that there seems to be some work in this area.

gem2deb - a gem to a *.deb (Debian/Ubuntu) package
<http://github.com/thwarted/gem2deb/tree/master>

(and for rpm's) <http://rubyforge.org/projects/gem2rpm/>

a repo for rhel/deb/others <http://rubyworks.rubyforge.org/>

I'm sure there are others,...

------
ruby_roo
How does Debian handle CPAN? Must you rely on apt-get for your Perl modules as
well?

~~~
jeremiah
The debian <--> CPAN interface is easily managed, and there are more than a
thousand CPAN modules in debian. (See the debian-perl web site:
<http://pkg-perl.alioth.debian.org/>)

The great thing about debian packages is that they not only manage all your
perl dependencies for you, but they also manage operating system
dependencies. Gems can't do that.

The Perl Foundation actually holds up debian's packaging work of CPAN modules
as the model for other distributions. (See
[http://www.perlfoundation.org/perl5/index.cgi?hints_for_dist...](http://www.perlfoundation.org/perl5/index.cgi?hints_for_distributors))

