RubyGem is from Mars, AptGet is from Venus (stakeventures.com)
16 points by pelle on Dec 4, 2008 | 42 comments


One of my Rails projects runs only on aptitude-provided gems and it has been a breeze. Yes, I don't get the latest and "hottest" versions of popular gems, but everything works and is very reliable.

With another project I installed Ruby and its gems from source and have been handling dependencies with RubyGems. Not nearly as smooth: maintaining this mess without Google would have been close to impossible. The Internet is full of blog posts and articles about "installing X on Leopard" or "upgrading Y on Ubuntu without apt-get". And guess what: those posts make absolutely no sense in an aptitude-controlled environment, where everything just works: the entire system can be re-created with a single apt-get command and a copy of your /etc files.

And the author's position of fuck you all, or as he puts it: "I'm all about application, systems management is not my job" is precisely why Windows is such a piece of shit: system management is nobody's job there, everyone is all about "application". This is why after you install a few creations of these "application guys" the whole thing becomes an unusable mess and you start paying attention to "I'm a Mac" commercials.


Hmm, I don't remember writing "fuck you all" in my post. I'm just talking about there being different domains in play here. Different domains have different needs.

System management, thanks in many ways to the great work of Debian but also to virtualization, has more or less become a commodity. Over the last 30+ years the best tools for the job have become more or less standardized. Which is great.

The application development space is a lot more dynamic though. You cannot expect all Unix sysadmins to know about every programming language's packages. There is just too much for there to be a single gatekeeper now.

The world is changing and for the better. Deal with it.


No, it's not about Linux sysadmins but rather about writing robust software. Aptitude is an application, just like your Rails app, written by very, very smart programmers. And those developers are moving at exactly the same "tremendous speed" as you are, their "core DNA" is not a bit slower than yours, and the results of their work are as much of a "commodity" as yours.

I suggest you re-read the articles you linked in your post, because they explain why RubyGems has issues and needs fixing, something you haven't even addressed.

Your post basically says "YOUR work is a commodity, YOU are moving slower than us, while MY development is fast and bleeding edge, YOU should get over your priorities and I should continue as I please". You even went as far as calling someone out for "spreading FUD" while in reality Gunnar is absolutely right: RubyGems is dangerous on production systems and I dare you to run sudo gem update --system on a server that your paycheck depends on. While sudo aptitude safe-upgrade runs pretty much automatically on thousands of mission-critical Debian-based deployments. That's not FUD, pelle, that's called high-quality engineering.


Hold your horses, Gregg. Maybe my language skills are not what they should be, because that is not what I am attempting to say. I do not in any way mean to offend anyone. All I am attempting to say is that different people have different needs.

What I am saying is that there is a difference between the requirements of an OS and of a Web application. Aptitude is strictly speaking an application, but it is also system software.

My work depends on the Debian and OpenBSD guys being conservative and creating solid defaults. I want them to be so, as I don't want to deal with that.

I am fanatical about OpenBSD myself, which is way more conservative and opinionated than Debian. I have spent years working with servers, being forced to do low-level work. My main dev machine was Gentoo for many years. I understand the issues.

Complex systems are based on layers. The Internet for example is based on layers. The lower the layer the more conservative it should be. The higher layers are more diverse and allow greater flexibility.

After all, I didn't choose Ubuntu to run several of my servers for the sake of running Ubuntu. I chose it because I need a stable server OS that I can deploy my applications on.

For someone who is a core Debian developer that might be different. Maybe he does choose Debian for the sake of running Debian.

All I want them to do is not break what I and 90% of other Ruby developers are doing. Also, this is not just Rails. Almost all Ruby developers depend on RubyGems.

FUD is when you say something is dangerous without giving further reasons. So far in this debate, I have seen lots of people say RubyGems is flawed, but the only argument I have seen is that it doesn't follow the FHS.

I can live with Debian not fixing their RubyGems package. I am not scared to install from source. But I still don't understand the reasoning behind it.

If they don't want to support real RubyGems, it would be better for Debian users if they removed the package altogether rather than shipping the Trojan horse version of RubyGems that they have now.


With Python, which has the easy_install utility for installing its own packages, I use the virtualenv mechanism to create a "shadow" environment connected to the machine's main Python but with its own library/package folders.

With that, I am free to install anything I want in the secondary Python, I still have access to distro updates for libraries that I chose not to override in my environment, and I can be confident I am not hosing my main Python install.
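Roughly, the workflow looks like this (a minimal sketch; the /opt/myenv path and the simplejson package are just placeholders, and it assumes setuptools is already installed):

    easy_install virtualenv                     # one-time install of virtualenv itself
    virtualenv /opt/myenv                       # create the "shadow" Python environment
    /opt/myenv/bin/easy_install simplejson      # installs into the env, not the system Python
    /opt/myenv/bin/python myapp.py              # run your code with the env's interpreter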

In the unlikely event there is nothing similar to it for Ruby, it would be trivial to develop it.


I don't believe Debian ever managed to deal well with third-party package management tools. E.g. XEmacs has its own package management and apt-get equivalent, and it never worked right in Debian. It's a fundamentally hard problem to support different third-party release cycles while fulfilling traditional Debian goals of stability and correctness.

Having said that, the situation is particularly poor in Debian Ruby gem land. There are two kinds of people: the ones that understand the Unix dynamic library versioning scheme and the ones that do not; the ones that understand why non-distribution software needs to go into /usr/local and the ones that don't; the ones that understand why Debian needs all the complexity in dependency management and the ones that don't; the ones that understand the Unix rule of silence and the ones that don't; etc. The Debian and Ruby communities are on opposite sides of this divide.


Why can't Central Command for RubyGems just upload any gem it gets as a Debian and RedHat package?


I'll have a crack at auto-converting all gems into Debian packages if someone can point me to the "this is what Debian packages need to look like" spec. Or the CPAN module -> Debian package converter that I recently heard about at the OSDC conference. Email drnicwilliams at gmail.com


Sounds like a great project. You might want to ping the Debian team and see where things currently stand.

If they don't want to accept your auto-packaged debs, you could set up your own repository and auto-push them there. That way people could pull in the Ruby gems from it on a regular basis, as soon as they are auto-updated, without waiting for them to be released officially.

Debian team policy: http://pkg-ruby-extras.alioth.debian.org/rubygems.html

How to create packages:

http://www.ibm.com/developerworks/linux/library/l-debpkg.htm...

http://tldp.org/HOWTO/html_single/Debian-Binary-Package-Buil...

http://quietsche-entchen.de/cgi-bin/wiki.cgi/CreatingDebianP...

They're basically just tarballs with a pre- and post-install script, questions via debconf, and the dependencies mapped out in a manifest.
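As a very rough sketch (the "mygem" name, paths, and field values are made up for illustration), a package is a directory with a DEBIAN/control manifest plus optional maintainer scripts, and dpkg-deb turns it into a .deb:

    mygem-1.0/DEBIAN/control     # manifest: Package, Version, Architecture,
                                 #   Maintainer, Depends, Description
    mygem-1.0/DEBIAN/postinst    # optional post-install script
    mygem-1.0/usr/lib/ruby/      # the files to be installed, laid out as on disk
    dpkg-deb --build mygem-1.0   # produces mygem-1.0.deb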


Thx for the links. I've pinged the pkg-ruby-extras mailing list for help too.


Just for the sake of anyone in the future who is reading this thread, two more links-

http://pkg-ruby-extras.alioth.debian.org

http://github.com/fidothe/dpkg-tools/tree/master


Just wanted to say thanks. Similar issues were sorted out years ago by cpan2rpm and the Java Packaging Project. I'd love to have proper packages for gems.


Coincidentally, I'm sitting here staring at a terminal for the past few hours because of RubyGems.

As a systems administrator, I'm absolutely not a fan of Ruby gems for exactly the reason the article mentions-

They're designed to be convenient for the developer, but they make an end-run around all of the methods that we admins set up to make things easy to deploy, and safe to update.

For instance- At Darkenedsky, we have a fleet of Ubuntu 6.06 webservers that are configured to run Apache, Postgresql, etc.

One of our coders has recently made a change which means that I need to push a library out to all 50+ machines.

Normally, this is a pretty trivial affair.

    for i in `cat hostlist`; do ssh $i apt-get -y install library; done
Unfortunately, this particular library is a Ruby gem, so things aren't nearly as simple ;(

The first step is that I need to ssh into each machine and manually install RubyGems. RubyGems isn't packaged for Ubuntu 6.06, so I need to do a source install and configure it.

Once it's installed, I need to update gems to have the most recent package definitions, and then I can start to install the package I actually want.

So far, it's annoying in that there are several extra steps, but they're scriptable, so it's liveable.

Unfortunately, installing the packages isn't very scriptable at all. When I install, I'm presented with a menu-

    Select which gem to install for your platform (i486-linux)
    1. fastthread 1.0.1 (i386-mswin32)
    2. fastthread 1.0.1 (ruby)
    3. fastthread 1.0.1 (mswin32)
    4. fastthread 1.0 (ruby)
    5. fastthread 1.0 (mswin32)
    6. Skip this gem
    7. Cancel installation
If this were a non-gems library, and/or packaged as a normal Ubuntu release, I wouldn't need to manually choose one of these. I could set my choice in debconf (write it to a file), and then push it out everywhere.
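With debconf that looks roughly like this (a sketch only; "mypackage" and the template name are invented for illustration):

    echo "mypackage mypackage/flavour select ruby" | debconf-set-selections
    DEBIAN_FRONTEND=noninteractive apt-get -y install mypackage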

If we're dealing with only one machine, like most developers do, this isn't a big deal. It downloads and installs the gem, and you're good to go. But when you're dealing with a fleet of machines, anything that requires manual interaction is a sin against man and god.

Debconf allows me to choose whatever level of interaction I want-

    "Ask me all details, even the trivial ones"
    "Only ask me the important stuff"
    "Use the choices provided in this file"
Why should I need to choose between 1.0.1 and 1.0 after I start the install?

On Ubuntu, if the packages were really so important that both needed to be available, I could choose between

    apt-get install ruby-fastthread-1.0
or

    apt-get install ruby-fastthread-1.1
Again, we're down to one command. Nice and simple, no interaction needed.
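Which would put the whole fleet back in the earlier one-liner (ruby-fastthread-1.0 being the hypothetical package name from the example above):

    for i in `cat hostlist`; do ssh $i apt-get -y install ruby-fastthread-1.0; done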

Further compounding the pain is that this isn't a choice I should even have to make- Why would I want the win32 version of a library on my Linux box?

If I'm installing a packaged version, It's already been tested in this configuration, and it's designed to work with everything else on the system.

Too many servers give me an error on install-

    Building native extensions.  This could take a while...
    ERROR:  While executing gem ...      
    (Gem::Installer::ExtensionBuildError)
    ERROR: Failed to build gem native extension.
While that's certainly fixable, it means that I need to babysit the install far more than I do for something that's properly packaged.

After I choose what I want, I get to wait while the package downloads across the public internet.

When dealing with Ubuntu, all our servers are pointed at an in-house repository for packages- We keep a copy of everything we need on-site, and I can update it with one command.
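In sources.list terms that's just a line like the following (the hostname is invented; dapper is the 6.06 release name):

    deb http://apt.internal.example.com/ubuntu dapper main universe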

This means that when installing software, it only has to travel to the server a few racks over, not out the internet at large. With gems, this system is broken- Each one reaches out to the public internet.

Not only is this much slower, but I can't trust that a package is 100% identical.

Having one machine be different, even by a point release, can be a huge problem when trying to debug intermittent errors.

Finally, upgrading the packages is far more difficult than it has to be.

If a bug is discovered in a library, such as the zlib bug that came out a few years ago, I can update it trivially and easily across all the machines.

Ubuntu has built-in commands for applying security updates, and it separates those from feature updates. That way, I don't need to move to the next version of a package, possibly breaking the code, just to resolve a security bug.
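The separation is visible right in sources.list; security fixes come from their own pocket (shown here for 6.06/dapper):

    deb http://security.ubuntu.com/ubuntu dapper-security main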

Now, that's not to say that the issues with Ruby gems couldn't be fixed. I'm sure that one by one, they could be addressed- scripting support could be beefed up, local repositories could be added, upgrading could be improved, and configuration could be rewritten.

But even if they were to fix EVERYTHING on the list, it would still be layering one packaging system atop another, which means that we need to have two ways to do everything.

Two sets of config files. Two repositories of packages. Two places for bugs.

As the author suggests, the problem comes in that OS packages and Ruby gems are focused on different audiences-

Developers love ruby gems because for the one machine they use to write the code, it's easy, fast, and convenient.

Admins hate gems because when you're trying to deal with a fleet of machines, it's none of the above.


Wow. I'm not going to claim that rubygems is the best package management system, or that it's better than debian's packaging system. But, some of your frustration comes from the fact that you know apt very well and you don't seem to know rubygems very well.

For example, you can set up your own local gem server. You can load it with only the gems you want to install on your fleet of machines and use 'gem install --source your-local-server fastthread', for example. You might even get around your "pick the right gem" problem by installing only the fastthread gem you need on your local server.
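A rough sketch of that setup ("your-local-server" is whatever box you run it on; 8808 is gem server's default port):

    gem server &                                                    # serves the gems installed on this box
    gem install --source http://your-local-server:8808 fastthread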

But, I think the real problem you have with rubygems is that you don't want to learn another package management system. I really don't think the rubygems devs can do anything to remedy that problem.


Great point, Locke. Thanks for replying.

I'll certainly admit that I don't know the gems packaging system as well as I know dpkg and aptitude, but I think that criticism misses the larger issue.

You're certainly right that the gems system could be made to do everything that the Debian/Ubuntu system does, particularly if given time to mature. But as I pointed out in my post, this wouldn't fix the problem.

Apache, for example, has hundreds of modules, each of which has complex dependencies. They're released on their own schedule, and they're released across lots of platforms. Firefox has its own set of extensions and updates- You see the same thing with CPAN, and Python modules.

But the more disparate systems you add, the more sets of configurations you need, the more repositories you need, and the more ways to update everything you need.

So for our servers, I could set up a gems repo, an Apache repo, a system software repo, a CPAN repo. I could have separate upgrade commands for every software package that comes along.

But the complexity is increased with each addition to this system-

* If I want to get a list of everything that's installed under this system, I need to run X commands.

* During installations, I can't chain everything together, I need to run X set of separate installations.

* I need to review X different release chains, to see if there are packages I need to test before deploying to multiple machines

* And yes, I need to learn X different systems for installing software, rather than one system-wide way of installing it.

There are a lot of good reasons to have the gems system, and I think it's a great tool for individual systems, but for anything that's going farm-wide, it's substantially easier to have everything go through one tool.


I don't know that I particularly disagree with you. The one advantage that a language scoped packaging system has over a system-wide system is developer adoption.

Debian, Gentoo, Red Hat, FreeBSD, etc. all depend heavily on volunteers to create the packages, manage dependencies, issue security announcements, feed bug reports and patches upstream to developers, etc. I've used Gentoo for a long time and I've watched it struggle in recent years as it loses volunteers. New releases take longer to reach portage, some software never finds its way to portage at all, and I find more and more broken packages. The only Ruby package I install from portage is Ruby itself.

The thing is, every Ruby developer uses RubyGems. There are no volunteers because the developers themselves create the packages for each release. I think this is more sustainable, even if I don't think RubyGems is as good as the packaging tools used by the major distributions.


That's a really great point, and one I hadn't considered.

Of course, in a fantasy ideal world, the way things would work is that the Ruby developers would create their own repository for .debs.

This would mean that if I want the latest and greatest files, I can add their repository to my existing package system, and everything integrates.

When I did an "apt-get upgrade" it would check the Ruby repository and upgrade all my Ruby packages, too.

Then, when a package had proven itself to be stable enough, it could be copied back into the standard Ubuntu/Debian repositories, so that everyone could have it, even if they didn't add the repositories.
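In that fantasy world it would just be an extra line in sources.list (the URL here is obviously invented):

    echo "deb http://ruby.example.org/debian stable main" >> /etc/apt/sources.list
    apt-get update && apt-get upgrade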

Of course, in the real world, that would be far too much work for the Ruby developers, and would require that they customize for one specific OS. But a man can dream, can't he? ;)


"But, I think the real problem you have with rubygems is that you don't want to learn another package management system. I really don't think the rubygems devs can do anything to remedy that problem."

You don't think the RubyGems devs can use the normal packaging system, and hence get rid of these odd requirements? Why not?


Because a lot of us either (a) want what RubyGems offers (multiple version side-by-side installations), (b) work on operating systems that don't have packaging systems and aren't willing to compromise the use of our language just because of <platform> bigotry, or (c) want something newer faster than <platform> can provide it without having to install from source and without breaking our existing apps (see point a again).

Unless and until Debian supports the concept of multiple version side-by-side installations, which is IMO mandatory, I don't see this particular problem being resolved.
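To be concrete, "side-by-side" means something like the following, which RubyGems handles today (the fastthread versions are borrowed from the menu further up; output comment is approximate):

    gem install fastthread -v 1.0
    gem install fastthread -v 1.0.1
    gem list fastthread            # => fastthread (1.0.1, 1.0) -- both stay installed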


DPkg indeed already supports multiple side by side versions, as does RPM. I'm not sure why you think they don't.

I'm also unsure why wanting an app to install its files to the same standard directories that other apps do, and not require separate updating tools, constitutes bigotry. Can you explain?


1. If by "already supports multiple side-by-side versions" you mean the support for libfoo and libfoo2, this isn't multiple side-by-side. I mean being able to support foo 2.1 and 2.1.3 simultaneously. RubyGems can do this, and applications that only want 2.1 and nothing later can specify that. My understanding and experience is that what RubyGems can do here isn't supported by dpkg or RPM; even if it is, the maintainers of Debian aren't interested in it because it doesn't fit their narrow worldview.

2. Even if what RubyGems does is supported by dpkg et al., this does nothing to help the language or programs written in the language support this without resorting to ugly, stupid, and nonsensical platform-specific hacks. We've already seen people on Ruby doing a few silly things (e.g., RUBY_PLATFORM =~ /win/ matches darwin, cygwin, and win32, but doesn't match mingw -- which is a closer match for win32 than darwin or cygwin ;). Doing "gem 'foo', '=2.1'" allows me to use a specific version (equivalent of -lfoo2 in linking in C++), but if I don't care, I can just require 'foo'. (The "require 'foo'" bit is required in any case; "gem 'foo', '=2.1'" doesn't require anything, but sets the load paths.)

3. The bigotry is in assuming that dpkg or RPM are appropriate for everyone. I would rather have a Ruby-centric system for Ruby rather than two dozen platform-specific systems that I have to know and support. Yes, I'm bigoted toward Ruby and don't care what platform I run Ruby on. I don't want to have to deal with the operating system's package management (or lack thereof, as is the case on systems with wider adoption than Linux).

RubyGems isn't perfect or a be-all, end-all packaging system. It doesn't try to be. What it does, however, it does very well and it does right.


1. I mean being able to support foo 2.1 and 2.1.3 simultaneously.

Like every Linux distro shipped GCC296 and GCC29x for years?

It seems the Ruby community (and the Python community to a lesser extent) have a Not Invented Here syndrome.

2. Even if what RubyGems does is supported by dpkg et al., this does nothing to help the language or programs written in the language support this

Yes it does. dpkg and RPM provide a standard mechanism to query whether software is installed (foo), including optionally what version is installed (foo >= 2.1).
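For example (the package names are placeholders):

    dpkg -s libfoo-ruby                              # installed? and what version?
    dpkg --compare-versions 2.1.3 ge 2.1 && echo ok  # version comparisons done for you
    rpm -q --queryformat '%{VERSION}\n' foo          # the RPM equivalent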

3. I think the greater bigotry exists in assuming that Ruby is somehow special and deserves to write unsigned binaries all over systems while all other software does not. Python, Java, and other languages all make the effort to integrate into their environments rather than forcing a whole new mechanism of querying, fetching, downloading, verifying, and signature checking upon users.


Look. This is a real practical criticism of RubyGems, and it is absolutely valid. But these kinds of things are getting better.

Rails 2.x and Merb 1.x, with their support for gem dependencies, are a lot easier to manage now than they were in the past.

It would be relatively easy nowadays to manage one server with lots of different applications with different Rails and gem dependencies purely using gems. You would not be able to do this if Rails was installed via apt-get, unless a special require_aptget library was written.

RubyGems is improving a lot with every release. Poolparty and other similar tools for remote management look like they would be able to deal with many of these kinds of issues in the future.

I believe EngineYard maintain their own local gem server for speeding this up.


Great point, Pelle. Thanks.

I think that's exactly the problem, though. Yes, I'm 100% sure that all of these problems can, and probably will, be fixed.

But even once they are, there will be two methods for doing everything, and that makes things more complex than they need to be.

It's one of the disadvantages of such a relatively unpredictable environment- Every tool has to be able to work across a variety of platforms, which means that every tool has to assume the worst case. ;(

The Ruby devs can't assume packages are sane everywhere- Red Hat packages, for instance, have been a world of hurt for years, and OS X and Windows don't even HAVE native packaging systems.

I entirely understand why it is the way that it is.. But I look forward to it getting better.

As another poster pointed out, in newer releases of Ubuntu, they DO package up the gems as individual .deb packages.

Essentially, the Ubuntu devs bypassed the gem manager, just as the gem manager itself bypassed Ubuntu.


Thanks a lot for your post e1ven. Yours is actually constructive and makes sense, a far cry from the usual "RubyGems sucks and I won't tell you why, but if you use it then you're an idiot" kind of FUD. I've replied to your points at http://stakeventures.com/articles/2008/12/04/rubygem-is-from... (search for "Actually, the Hackernews comments were surprisingly constructive"; the reply might not show up yet because it's still in the moderation queue).

A lot of your criticisms have already been addressed in the latest RubyGems version.


Doesn't gem install path_to_gem.gem work?

(I could have sworn there was something like that.) Though, most of your complaints are very valid. Also add on that ruby gems LOOOVES to eat all the memory on your machine if at all possible...


In this particular case it's complicated by the fact that ruby gems isn't pre-packaged for Ubuntu 6.06, but in the general case, yes, that would make the installation component of it faster.

My frustration comes mainly in that Ubuntu provides nice, easy systems to keep everything consistent; Ruby gems make an end-run around these systems. No matter WHAT it might do on its own, it bypasses the rules that the rest of the system lives by.


You CAN use apt-get to install ruby packages. But it totally totally sucks.

(And as for the Ubuntu thing: I keep around the tarball with the RubyGems source, and have a script that scps, untars, and runs the install script. I have to type in the root password, but that's about it. It's definitely not ideal, but there are worse things ;))
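Roughly like this (a sketch only; the tarball version and paths are illustrative, and ssh -t is there so sudo can prompt for the password):

    for i in `cat hostlist`; do
        scp rubygems-1.3.1.tgz $i:/tmp/
        ssh -t $i 'cd /tmp && tar xzf rubygems-1.3.1.tgz && cd rubygems-1.3.1 && sudo ruby setup.rb'
    done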


I deployed one project by creating a completely separate deployment area, /opt/projectname, which contains a separate tree, /opt/projectname/sw, where I manually compiled all the dependencies: ./configure --prefix=/opt/projectname/sw. I installed Ruby and gems in there, as well as the specific version of Apache I wanted to use. On the downside, it means that managing upgrades becomes problematic, as I have only isolated the bin/lib/etc/share mess into a separate tree, not eliminated it.

To make life easier, though, I'd like to use an alternative source-based package manager. I'm not sure how to do this, but I'd like to run the './configure --prefix=/usr && make && make install' invocation in an environment which tracks the files copied by the install process, and knows which files in the system my packages affected. At this point, it would (ideally) update the .deb database on the system. It could (alternatively) keep track of these files in its own database. Maybe a chroot trick could help implement this.
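One existing tool that gets part of the way there is checkinstall, which wraps "make install", records the files it writes, and registers the result with dpkg. A minimal sketch (the package name and version are placeholders, and this is not a full recipe):

    ./configure --prefix=/usr && make
    sudo checkinstall --pkgname=myapp --pkgversion=1.0   # builds and installs a .deb instead of a raw "make install"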


http://www.gnu.org/software/stow/ is meant to provide some of what you want there. Unfortunately, it seems to be abandoned, but it looks like http://xstow.sourceforge.net/ might be picking up the torch.


He doesn't address this point at all, calling security concerns 'FUD':

With apt-get, you have one thing to keep up to date in order to keep your system secure. It's so easy that you can automate it if needs be. It's also easy to keep track of security updates. With 'gem', now you have two things to keep track of.

I currently use apt-get alongside the gem stuff to maintain my servers, but it's something I do worry about. I have no hesitation about trusting my servers to Debian or Ubuntu, but gem isn't quite so much a known quantity, and furthermore, I'm not sure the update system distinguishes between security updates and "hey, there's a new version of Rails (or whatever) that might break a bunch of stuff".

It's not an easy problem, that goes without saying, but I wouldn't be so quick to dismiss either side.
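For reference, the "automate it" part on the apt side can be as small as a single root cron entry, sketched here (blindly auto-upgrading anything still deserves some judgment):

    0 4 * * * apt-get update && apt-get -y upgrade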


Maybe that's because the "security concerns" don't really explain what the actual concerns are, but only claim that they exist? If "RubyGems is a security hazard, I won't tell you why but it is so and if you disagree then you're an idiot" isn't FUD then what is?


It's a security hazard because it's an entirely different update channel for security updates, as well as a second system that runs with root privileges and installs things from other people. It is not an immediate threat, but it is worrisome.


I'm currently using gems because I haven't gotten to the level of complexity where it's made any difference. Also, I tend to operate on Windows and Linux interchangeably and using the same approach in each platform is easy. I suppose I might feel differently if I were not doing personal projects and in single user mode.

This article did bring up a question I had with Firefox on apt-based systems. Why would I install a Firefox plugin through apt rather than just through Firefox plugins? Is the argument basically the same--that there are fewer/older plugins in apt but they have been proven stable by someone you can decide to trust so you get to choose what you care about most?


For something like Firefox I'd stick with its built-in add-on manager. Mostly b/c there's not likely to even be many of them available through apt.


I imagine the Firefox thing is to make the initial setup of a machine faster (doing Firefox addons and other packages in one shot with apt-get), and possibly because the maintainers didn't trust the security of Firefox's update mechanism. If you can rely on apt-get always being available, you tend to use it as a first resort.


Here's what I don't like about apt-get install on Debian. There is NO DOCUMENTATION. At least, not as far as I can tell.

For example, "apt-get install phpmyadmin"... and then what? It took me HOURS to figure out what to do next. There was no message saying, "Congrats, phpmyadmin has been installed in the following directory. Navigate to the following URL to start using it." Nothing. Or what about "apt-get install rails"? I have absolutely NO IDEA what that does. I would never build an application on that rails installation.

Rubygems, on the other hand, is unbelievably easy. "gem install whatever" and then in your application "require whatever". What could be simpler?


And this is exactly what the FHS is for. Documentation for "blah" lives in /usr/share/doc/blah/. Simple and easy.

As mentioned, Debian extends this by recommending a README.Debian to explain what differences from upstream documentation or defaults exist, and the rationale.

For example, enjoy the simplicity of /usr/share/doc/phpmyadmin/README.Debian. It answers all your questions, and also makes mention of security concerns you may not have realised exist had you, for example, just extracted a copy of phpmyadmin into your web site.
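Concretely, from a shell (phpmyadmin being the package discussed above):

    less /usr/share/doc/phpmyadmin/README.Debian   # the "what now?" notes for this package
    dpkg -L phpmyadmin                             # every file the package installed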

And now you have enough information to know what to do next with anything that you install using apt-get, or yum, or rpm, or portage, or (I'm speculating here) autopackage for that matter.


README.Debian in /usr/share/doc/PACKAGE-NAME usually has this information. Of course, some package maintainers aren't as nice as others in this regard.

The problem is much simpler with Ruby packages since they have only one interface. Debian packages have many interfaces, and so it can never be as easy as that. I'll note that Debian/Ubuntu python packages work just as you expect: apt-get install python-twisted, then import twisted in your code.


How does Debian handle CPAN? Must you rely on apt-get for your Perl modules as well?


The Debian <--> CPAN interface is easily managed, and there are more than a thousand CPAN modules in Debian. (See the debian-perl web site: http://pkg-perl.alioth.debian.org/)

The great thing about debian packages is that they not only manage all your perl dependencies for you, but they also manage operating system dependencies too. Gems can't do that.

The Perl Foundation actually holds up debian's packaging work of CPAN modules as the model for other distributions. (See http://www.perlfoundation.org/perl5/index.cgi?hints_for_dist...)


I'm very late to this party. (not sure if this is still a hot issue)

Just thought I'd mention, that there seems to be some work in this area.

gem2deb - a gem to a *.deb (Debian/Ubuntu) package http://github.com/thwarted/gem2deb/tree/master

(and for rpm's) http://rubyforge.org/projects/gem2rpm/

a repo for rhel/deb/others http://rubyworks.rubyforge.org/

I'm sure there are others,...



