
Let's Package JQuery: A JavaScript Packaging Dystopian Novella - davexunit
http://dustycloud.org/blog/javascript-packaging-dystopia/
======
jamielinux
T.C. and I worked on getting jQuery packaged for Fedora [1].

It's quite painful trying to package npm modules in a way that's suitable for
distributions like Red Hat/Fedora/Debian. The dependency tree just goes on and
on and on, and there are versioning issues all over the shop. And if you want
to run test suites in the package build process then you'd better package
mocha, tap, tape, nodeunit, should, vows, expresso, jasmine, supertest etc.
And all of their dependencies too. And make sure every test in the test suites
for every module pass. By the end of it all, you're a broken, hollow shell of
the person you were before, but at least now it's possible to run "yum install
js-jquery" and get version 2.1.3 :-)

[1]:
[https://fedoraproject.org/wiki/Changes/jQuery](https://fedoraproject.org/wiki/Changes/jQuery)

~~~
seanp2k2
Honest question, not trolling: why would I want to install a front-end JS lib
using a system package manager? What is the use-case for doing this?

~~~
icebraining
Self-hosted web apps (the linked post mentions MediaGoblin). The idea is that
a user can run 'apt-get/yum install mediagoblin' on his home computer and have
a personal media server.

------
Animats
From the article: _"Our deployment and build setups have gotten so
complicated that I doubt anyone really has a decent understanding of what is
going on, really."_

There's now a school of thought in the Rust world that it's easier to just
build a static executable and ship that.

~~~
davexunit
>There's now a school of thought in the Rust world that it's easier to just
build a static executable and ship that.

If that's where we're headed, I'm extremely sad. Static linking everything is
a lazy non-solution, IMO. Rampant duplication, increased resource usage, etc.
Not good.

------
daleharvey

      > It doesn't help that, for a long time, the 
      > status quo in all free software web applications 
      > (and indeed all web applications) was to check 
      > javascript and similar served-to-client web 
      > assets straight into your repository.
    

That really does help; this would have been a short blog post if they had
followed the status quo.

Operating systems trying to manage application dependencies have been a huge
source of frustration for me. You can't have jQuery randomly upgraded for
your application without having tested it; that is always going to cause
problems no matter how religious you are about semver.

Anyone who has developed Erlang on Ubuntu can attest to how ridiculous the
situation can get (Debian split Erlang up into separate packages).

I very much lean towards having my dependencies bundled. I have misgivings
with npm but it does a lot of things right in that regard.

------
raziel2p
I fail to see the problem. Just create a .tar.gz with all the dependencies
installed and assets compiled, which you can distribute to users that don't
care about or don't want to mess around with the build tools.

If I want to build a native application (like git, which I often do) from
source, it's not like the situation is much better. I have to install the
compiler obviously, and I have to manually install the dependencies.

~~~
paulfurtado
You usually don't have to manually install a compiler and all the
dependencies. If you want to build a package from source which has already
been packaged by the distribution, it's usually quite easy to do so. For
example, in Arch Linux you can run

    yaourt -Sb git

and it will get the PKGBUILD file for the package and allow you to edit it
before building so you can build a newer version or alter build flags. When
building a package this way, it will also fetch all of the packages it depends
on in order to build and run. You generally don't have to think about
dependencies at all.

In Ubuntu/Debian and Fedora/CentOS it is several more commands, although there
are probably scripts that do it in a single command. It's still a lot easier
than manually hunting dependencies and figuring out how exactly to build a
package.

~~~
icebraining
_In Ubuntu/Debian and Fedora/CentOS it is several more commands_

In Debian & derivatives it's just "apt-get build-dep git && apt-get source
--build git".

------
cloudsloth
"265 unique packages, all to build jquery!"

Oh my goodness.

~~~
madeofpalk
To be fair, Node/Javascript prefers to have many small packages that do just
one thing (hopefully well).

~~~
AgentME
Yeah, I have several modules on NPM that are each just a single function (with
unit tests!) that I'm interested in using in more than one package. "265
unique packages" could possibly be "265 short functions" on one extreme end.

~~~
moron4hire
To what end? It seems like the package overhead would destroy any savings in
package size gained by being able to mix and match rather than just having a
larger, more complete package.

~~~
AgentME
What package overhead? With NPM, all a package needs is a "package.json" file
which does little more than specify a name and version number.

And what would a more complete package look like? "AgentME's bag of useful
functions"? Many of them don't make sense to group together.
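For illustration, a minimal package.json really is just a few fields (the
name, version, and description below are made up):

```json
{
  "name": "array-flatten-once",
  "version": "1.0.0",
  "description": "Flatten an array one level deep",
  "main": "index.js"
}
```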

------
nailer
To always use the same versions of every dependency you'll want to shrink
wrap.

And yes, of course dependencies have dependencies. Software depends on other
software. All packaging systems have a tree of dependencies.

~~~
davexunit
>To always use the same versions of every dependency you'll want to shrink
wrap.

Does "shrink wrap" mean "bundle"? If so, I strongly disagree. It's possible to
explicitly use the exact same versions of dependencies without resorting to
bundling all the source/binaries. Guix and Nix are particularly well suited
for this.

>And yes, of course dependencies have dependencies.

But you have to admit that the depth of this dependency tree is insane.

~~~
nailer
I mean npm shrinkwrap.

Depth of tree is reasonable if you install them at the top.
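For context, `npm shrinkwrap` writes an npm-shrinkwrap.json that pins the
exact resolved version of every dependency in the tree. A sketch of what one
entry looks like (the app name and version here are illustrative):

```json
{
  "name": "my-app",
  "version": "1.0.0",
  "dependencies": {
    "jquery": {
      "version": "2.1.3",
      "resolved": "https://registry.npmjs.org/jquery/-/jquery-2.1.3.tgz"
    }
  }
}
```

With this file present, `npm install` reproduces the same versions on every
machine instead of re-resolving semver ranges.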

------
miralabs
Does it mean that if I have 10 modules that depend on exactly the same
version of another module (e.g. dependency 1.1.1), npm will download 1.1.1
ten times? Wouldn't it just use the single package with the correct version
and share it across? Or is the cache mainly used to copy the dependent module
10 times?

~~~
AgentME
The same version of the same package isn't ever downloaded multiple times.
That's cached. Packages may be installed in multiple locations though.

NPM doesn't reinstall dependencies that are already above it in the tree. If A
depends on B and C, and B depends on (an overlapping version range of) C and
D, then B and C will be installed first in A's node_modules directory, and
then NPM will install the dependencies of B and C in their own node_modules
directories, except that it will avoid installing a copy of C in B's
node_modules directory because it already exists in a higher folder.

However, if B and C both depend on D, then NPM will install D in each of
their node_modules folders. You can run `npm prune` (or `npm dedupe`? I can't
remember) to make it lift D up a folder so it only needs one copy.
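The placement rule described above can be sketched as a toy model (an
illustration of the idea, not npm's actual code): a dependency is installed
in a package's own node_modules only if no compatible copy is already
resolvable from an ancestor directory.

```javascript
// Toy dependency graph: A needs B and C; B needs C and D; C needs D.
// (Version ranges are omitted; assume all requirements are compatible.)
const deps = {
  A: ["B", "C"],
  B: ["C", "D"],
  C: ["D"],
  D: [],
};

// Return the node_modules tree for `pkg`, skipping any dependency
// already resolvable from an ancestor node_modules directory.
function layout(pkg, resolvable = new Set()) {
  const here = deps[pkg].filter((d) => !resolvable.has(d));
  const visible = new Set([...resolvable, ...here]);
  const tree = {};
  for (const d of here) tree[d] = layout(d, visible);
  return tree;
}

// B's copy of C is skipped (C sits one level up), but D is duplicated
// under both B and C -- exactly the case deduplication cleans up.
console.log(JSON.stringify(layout("A")));
// → {"B":{"D":{}},"C":{"D":{}}}
```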

~~~
pluma
I think `dedupe` is now (i.e. in the most recent version of NPM) part of the
install process, actually. So the only situation in which you end up with two
copies of the same dependency is if two of your dependencies depend on
incompatible versions of the same sub-dependency.

