
JavaScript Libraries Are Almost Never Updated Once Installed - zackbloom
https://blog.cloudflare.com/javascript-libraries-are-almost-never-updated/
======
BiteCode_dev
If there are no security problems affecting your project and you don't
benefit from new features, there is no urgency to update. And even then, it's
a cost/benefit calculation.

Updating is a constraint, not a goal: something you do either because it
solves a problem or because it avoids a pitfall.

I migrated most of my own projects to Python 3 years ago, but for some of my
clients, even though 2.7 is no longer supported, I advise them to just
download an offline copy of everything they need as a safety measure and keep
the system running as-is forever.

Raw HTML + jQuery + Django 1.x + a VPS still delivers fine. Just because I
can do React + microservices doesn't mean that should be the default response.

There are servers that have been running, barely touched, for one or two
decades, and they do their job perfectly.

So yes, some client side libs are going to be there for years and years. If
the UI works, it's alright.

Software is not an end, it's a tool. When you're passionate about it, it's
easy to forget that, but we use software to help the real world, not the
other way around.

Of course I'm not saying you should never update or use modern technology
when appropriate: technical debt must be paid. But it's important to
recognize when you actually need to do it, and when you just want a shiny toy
or to look good.

------
onion2k
This applies to libraries installed using <script src="https://cdnjs.com/path-to-
lib.js">. Most CDNs have a 'latest' path, but using that can result in your
site breaking when a dependency changes. Web developers need to stop taking
the lazy option and test more.
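The difference can be sketched with two script tags (the URL, library name,
and versions below are illustrative placeholders, not real cdnjs paths):

```html
<!-- Pinned to an exact version: the served file never changes,
     so the site keeps working (but also never gets fixes) -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/somelib/1.2.3/somelib.min.js"></script>

<!-- Floating "latest"-style path: the file can change between page
     loads, so an upstream release can break the site overnight -->
<script src="https://cdnjs.cloudflare.com/ajax/libs/somelib/latest/somelib.min.js"></script>
```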

JS libraries installed using npm to be bundled in are typically pinned to the
major version, so they usually update to the latest minor or patch whenever
the thing is deployed. That's still a problem, because it only works if an
app is actively maintained and deployed regularly, but it's better than
nothing.
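For reference, this is the npm default being described: `npm install` writes
a caret range into package.json (the package and version here are just
examples):

```json
{
  "dependencies": {
    "express": "^4.17.1"
  }
}
```

`^4.17.1` matches `>=4.17.1 <5.0.0`, so a fresh install without a lockfile
picks up the newest compatible 4.x release; a committed `package-lock.json`
freezes the exact versions until it is regenerated.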

~~~
cm2187
It is a myth that all websites (or applications) have an active development
team continuously deploying updates. I would even expect that the vast
majority of the code in production in the world is old code that is not looked
after. This is particularly true for small non-tech companies that outsourced
the development, and for internal systems in large organisations where the
developers have been re-assigned to other projects.

This is why technologies that assume active development, like containers,
cloud APIs, etc., are in my opinion a disaster in the making.

~~~
russellendicott
> This is why technologies that assume active development, like containers,
> cloud APIs, etc are in my opinion a disaster in the making.

I can agree with this. I've already been burned a few times _by my own tools_
trying to push an update 8 months after building it the first time. A docker
build as part of my CI/CD will download some new package and bork everything.

I know that this can be prevented with pinning, container registries, etc. but
I often don't build all of that overhead into these stupid little web
services.
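For what it's worth, the "overhead" being skipped here can be as small as
pinning tags in the Dockerfile; a minimal sketch, with hypothetical image and
package versions:

```dockerfile
# Unpinned: whatever "latest" resolves to at build time, months later
# FROM node:latest

# Pinned: the same base image and package version on every build
FROM node:18.19.0-alpine3.19
RUN apk add --no-cache ffmpeg=6.1-r0
```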

I know this may cause immeasurable pain for someone in the future but there's
only so much time in a day.

If this was just some *nix box that "just worked" and never got patched it
would probably work better long term.

~~~
rodw
> A docker build as part of my CI/CD will download some new package and bork
> everything.

Really? I'm not much of an ops guy but that's precisely the problem docker
_fixed_ for me. We must have very different usage patterns.

I used to be a bare-Linux guy, but I finally bit the bullet and fully
dockerized the personal stuff I was working on after the Nth time I ran into
library/service version problems while standing up a new AWS (plain-Linux
EC2) instance.

In fairness, I never really "pinned" to a specific AMI, which was probably
the root of my problem, but, you know, security updates and stuff seem
helpful. Still, every time I installed on a fresh box I'd run into
incompatibilities with really core, basic stuff (it seemed to me) like
PostgreSQL versions and so on.

I'm still not much an ops guy, but version/install/configuration stability is
by far the most appealing thing about docker for me personally.

I do a fair bit of audio and visual processing stuff, typically developed on
Mac and deployed to Linux. With some services, every 6 months I would need to
hunt down some wholly new set of instructions for installing some
third-degree dependency I was using (indirectly), and then work out an
entirely different set of build problems when deploying to a (Linux) server.

But since switching to Docker I have had little to no (surprise)
compatibility problems. sox? alsa-utils? libvips? ffmpeg? I can't remember
the last time I had one of those "oh no, I guess I need to spend 4 hours
spelunking forum posts to figure out which precise combination of
dependencies I need to make this work like it did a week ago" moments. I'm
genuinely surprised by how _few_ problems I have making sure that
native-code-heavy A/V processing that runs in Docker on my OS X laptop works
without modification when I move it to a headless Linux server.

------
karimmaassen
I never understood the need (or desire) to update your dependencies as much
and as often as possible. As long as a project builds and runs without any
issues, why put effort into upgrading libraries and expose yourself to
possible new issues?

~~~
cookiengineer
> why put effort in upgrading libraries and exposing yourself to possible
> issues.

Isn't this exactly what the combination of a "package manager" and a semantic
versioning scheme should automate?

Personally, I think this is where NPM failed and still fails. They should
enforce a binary format that ships with header files (and method signatures),
and enforce semantic versioning.

If any library doesn't play by the semantic rules, don't let it publish.
That's the authority and responsibility that NPM failed to include.

If everybody plays by the semantic rules ... then libraries can be upgraded
automatically without breaking anything. And a huge plus: They can be
installed as _shared_ libraries, which is such an old concept that it hurts my
fingers having to type its advantages.
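A minimal sketch of the upgrade rule such a package manager could then apply
automatically, assuming every library honors strict semver (pre-release tags
ignored; function names are mine, not any real tool's API):

```javascript
// Decide whether `candidate` can replace `installed` without human
// review, under the strict-semver contract: same major version (and,
// pre-1.0, same minor) means drop-in compatible.
function parseVersion(v) {
  return v.split(".").map(Number); // "2.3.1" -> [2, 3, 1]
}

function safeToAutoUpgrade(installed, candidate) {
  const [curMaj, curMin, curPat] = parseVersion(installed);
  const [newMaj, newMin, newPat] = parseVersion(candidate);
  const isNewer =
    newMaj > curMaj ||
    (newMaj === curMaj &&
      (newMin > curMin || (newMin === curMin && newPat > curPat)));
  if (!isNewer) return false;
  if (curMaj === 0) return newMaj === 0 && newMin === curMin; // 0.x: minor bumps may break
  return newMaj === curMaj; // 1.0+: same major = compatible by contract
}
```

Under that rule, `2.3.1 -> 2.4.0` upgrades silently while `2.3.1 -> 3.0.0`
requires a human. The catch, of course, is the "if everybody plays by the
rules" premise.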

~~~
OJFord
How do you enforce it?

Cargo (Rust) considered it, using the vastly richer information it has about
whether signatures have changed compared to NPM, but rejected it: you can
still make breaking changes without changing a function signature, so why
claim to detect breakage when only a subset of it can be detected?

~~~
edynoid
You can get pretty far with a statically typed, purely functional language.
For example, Elm's package manager enforces semantic versioning:
[https://elm-lang.org/](https://elm-lang.org/)

I don't think you can do that with JavaScript.

~~~
OJFord
You can always break an API with the values, however much you encode in the
types.

Its effectiveness also varies with how much, and how well, types are used:
e.g. whether you return `String` or `Url` to begin with.
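A hypothetical two-function library makes the point concrete: the signature
is identical across versions, so no type- or signature-based check can flag
the break.

```javascript
// v1.x of a hypothetical library: returns a sorted copy, ascending.
function sortScoresV1(scores) {
  return [...scores].sort((a, b) => a - b);
}

// v2.0: the signature (number[] -> number[]) is unchanged, but the
// order is now descending. Every caller relying on v1's order breaks,
// and nothing in the types gives it away.
function sortScoresV2(scores) {
  return [...scores].sort((a, b) => b - a);
}
```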

------
rodw
"JavaScript Libraries Are Almost Never Updated Once Installed"

...on a website.

I imagine the ease with which one can type `npm i` or `yarn install` means
that server-side JavaScript libraries are updated frequently. Wasn't that the
whole problem with that leftpad thing?

Out of curiosity I went to look for the most frequently downloaded (installed)
library on npm, which apparently is not directly available, but here's a
pretty arbitrary selection of things that came to mind:

* express - last release 8 months ago - 11.4m downloads this week

* React (which is pretty browser-oriented, right?) - last release 3 months ago - 6.5m downloads this week

* Underscore - last release 1 month ago - 6.7m downloads this week

* jquery (almost exclusively browser-oriented, certainly DOM-oriented) - last release 9 months ago - 3.1m downloads this week

It seems like _someone_ must be keeping up-to-date.

Also, "installed" is a misleading term in the web context: JavaScript
libraries aren't _installed_, they are _integrated_ into the site, which
means an upgrade isn't free; it's a semi-new integration effort.

------
FloatArtifact
I would imagine it's because security issues get exposed in dependencies.

------
speedgoose
I try to run `npm-check -u` at least once a week. It takes some time when a
dependency has a breaking change, or when a project that I don't trust to
respect semantic versioning is updated and I have to check its git diff, but
in practice my projects have up-to-date dependencies without much effort.

------
rmist
I believe it's mainly inertia and an "if it ain't broke, don't fix it" mentality.

