
Many packages suddenly disappeared - xxkylexx
https://github.com/npm/registry/issues/255
======
racbart
PSA: Please be cautious, because this is an excellent opportunity for
malicious people to take over packages and inject malware.

Example:
[https://www.npmjs.com/package/duplexer3](https://www.npmjs.com/package/duplexer3)
which has 4M monthly downloads, just reappeared, published by a fresh npm user.
They have published another two versions since then, so it's possible they
initially republished the unchanged package but are now tampering with the code.

Previously the package belonged to someone else:
[https://webcache.googleusercontent.com/search?q=cache:oDbrgP...](https://webcache.googleusercontent.com/search?q=cache:oDbrgPbT5m0J:https://www.npmjs.com/package/duplexer3)

I'm not saying it's a malicious attempt, but it might be, and it very much
looks like one. Be cautious: you might not notice if some packages your code
depends on were republished with malicious code. It might take some time for
NPM to sort this out and restore the original packages.

~~~
maxander
And all this is happening just after the public release of a serious exploit
which allows malicious code to do all sorts of nefarious things _when it is
somehow installed on the target machine_. Hmm.

Given that there are hints, at least, that the problems were caused by some
particular developer's actions, I wonder about the security model for package-
managed platforms altogether now. If I were a big cybercrime ring, the first
thing I'd do would be, get a bunch of thugs together and knock on the front
door of a developer of a widely-used package; "help us launch [the sort of
attack we're seeing here] or we'll [be very upset with you] with this wrench."
_Is_ there a valid defense for a platform whose security relies on the
unanimous cooperation of a widely-scattered developer base?

~~~
ytpete
With cases like the current one, or the leftpad incident in 2016, I'm
surprised package registries still allow recycling old package names after a
package was deleted. Really seems like deleted packages should be frozen
forever - if the original author never recreates it or transfers ownership,
then people would have to explicitly choose to move to some new fork with a
new id.

But your point about pressuring or bribing package authors still stands as a
scary issue. Similar things have already happened: for example, Kite quietly
buying code-editor plugins from their original authors and then adding code
some consider spyware (see
[https://news.ycombinator.com/item?id=14902630](https://news.ycombinator.com/item?id=14902630)).
I believe there were cases where a similar thing happened with some Chrome
extensions too...

~~~
mst
> With cases like the current one, or the leftpad incident in 2016, I'm
> surprised package registries still allow recycling old package names after a
> package was deleted.

CPAN requires the old author to explicitly transfer or mark it abandoned-and-
available-to-new-owner.

For all the things wrong with perl5 (and I love it dearly, but I have spent
enough time with it that I can probably list more things wrong with it than the people who
hate it ;) it's always a trifle depressing to watch other ecosystems failing
to steal the things we got right.

~~~
Moru
This happens all the time. The new generation creates something cool because
what our parents created isn't cool any more, only to fail in exactly the same
spot as our parents did. Only, it was already solved in our parents' last
version. This goes for clothing design, cars, houses, kitchenware and so on,
as well as software. Just look at the microwave oven discussion earlier...

~~~
ragesh
Genuine question... What happened with the microwave oven?

~~~
Momquist
I think the GP is referring to this:
[https://news.ycombinator.com/item?id=16089865](https://news.ycombinator.com/item?id=16089865)

Modern microwave ovens have all adopted impractical and quirky new UIs, when
the old concept of knobs was simple and worked fairly well in the first place.

~~~
Moru
My oldest one had just two dials. The second one, 15 years old, had loads of
buttons and stuff, really stupidly spread out: you had to press watts, minutes,
seconds, then start, and the start button was not in a corner, not in the
top or bottom row, or in any other logical place, so you had to search for it
every time. I glued a rubber piece to it so I could find it again without
having to bend down and search.

Since then I have made sure the microwave has two dials: one for time, one for
power.

~~~
YouKnowBetter
Remember the water kettle that had just an on & off switch?

Then came one with an option button for 80 or 100 degrees (176 or 212, in
freedoms). I never knew I needed that, but it changed my life and I cannot do
without it. Reason: 80-degree water is hot enough for my needs and saves time.

Our latest has 3 buttons with different possibilities, beeps like a maniac
when ready (an option which cannot be turned off) and can do things I never
knew anyone would need (like keeping it at x degrees for y minutes).

I guess it is like evolution: you experiment, keep what works and get rid of
all things unfit.

------
seldo
Hi folks, npm COO here. This was an operational issue that we worked to
correct. All packages are now restored:

[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)

~~~
yashap
Were any of the deleted packages temporarily hijacked? It seems strongly like
this was the case. If so, please confirm immediately so people who installed
packages during this time can start scanning for malware.

Even if the answer is “yes, 1+ packages were hijacked by not-the-original
author, but we’re still investigating if there was malware”, tell people
immediately. Don’t wait a few days for your investigation and post mortem if
it’s possible that some users’ systems have already been compromised.

~~~
electric_sheep
I would also hope for and expect this to be communicated ASAP from the NPM org
to its users.

@seldo, I understand that you don't want to disseminate misleading info, but
an abundance of caution seems warranted in this case as my understanding of
the incident lines up with what @yashap has said. If we're wrong, straighten
us out --- if we're not, please sound an advisory, because this is major.

~~~
yashap
Yeah, these were some core, widely used packages that were deleted. If they
were temporarily hijacked, lots of dev machines (including mine) may have been
compromised. There's a major security risk here; if there was any hijacking,
now is not the time for information hiding and PR.

------
rootlocus
> I was here.

> We made history! Fastest issue to reach 1000 comments, in just 2 hours.

> cheers everyone, nice chatting with you. 17 away from hitting 1000 btw!

> Is GitHub going to die with the volume of comments?

Kind of disappointed the NPM community is turning github into reddit right
now.

~~~
TeMPOraL
There's probably a large overlap between the two communities.

~~~
xauronx
Considering almost every human I know uses Reddit in some capacity (technical
and non-technical), that's pretty likely.

~~~
adjkant
You're in a really self-selecting crowd then. Less than half the people I know
use it, mostly because my social group is outside of the tech world.

~~~
TimothyBJacobs
Reddit is in the top 10 most popular websites according to Alexa. I'd venture
to say most reddit users aren't people in the tech world.

~~~
k__
I know exactly one person who uses it. To me it always seemed like 4chan-
light.

------
nukeop
NPM is extremely vulnerable to typosquatting. Be cautious with what you
install: install scripts can execute arbitrary code. The npm team's response
is that they hope malicious actors won't exploit this behaviour. According to
my tests, typosquatting 3 popular packages lets you take over around 200
computers in the 2 weeks it takes their moderators to notice.
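The install scripts mentioned here are npm's lifecycle hooks (`preinstall`, `install`, `postinstall`), which run automatically with your user's privileges on `npm install`. One blunt mitigation, sketched below, is to disable them via `.npmrc` (assuming your dependencies don't rely on native builds):

```ini
; .npmrc — refuse to run package lifecycle scripts during install
ignore-scripts=true
```

The trade-off: packages with native addons will then need an explicit `npm rebuild` (or a scripted exception) before they work.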

~~~
swang
this is not a response?

[http://blog.npmjs.org/post/168978377570/new-package-moniker-...](http://blog.npmjs.org/post/168978377570/new-package-moniker-rules)

~~~
nukeop
That's okay, but it's not enough - it's easy to swap two letters and do
similar substitutions to fool many users. If a package is downloaded 10,000
times every day, surely once in a while someone will misspell the name
somehow.
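As an illustration of how close typosquats can get, here is a minimal sketch (not anything npm actually runs; the "popular" list is made up) that flags dependency names within a small edit distance of well-known package names:

```javascript
// Illustrative only: flag names that are near-misses of popular packages.
const popular = ["lodash", "express", "request", "chalk"];

// Classic dynamic-programming Levenshtein distance.
function editDistance(a, b) {
  const dp = Array.from({ length: a.length + 1 }, (_, i) =>
    Array.from({ length: b.length + 1 }, (_, j) =>
      i === 0 ? j : j === 0 ? i : 0
    )
  );
  for (let i = 1; i <= a.length; i++) {
    for (let j = 1; j <= b.length; j++) {
      dp[i][j] = Math.min(
        dp[i - 1][j] + 1, // deletion
        dp[i][j - 1] + 1, // insertion
        dp[i - 1][j - 1] + (a[i - 1] === b[j - 1] ? 0 : 1) // substitution
      );
    }
  }
  return dp[a.length][b.length];
}

// Warn on names that are close to, but not exactly, a popular name.
function suspiciousName(name) {
  return popular.some((p) => p !== name && editDistance(name, p) <= 2);
}
```

A swapped-letter name like `lodahs` is edit distance 2 from `lodash`, so it would be flagged, while the genuine name passes through untouched.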

Other than that, their reaction to similar incidents was to wait for somebody
on twitter to notify them, ban the responsible users, and hope that it won't
happen again. It's still extremely exploitable and there are surely many other
novel ways of installing malware using the repository that we haven't even
heard of yet. The NPM security team is slow to act and sadly doesn't think
ahead. They're responsible for one of the largest software ecosystems in the
world, they should step up their game.

~~~
eropple
Yup. The best answer I can come up with given their constraints (some self-
imposed) is to force all new packages to be scoped.

~~~
johnny22
How many typosquats on scope names there will be, I wonder.

------
sergiotapia
Yikes, what is it about node/npm/javascript that makes it feel like a house of
cards?

~~~
foepys
I have recently taken over an Angular project (with a C# backend, thankfully)
at my job. It took two hours to get it to even compile correctly because some
dependencies were apparently outdated in package.json and it just ran on the
other dev's machine by accident. I don't understand why I need over 100
dependencies for a simple Angular Single Page App that pulls JSON from the
backend and pushes JSON back. Meanwhile, the C# backend (a huge, complicated
behemoth of software) ran on the first click.

~~~
ep103
Three developers on my team spent the last 4 years pushing for angular. Four
years ago, I was 50/50 on it vs react, so whatever, but if my team's really
for it, let's do it.

Fast forward to angular 2, and we're down to two developers who are still for
it.

Fast forward to today, I'm down to one angular dev who's still for it, and two
of the original three have left for react jobs. Meanwhile, I'm left with a
bunch of angular 1 code that needs to be upgraded to angular 2, and a few
testing-out-angular-2 projects that are dependency hell.

The only reason I ultimately embraced angular 1 to begin with (above reasons
aside), was because it was so opinionated about everything, I could throw it
at my weaker developers and say: "just learn the angular way to do it", and
there was very little left they could meaningfully screw up. Angular
proponents on the team would see it as a point of expertise to teach the
"angular way" to more junior devs, and everyone left the day feeling good.

When it comes to JavaScript, 95% of the difficulty of writing good,
maintainable code is ensuring that your team is all writing to a very exact
and consistent quality and style, since there are so many different ways you
can write JS, and so many potential pitfalls. And if the team all wants to
embrace Google's Angular standard, that works for me. It's far easier to be
able to point to an ecosystem with an explicit, opinionated way of writing
code than it is to continuously train people on how to write maintainable
code otherwise.

But with Angular 2, if you haven't been drinking the Kool-Aid for a while now,
it requires so much knowledge just to get running that I can't even have
junior devs work on it without a senior dev who's also an Angular fanboy there
to make sure everything is set up to begin with. It's absurd. And I'm supposed
to sell to the business that we need to migrate all my Angular 1 code to this
monstrosity? And then spend time again every 6 months making the necessary
upgrades to stay up to date? Get real.

~~~
Yuioup
I don't understand. We've started a new Angular 2+ project and our junior
developers managed to roll into it quite easily. Our designers (who know jack
about Javascript) got excited when they discovered that our project uses .scss,
and the results have been spectacular.

Seriously, I REALLY REALLY don't get this hate for Angular 2+

~~~
jbg_
Just wait until Angular 2 hasn't been cool for a while and you can't find any
JS developers who are interested in maintaining your software rather than
rewriting it in xyz_latest_fad_framework.

------
JepZ
Btw. for those who don't know:

Yarn (which is an alternative to npm) uses a global cache [1] on your machine
which speeds things up, but probably also protects you from immediate problems
in cases like the one currently in progress (because you would probably have a
local copy of e.g. require-from-string available).

[1]
[https://yarnpkg.com/lang/en/docs/cli/cache/](https://yarnpkg.com/lang/en/docs/cli/cache/)
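For reference, the cache location is configurable; a sketch of the relevant `.yarnrc` entry (the path is just an example):

```ini
# .yarnrc — where Yarn keeps its global package cache (example path)
cache-folder "/var/cache/yarn"
```

Combined with `yarn install --offline`, installs can succeed even while the registry is down, as long as the cache already holds the needed tarballs.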

~~~
ris
Already counting down the days before yarn is considered old and broken and
people are recommending switching to the next hot package manager/bundler...

~~~
hateduser2
It baffles me that technologists commonly complain about new technology. As
far as I can tell your complaint boils down to “people should stop making and
switching to new things”.. I find it hard to understand why someone with this
attitude would be a technologist of any kind, and I find the attitude really
obnoxious.

~~~
TeMPOraL
Because each thing has a constant price in learning effort that is
familiarizing yourself with its idiosyncrasies, which you have to pay even if
you're experienced in the domain. When tools constantly get replaced instead
of improved, you keep paying that price all the time.

~~~
klibertp
> Because each thing has a constant price in learning effort

That's not, in my experience, how it works. Learning your first tool (or
language) takes a lot of time. Learning your second is quicker. By the tenth,
you're able to learn it by skimming the README and changelog.

It works like this for languages too, at least for me. My first "real"
language (aside from QBasic) was C++ and it took me 3-4 years to learn it to
an acceptable degree. Last week I learned Groovy in about 4 hours.

It still "adds up", but to a much lower value than you'd think.

~~~
TeMPOraL
But it does, you're just focusing on the other component of learning.

Put another way, for a new tool, learning cost is a sum of a) cost of learning
idiosyncrasies of that tool, and b) cost of getting familiar with the concepts
used by it.

You're talking about b), which is indeed a shared expense. But a), by
definition, isn't. And it's always nonzero. And since new tools are usually
made to differ from previous ones on purpose ("being opinionated", it's
called), even though they fix some minor things, this cost can be meaningful.
And, it adds up with every switch you need to do.

Some of it is a normal part of life of a software developer, but JS ecosystem
has taken it to ridiculous extremes.

~~~
klibertp
My argument is that the a) part's cost is indeed non-zero, but - contrary to
what you say - trivial in a vast majority of cases. It's just my personal
experience, but it happened every single time I tried to learn something:
learning "what" and "why" took (potentially a lot of) time, but learning "how"
was a non-issue, especially if a "quick reference" or a "cheat sheet" was
available. I also disagree that the a) part is never shared between tools:
there are only so many possible ways of doing things, but a seemingly infinite
supply of tools for doing them. The idiosyncrasies are bound to get repeated
between tools and, in my experience, it happens pretty often.

As an example, imagine you're learning Underscore.js for the first time. It's
a mind-blowing experience, which takes a lot of time because you have to learn
a bunch of crazy concepts, like currying, partial application, binding, and
others. You _also_ have to learn Underscore-specific idiosyncrasies, like the
order of arguments passed to the callback functions and the like - mostly
because you are not yet aware which things are important to know and which are
just idiosyncrasies.

Now, imagine you know Underscore already and have to learn Lo-dash or
Ramda.js. As the concepts remain very similar, you only need to learn a few
conventions, which are different in Ramda. But! Even then, you don't have to
really learn all of them to use the library effectively. It's enough to keep
the diff of the Underscore and Ramda conventions in mind: learning that, for
example, the order of arguments passed to callbacks differ is enough; you can
then check the correct order in the docs whenever you need. You know where to
find that piece of information, you know when it matters and, by extension,
when it's not a concern. There is no need to waste time trying to learn
trivia: not doing something is always going to be the fastest way of doing it.
By your second library, you start to recognize trivia and are able to separate
it from information that matters. Learning prelude.ls afterward is going to
take literally 30 minutes of skimming the docs.

This is just an example, but it worked like that for me in many cases. When I
switched from SVN to Bazaar, for example, it took quite a bit of time to grok
the whole "distributed" thing. When I later switched from Bazaar to Git it
took me literally an hour to get up to speed with it, followed by a couple
more hours - spaced throughout a week or two - of reading about the more
advanced features. Picking up Mercurial after that was more or less automatic.

I guess all of this hinges upon the notion of the level of familiarity. While
I was able to use bzr, git and hg, it only took so little time because I
consciously chose to ignore their intricacies, which I knew I wouldn't need (or
won't need straight away). On the other hand, you can spend months learning a
tool if your goal is a total mastery and contributing to its code. But the
latter is very rarely something you'd be required to do, most of the time the
level of basic proficiency is more than enough. In my experience, the cost of
reaching such a level of proficiency becomes smaller as you learn more tools
of a particular kind.

That's the reason I disagree with your remark that that cost is "constant".
It's not, it's entirely dependent on a person and the knowledge they
accumulated so far. Learning Flask may take you a week if you're new to web
development in Python, but you could learn it in a single evening if you
worked with Bottle already. On a higher level, learning Elixir may take you
months, but you could also accomplish it in a week, provided that you already
knew Erlang and Scheme well.

So that's it - the cost of learning new tools may be both prohibitive and
trivial at the same time, depending on the prior knowledge of the learner. The
good thing about the "prior knowledge and experience" is that it keeps growing
over time. The amount of knowledge you'll have accumulated in 20 years is
going to be vast to the extent that's hard to imagine today. At that point,
the probability of any tool being genuinely new to you will hit rock bottom
and the average cost of switching to another tool should also become
negligible.

To summarize: I believe that learning new tools gets easier and easier with
time and experience and - while never really reaching 0 - at some point, the
cost becomes so low that it doesn't matter anymore (unless you have to switch
really often, of course).

------
izacus
Hmm, in the Java world we pretty much always used a local (company-owned) Maven
proxy server, which grabbed packages from public repos and cached them locally
to make sure builds still work if public servers were down or slow... or
packages disappeared.

This isn't a standard practice in JS world?

~~~
tomjakubowski
I’ve worked at places where the Java devs used Maven Central directly. I’ve
also worked at a place where the Node devs use an on-premises copy of
dependencies for builds and deploys.

It might not be as standard a practice in the Java world as you think.

~~~
brown9-2
Where did those Java devs who pulled from Maven central directly publish their
artifacts?

~~~
stephengillie
Possibly Sonarqube Nexus. The Java devs at my workplace use Sonarqube along
with Jenkins and Maven on the same server. I believe they communicate through
the shared directory on the file system.

(Pet peeve: _another_ product named "Nexus". Please choose original names for
your software.)

~~~
trampi
It's Sonatype Nexus, Sonarqube is the code quality checker.

------
krzyk
So they didn't learn anything from the left-pad situation 1.5 years ago?

Packages that are published should be immutable, just like in maven repo case.

~~~
christophilus
They don't allow removal of packages. This is likely a cascading storage
failure or something along those lines (or else a major hack).

------
throwaway66666
I never understood the love for package managers that directly hook into and
import things into your codebase or repo, or even worse your servers. I guess
the benefit is that "it just works", but the fact that you do not know where a
package is coming from surely can't worry just me.

In my company we take the stable version of the library we want to use and
self-host it. We have basically added a cache that we manage, and we control
what goes into it instead of just trusting a manager. Especially for
server-side deployment this is mandatory for security. Things like, say,
ffmpeg we never get from random packages; we host them ourselves.

~~~
greenyoda
Just today, someone posted an article about how malware can be distributed via
npm:

"Harvesting credit card numbers and passwords from websites"

[https://news.ycombinator.com/item?id=16084575](https://news.ycombinator.com/item?id=16084575)

If you self-host a stable version, you'll have some time to hear about
potential problems in a new version before updating it.

------
_Marak_
We really need to hear from NPM why this happened.

There is currently no way for a user to remove their own packages or unpublish
packages from the public NPM API (a change following the `left-pad` incident).

This leads me to believe this was an internal NPM error. My guess is employee
error.

~~~
8n4vidtmkvmk
Whaaa...? I swear I used `npm unpublish` several times in the past year.

Yeah, it definitely exists:
[https://docs.npmjs.com/cli/unpublish](https://docs.npmjs.com/cli/unpublish)

~~~
_Marak_
Only for a version less than 24 hours old. You can no longer remove
established packages.

A quote from the documentation page you linked:

> With the default registry (registry.npmjs.org), unpublish is only allowed
> with versions published in the last 24 hours. If you are trying to unpublish
> a version published longer ago than that, contact support@npmjs.com.

~~~
IncRnd
> _Only for a version less than 24 hours old. You can no longer remove
> established packages._

Apparently, someone can remove established packages.

------
xxkylexx
> Update - Most of the deleted packages have been restored and installation of
> those packages should succeed. Nine packages are still in the process of
> restoration.
>
> Jan 6, 20:12 UTC

[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)

~~~
djsumdog
From the Github issue:

> Beginning at 18:36 GMT today, 106 packages were made unavailable from the
> registry. 97 of them were restored immediately. Unfortunately, people
> published over 9 of them, causing delays in the restoration of those 9. We
> are continuing to clean up the overpublications. All installations that
> depend on the 106 packages should now be working.

Hard to believe less than a hundred packages cause so many issues. NPM's
dependency hierarchy is pretty insane.

~~~
zbentley
> Hard to believe less than a hundred packages cause so many issues. NPM's
> dependency hierarchy is pretty insane.

Ever heard of glibc?

------
antonkm
Gah. Moments like these always give me a bit of panic, since I realize that
so much of my software relies on external sources.

Relying on npm, Atlassian/GitHub etc. really hurts when stuff like this
happens. Issues always get resolved, but cases such as the GitLab incident
should be enough reason to always keep some local copies around.

~~~
mschuster91
> Gah. Moments like these always give me a bit of panic, since I realize that
> so much of my software relies on external sources.

Install an instance of Sonatype Nexus, create a proxy-repo for npm (and Maven
if you also use Java) and that's it.

What, however, won't be caught is Docker (because that crap insists on
directly talking to the Dockerhub servers, which is a giant security hole
waiting to happen) and PHP composer (because it likes to pull dependencies via
git from GH, so no caching there).
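For the npm side, pointing clients at the proxy is a one-line `.npmrc` change; a sketch (the hostname and repository name are assumptions — match them to your Nexus setup):

```ini
; .npmrc — resolve all packages through an internal read-through proxy
registry=https://nexus.internal.example.com/repository/npm-proxy/
```

The proxy then fetches and caches each tarball on first request, so later installs work even if the public registry is down.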

~~~
panarky
Or just don't .gitignore node_modules, then diff any changes to node_modules
on update.

~~~
mschuster91
Does not work as soon as you use node modules that come with native components
that have to be recompiled for the machine, and there are _many_ of these.

Colleagues have been bitten by this - one used OS X 10.11, the other 10.12,
and they experienced weird bugs from this. Went away once they kicked out
node_modules from git.

~~~
tomjakubowski
Yeah, it’s an annoying problem. Maybe you could gitignore the *.node (the
native module file extension) files only. But I’m not sure how you’d rebuild
those “on demand” after a checkout without running 'npm install' from the top
level.

~~~
nawitus
I suppose npm rebuild would work.
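That combination can be sketched as follows (the ignore patterns assume native addons keep their compiled output in `.node` files and `build/` directories, which is the common node-gyp layout):

```gitignore
# .gitignore — commit node_modules, but not machine-specific
# compiled output of native addons
node_modules/**/*.node
node_modules/**/build/
```

After a fresh checkout, `npm rebuild` recompiles just the native parts against the local toolchain, without touching the registry for the JavaScript sources.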

------
chrisper
Could someone please add NPM to the title?

~~~
xxkylexx
I had "NPM Registry:" in the title originally, but someone edited it.

------
chrisweekly
> " Several packages including "require-from-string" are currently
> unavailable. We are aware of the issue and are working to restore the
> affected user and packages. Please do not attempt to republish packages, as
> this will hinder our progress in restoring them. Posted 4 minutes ago. Jan
> 06, 2018 - 19:45 UTC

"

[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)

------
olingern
Late to the party, but can't wait for the technical write up on this.

I think npm has been a headache for everyone at some point, which is one of
the main reasons I started contributing to Yarn. I think npm has done a lot of
good work in the past year to respond to the necessary change, so kudos to
them for their work; however, it's nowhere near the rock-solid package manager
that we need. If the JavaScript ecosystem is ever to be taken seriously, and
not as a toy, it has to be more reliable.

Ergonomically, I currently think it's ahead of many other package managers
because of how simple it is to get running. The number of gotchas after
_npm install_ is nothing to shake a stick at, though.

One of the things you can do to get builds that aren't as susceptible to npm
registry issues is configuring an offline mirror [1].

From the post:

" _Repeatable and reliable builds for large JavaScript projects are vital. If
your builds depend on dependencies being downloaded from network, this build
system is neither repeatable nor reliable.

One of the main advantages of Yarn is that it can install node_modules from
files located in file system. We call it “Offline Mirror” because it mirrors
the files downloaded from registry during the first build and stores them
locally for future builds._"

1 - [https://yarnpkg.com/blog/2016/11/24/offline-mirror/](https://yarnpkg.com/blog/2016/11/24/offline-mirror/)

------
sebazzz
Can anyone explain why the npm registry still exists if it cannot guarantee
that uploaded packages remain available? The current state makes it pretty
useless as a reliable source to base software on, because you never know if
you'll be able to build it again in the future.

They should take a good hard look at NuGet, which does not allow packages to
be deleted so builds are guaranteed to be reliable. Still doesn't hurt to
locally cache packages with software such as Klondike.

~~~
ilaksh
They don't allow packages to be deleted. A bug or server issue or mistake
caused this. This type of problem has occurred only once before. In general
npm has been extremely reliable and performant.

~~~
paulddraper
When was the only other time this problem happened?

------
xxkylexx
[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)

------
fiatjaf
How are these going, by the way?

- [https://github.com/elsehow/gx-js](https://github.com/elsehow/gx-js)

- [https://github.com/ipmjs/ipmjs](https://github.com/ipmjs/ipmjs)

- [http://everythingstays.com/](http://everythingstays.com/)

- [https://github.com/pnpm/pnpm](https://github.com/pnpm/pnpm)

- [https://github.com/nginnever/ippm](https://github.com/nginnever/ippm)

------
jimjimjim
I said this back in the left pad days

Store all of your dependencies locally.

If something disappears then at least you can continue until you find a
replacement.

~~~
tootie
What do folks use these days? Artefactory? Nexus?

~~~
richardknop
You can store dependencies in version control so you can continue working when
there is problem with remote package manager repositories as you just checkout
last working version with all dependencies from git.

------
andrewaylett
While it may not be the right time to _start_, incidents like this are an
excellent reason to consider an internal read-through proxy package
repository. The last couple of organisations I've worked with have used
Artifactory: [https://jfrog.com/artifactory/](https://jfrog.com/artifactory/)

------
z3t4
And people think I'm crazy for keeping packages in the SCM repo. npm gets so
much abuse, with people depending on it without paying a dime. At least put up
a self-hosted caching proxy if you depend so much on npm for your operations.

~~~
sashaafm
I've been thinking that it's a good idea to do that lately...

------
jogjayr
In my org, we use Artifactory as a cache between us and external sources. They
have a free version too. I'd encourage everyone to use it, or something like
it. Stop pointing your package managers to the public registry.

~~~
matharmin
What gives you more confidence in them? Just a better track record, or is it a
fundamentally more reliable model?

~~~
bastijn
It is fundamentally more secure, as it functions as a privately controlled
proxy for the public repo. It also solves some other gotchas, such as people
pulling a left-pad joke on you, and gives you reproducible installs: since all
packages are cached, your build servers and dev systems get the same version
of every package (particularly if used with a shrinkwrap kind of solution,
though even without one if handled properly).
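On the "shrinkwrap kind of solutions" point: `npm shrinkwrap` pins the entire resolved dependency tree, including where each tarball was fetched from. An abridged, illustrative `npm-shrinkwrap.json` entry (the app name and structure here are examples, not from any real project):

```json
{
  "name": "example-app",
  "version": "1.0.0",
  "dependencies": {
    "duplexer3": {
      "version": "0.1.4",
      "resolved": "https://registry.npmjs.org/duplexer3/-/duplexer3-0.1.4.tgz"
    }
  }
}
```

With the tree pinned like this, a package republished under the same name with a different version can't silently slip into a build.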

------
xab9
With npm 4, things went south and never came home for us. We use Macs, PCs and
Linux machines, and nowadays we fear `npm i` like the plague. I don't care if
it's the registry, the executable, the stupid package-lock.json, a node-npm
version mismatch or an installer script; the end result is frustration.

------
taurath
This is insane. This is like Google changing their v1 APIs, except worse since
ANYONE could come in and put new malicious APIs up in its place. I say this as
a firm supporter of Node and the ecosystem - this should NEVER EVER be allowed
to occur. This completely erodes the trust model based around "popular
packages" even further - the only saving grace is that hopefully most devs are
shrinkwrapping their modules.

------
Iv
I wish the NPM community would grow some humility and learn some lessons from
how the Debian ecosystem was built. Have sane licensing that allows mirroring,
have cryptographic hashes of packages, have open governance.

------
mirekrusin
2018 looks interesting, everything seems suddenly broken.

~~~
hrpnk
broken concepts get re-validated :)

------
ris
And _this_ is why I avoid "package managers" that follow the wild-west model
like the plague.

~~~
zbentley
What's the wild-west model? Write it all yourself? "cp -r $dependency_location
$install_location"? Genuinely curious.

~~~
ris
The "wild west" model is where there is no maintainer or distributor between
the developer and consumer that is allowed to perform any sort of quality
control or sanitisation. That sounds good from a naive standpoint - who needs
this busybody middleman anyway? But the problem is that authors tend not to be
great maintainers. Authors can (and do) remove packages at any time, make
changes to packages without bumping version numbers, upload subtly broken
versions or possibly make user-hostile changes which the community can then do
nothing about, short of creating a fork (which makes switching dependencies
over to a different package name messy). And that's not even to go into
typo-squatting.

In short, package authors don't tend to care about much more than getting
_their_ package to work, somehow, anyhow. Often only the latest version of
that package, too. And they don't always have an eye on interoperability with
other packages, or consistency across a collection. Maintainers who create a
"distribution" of software that works well together can collaboratively make
decisions that are in the community's best interest. The "wild west" model is
unilateral, the "maintained" model is multi-lateral.

~~~
krapp
>The "wild west" model is where there is no maintainer or distributor between
the developer and consumer that is allowed to perform any sort of quality
control or sanitisation.

That's not entirely true. Maintainers or distributors aren't _required_ under
the "wild-west" model, but that's not the same as anything being disallowed.
It's up to the community and the developer to do their own due diligence. The
"wild west" model is just the free software model: the lack of a central
authority limiting user freedom for the good of the community.

Rather, it's the "distribution" model which forbids anything not approved by
the list of official maintainers. All of the problems you list with package
authors still exist, but you have fewer options as a developer should they
arise.

------
msoad
Why Node.js ships with a client for a for-profit company still baffles me. The
NPM team has proven time and time again that they are not competent enough to
handle this responsibility, yet they are given a free ride by the Node.js
Foundation.

Node.js package manager SHOULD BE COMMUNITY OWNED/DRIVEN

------
batina
There are over 700 comments on this issue on GitHub. It is turning into live
chat room.

[https://github.com/npm/registry/issues/255](https://github.com/npm/registry/issues/255)

Be part of the history :)

EDIT: there are now over 1100 comments/memes.

~~~
cesarb
The final count was 1263 comments, and no more will be added now that it's
closed and locked.

------
floatingatoll
NPM specifically asks people to not try and republish the broken packages
while they repair it. Incident status:

[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)

------
meritt
The sheer number of software development organizations who cannot function
when github or their package repository happens to be unavailable (for
whatever reason) is incredibly disheartening.

------
mbrumlow
And this is why you vendor your dependencies. Nothing in production should
ever require any external service to build and run.

You should also not be cowboy updating things just because there is an update.

------
themtutty
Was just discussing this elsewhere online. Package management is broken (or
incomplete, depending on your viewpoint). What's needed IMO is the following:

1\. Allow a single package file, including multiple clauses (or sub-files,
whatever) for different languages. Let me manage my Angular front-end and
Flask back-end in the same file. A single CLI tool as well - Composer and
Bower aren't all that different.

2\. Be the trusted broker, with e.g. MD5 checking, virus scanning, some kind
of certification/badging/web of trust thing. Let developers know if it's
listed, it's been vetted in some way.

3\. Allow client-side caching, but also act as a cache/proxy fetch for package
retrieval. That way, if Github or source site is down, the Internet doesn't
come to a screeching halt. I see the value of Satis, but it's a whole
additional tool to solve just one part of this one problem.

4\. Server-side dependency solver. Cache the requests and give instant answers
for similar requests. All sorts of value-adds in analytics here, made more
valuable by crossing language boundaries.

5\. Act as an advocate for good semver, as part of the vetting above.

NOTE: These features are not all-or-nothing, I believe there's value from
implementing each one on its own. Also note that nothing here should lock
people into one provider for these services. There's a market to be made here.

------
Moter8
Could we have a better title perhaps, giving at least "[npm]" or something as
a hint?

------
uallo
Here is the official response from npm:

[http://blog.npmjs.org/post/169432444640/npm-operational-
inci...](http://blog.npmjs.org/post/169432444640/npm-operational-
incident-6-jan-2018)

TL;DR: "no malicious actors were involved in yesterday’s incident, and the
security of npm users’ accounts and the integrity of these 106 packages were
never jeopardized."

A more detailed report will follow in the next days.

~~~
mohsen1
They are claiming the issue is resolved, but I'm still not seeing a package
that my package depends on:

[https://github.com/mohsen1/json-formatter-
js/pull/58#issueco...](https://github.com/mohsen1/json-formatter-
js/pull/58#issuecomment-355840929)

~~~
uallo
I've never used pinkie before. But according to its GitHub page, there is no
version 2.0.5:

[https://github.com/floatdrop/pinkie](https://github.com/floatdrop/pinkie)

It seems that someone took over the package during its absence from npm and
deployed a version 2.0.5. Maybe to avoid any malicious takeover. But there is
no version 2.0.5 anymore.

------
mjrpes
I don't use NPM (or community-managed package managers in general), but does
anyone know why there isn't an LTS feature for packages? So that, when searching
packages, if a package is flagged as LTS, you know that it and all its
dependencies have long term support and there are contingencies on what
happens if the package is abandoned. Obviously, there would need to be a
community that reviews and approves packages that aim to be LTS.

~~~
styfle
There are unofficial stability badges[0] that I have seen some packages use.
For example, xtend[1] is locked.

[0]: [https://github.com/badges/stability-
badges](https://github.com/badges/stability-badges)

[1]:
[https://www.npmjs.com/package/xtend](https://www.npmjs.com/package/xtend)

------
paultopia
Stupid question from non-pro here: everyone's always like "never commit
libraries into source control." But, um, this kinda thing?

~~~
kstenerud
Don't store libraries in your project's repository. It bloats things like hell
and makes it difficult to navigate the change sets. Set up your own library
cache. Store that cache in its own repo if that floats your boat. Then all of
your projects can get their dependencies from your cache.

~~~
delfinom
Well, if you're in another language that has an actual standard library (so
you don't need 500 packages to make up for it), and you only have a few
dependencies that are well packaged and don't need frequent updates:

Just commit them; don't make things a mess.

------
EGreg
Sometimes your project becomes bigger than you, and perhaps private ownership
isn't the best way to handle that anymore:

[http://magarshak.com/blog/?tag=identity](http://magarshak.com/blog/?tag=identity)

 _Well, Q allows you to choose between “each individual publishes their own
stream” and some degree of “centralized publishing” by management teams of
groups. So who should publish a stream, the individual or the group?

If the individual - the risk is that the individual may have too much power
over others who come to rely on the stream. They may suddenly stop publishing
it, or cut off access to everyone, which would hurt many people. (I define
hurt in terms of needs or strong expectations of people that form over time.)

If the group - then managers may come and go, but the risk is that if the
group is too big, it may be out of touch with the individuals. The bigger risk
is that the individuals are forced to go along with the group, which may also
create a lot of frustration. For instance, the group may split into
three sub-groups. They are deciding where to go, but some people want to go
bowling, others want to go to the movies, others want to volunteer in a soup
kitchen. Even though everyone belongs to the group. Who should publish these
activities?

So I think when it comes to publishing streams that others can join, there
should be some combination of groups and individuals. And it should reflect
the best practices of what happens in the real world: one person starts a
group that may later become bigger than him. Then this group grows, gets
managers etc. After a while this person may leave. In the future, other
individuals may want to start their own groups and invite some members of the
old group to join. They may establish relationships between each other,
subscribe to each other’s streams, pay each other money, etc._

------
the_duke
Several packages just disappeared, and now some are reappearing, potentially
uploaded by different users (!).

See
[https://github.com/npm/registry/issues/255](https://github.com/npm/registry/issues/255)
for details.

Very annoying, breaks builds all over, also prevents installing react-native.

------
0x0
The status page at
[https://status.npmjs.org/incidents/41zfb8qpvrdj](https://status.npmjs.org/incidents/41zfb8qpvrdj)
says that "We apologize for the temporary unavailability of some packages.".

If this was only a matter of missing packages, this would "only" be a matter
of breaking builds.

But it looks like third parties were able to take over the missing packages,
see
[https://github.com/npm/registry/issues/256](https://github.com/npm/registry/issues/256)
\- which is a HUGE deal, considering "npm install" blindly executes the
scripts in a package's preinstall property (as well as the packaged module
itself possibly containing arbitrary backdoors)
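To make the risk concrete, here is a hedged sketch of that attack surface (the package and script names are hypothetical): npm runs lifecycle hooks like `preinstall` automatically, with the installing user's privileges, before anyone has reviewed the code.

```javascript
// Hypothetical manifest of a hijacked package. npm executes the
// "preinstall" hook automatically during `npm install`:
const manifest = {
  name: "some-hijacked-package",                // hypothetical name
  version: "2.0.5",
  scripts: {
    preinstall: "node ./collect-credentials.js" // hypothetical payload
  }
};

// The hooks npm would run at install time:
console.log(Object.keys(manifest.scripts)); // → [ 'preinstall' ]
```

`npm install --ignore-scripts` disables these hooks, at the cost of breaking packages that legitimately need an install-time build step.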

------
partycoder
This is why you should depend on exact versions whenever possible. But even if
you do, your dependencies most likely won't, so you are screwed anyways.

The caret syntax for auto-upgrading to the next minor version is the open door
to a world of bullshit.
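Roughly, a caret range accepts anything up to the next breaking-change digit. A simplified sketch of the matching rule (not npm's actual semver implementation; no prerelease handling):

```javascript
// Compare two [major, minor, patch] triples.
function cmp(a, b) {
  for (let i = 0; i < 3; i++) {
    if ((a[i] || 0) !== (b[i] || 0)) return (a[i] || 0) - (b[i] || 0);
  }
  return 0;
}

// "^1.2.3" means >=1.2.3 <2.0.0; for 0.x versions, the left-most
// non-zero digit is the breaking one.
function caretSatisfies(range, version) {
  if (!range.startsWith("^")) return false;
  const base = range.slice(1).split(".").map(Number);
  const v = version.split(".").map(Number);
  let pivot = base.findIndex(n => n !== 0);
  if (pivot === -1) pivot = base.length - 1;
  for (let i = 0; i <= pivot; i++) {
    if (v[i] !== base[i]) return false; // breaking-change digits must match
  }
  return cmp(v, base) >= 0;             // and be at least the base version
}

console.log(caretSatisfies("^1.2.3", "1.3.0")); // → true  (auto-upgraded)
console.log(caretSatisfies("^1.2.3", "2.0.0")); // → false (breaking)
console.log(caretSatisfies("^0.1.4", "0.2.0")); // → false (0.x: minor is breaking)
```

Note the 0.x special case: below 1.0.0 the minor digit counts as breaking, which is why so many micro-packages sit on `^0.x` ranges.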

------
danso
Does anyone know what happened to the author? It seems he was still on Twitter
as of yesterday (Jan. 5), responding to someone about a merge request:

[http://archive.is/JEG90](http://archive.is/JEG90)

------
yladiz
I don't remember the intricacies of NPM or Yarn, but don't one/both of them
have resource integrity enabled, so that you know that the package that's
being installed is the one in your lock file? If not, why isn't this a feature
especially after the clusterfuck of the guy deleting all his packages back
about two years ago, breaking tons of things including Babel and React?

This wouldn't fix the issue of someone deleting the actual package (this
happened here?), but it would prevent some malicious code being installed if
someone uses the same package name.

~~~
applecrazy
Can you link me to the incident where a person deleted their packages and
broke Babel? I'd love to read about it.

Edit: grammar

~~~
mort96
[https://www.theregister.co.uk/2016/03/23/npm_left_pad_chaos/](https://www.theregister.co.uk/2016/03/23/npm_left_pad_chaos/)

left-pad was a package to, you guessed it, pad a string with n leading
characters. Personally, I've always just written my own 2 line function for it
(something like `function pad(s, n, ch) { return new Array(Math.max(0, n -
s.length)).fill(ch).join("") + s; }`), but a bunch of packages either directly
or indirectly depended on this left-pad package, so they all broke.

~~~
applecrazy
Packages broke because of a _literal two line function?_ That's hilarious and
terrifying at the same time.

~~~
mort96
Well... No. the left-pad function is 11 lines. The source code, as it was back
then, according to that register article, was like this:

    function leftpad (str, len, ch) {
      str = String(str);
      var i = -1;
      if (!ch && ch !== 0) ch = ' ';
      len = len - str.length;
      while (++i < len) {
        str = ch + str;
      }
      return str;
    }

But yes, packages broke because of what _could_ have been implemented in one
line (ignoring the two lines for the function signature and closing curly).
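For what it's worth, the one-liner exists natively now: ES2017's `String.prototype.padStart` didn't exist when left-pad was written, which is part of why the package did.

```javascript
// Native replacement for left-pad (ES2017+). padStart never truncates:
// if the string is already at least `len` long, it is returned unchanged.
const leftpad = (str, len, ch = " ") => String(str).padStart(len, ch);

console.log(leftpad("17", 5, "0")); // → "00017"
console.log(leftpad("abc", 2));     // → "abc"
```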

------
jlgaddis
Did NPM not learn from the leftpad incident?

------
tonetheman
If you have a build system or a production system that relies on npm to be
there, you are an idiot. Find your boss and tell him to fire you.

Vendor the dependencies that are needed to build and run your applications.
Period.

------
edem
I just don't understand how this can happen. In Maven Central for example
(Java) if you publish a package it is immutable and stays there until nuclear
fire immolates the Earth.

~~~
zbentley
Unless I'm misunderstanding something about Central's architecture, it's not
fundamentally different from NPM in this regard, though signing appears a bit
more feasible.

Which means that it's not a technical difference. Maybe Central has been
compromised/had issues before, just long ago (it's certainly much older).
Maybe there are things wrong with NPM-as-a-company even if NPM-as-a-technology
is fine. Maybe it's just luck.

But "stays there until nuclear fire immolates the Earth" sounds a bit much
like "this ship is completely unsinkable" for my liking.

~~~
edem
Maybe I'm a bit biased, but I've never heard of something like this happening
in Javaland. So packages are supposed to be immutable on npm as well?

------
peterwwillis
You know, back in the "old days", we used to host packages on these sites
called "mirrors", so when one went down, we could get the package from
another, and verify authenticity using multiple sources and signed files.
There would be hundreds of mirrors for one set of files.

Kind of funny how shitty modern technology is. But I heard a quote recently
that kind of explains it: "The more sophisticated something is, the easier it
is to break it."

~~~
christophilus
Yup. That's a good solution. P2P would probably be a decent solution, too--
bit torrent or block chain or whatever variant.

------
gargravarr
How to bring down half the internet - randomly delete your NPM packages, then
stand back and watch millions of web developers scream in frustration.

What an ecosystem we have built.

------
krapp
This may be a stupid question - I'm not that familiar with NPM or modern
javascript development so forgive me, but does it not allow storing your
dependencies locally? Is that not considered best practice? Just download your
entire dependency tree and don't touch it unless you have to.

It seems to me that if packages "disappear" from upstream, it shouldn't have
any effect other than preventing an update due to the missing dependency.

~~~
christophilus
It does store them locally. I think the problems here are:

\- The missing packages can be replaced by someone who wasn't the original
package author (e.g. a malicious hacker)

\- It's not easy to catch this, because NPM doesn't have support for signing
versions in your project's dependency configuration... (I bet it will after
this.)

\- Almost every modern website has a dependency on NPM somewhere in their
build chain

\- NPM being down means loads of sites can't deploy properly

So yeah. This may be a really big deal.

~~~
krapp
It might be easier to catch if packages were namespaced by author and package
name, or even directly by URL, the way Composer does with PHP dependencies.
It's easier to spoof 'infinity-agent' than 'floatdrop/infinity-agent' or
'github:floatdrop/infinity-agent'
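A sketch of why the qualified form is harder to spoof: the owner becomes part of the identity the resolver checks. (This parser is a toy for a hypothetical `host:owner/name` format, not npm's actual resolution logic; npm's real answer here is scoped packages like `@floatdrop/infinity-agent`.)

```javascript
// Parse "host:owner/name", "owner/name", or a bare "name" into parts.
function parseSpecifier(spec) {
  const m = /^(?:([\w-]+):)?(?:([\w-]+)\/)?([\w.-]+)$/.exec(spec);
  if (!m) throw new Error(`bad specifier: ${spec}`);
  const [, host, owner, name] = m;
  return { host: host || null, owner: owner || null, name };
}

// A bare name says nothing about who publishes it...
console.log(parseSpecifier("infinity-agent"));
// → { host: null, owner: null, name: 'infinity-agent' }

// ...while an owner-qualified one pins the author identity too:
console.log(parseSpecifier("github:floatdrop/infinity-agent"));
// → { host: 'github', owner: 'floatdrop', name: 'infinity-agent' }
```

With qualified names, a squatter would have to take over the `floatdrop` account itself, not merely re-register a vacated package name.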

------
fiatjaf
Well, the packages seem to be all from floatdrop[0].

[0]: [https://www.npmjs.com/~floatdrop](https://www.npmjs.com/~floatdrop)

------
7ewis
Glad we cache ours on ProGet now!

~~~
sheraz
Not a ProGet user here, but this definitely seems like a good idea.

What are the open-source / self-hosting options here? It gets a little messy
with all the sub-dependencies, doesn't it?

~~~
madebyherzblut
If you only need NPM support we are very happy with npm-register[1]. I also
heard good things about verdaccio[2].

Plus you also get private packages.

[1] [https://github.com/jdxcode/npm-register](https://github.com/jdxcode/npm-
register)

[2]
[https://github.com/verdaccio/verdaccio](https://github.com/verdaccio/verdaccio)

------
lamby
The latest Manifest.fm podcast episode is about typosquatting:

[https://manifest.fm/9](https://manifest.fm/9)

------
smoyer
The Maven Central repository for JVM dependencies doesn't share the problem of
packages being removed like NPM periodically has, but Adam Bien has been
instructing users to download the source-code for their dependencies and then
compile them into their own repositories for quite a few years.

I wish I'd taken his advice as there are a couple of JAR files that I can no
longer update.

------
tw1010
I don't understand much about the blockchain, but one thing I have heard is
that it's impossible (or very hard) to remove things from it. It is immutable,
sort of append only, if I understand it correctly. So my question is, is there
anyone working on moving npm to the blockchain? Or doing something like a
package manager on the blockchain? If not, why not?

~~~
fiatjaf
Your idea is awful, but you really shouldn't be downvoted.

It is better to use content hashes and a system that distributes and enforces
these, like IPFS.

Someone could just create some hooks for
[https://github.com/whyrusleeping/gx](https://github.com/whyrusleeping/gx) and
we would have it done.

~~~
j3097736
Already done: [https://github.com/diasdavid/npm-on-
ipfs](https://github.com/diasdavid/npm-on-ipfs), though it doesn't seem to be
maintained anymore.

~~~
fiatjaf
No. That's not what I'm talking about. That's just a way to host your own
snapshot of the entire npm registry. Not a good way to introduce the
decentralization feature of IPFS.

------
MaxLeiter
Couldn’t this be solved in the future by npm storing packages with user names?
I.e. “MaxLeiter/example” instead of “example”

~~~
zbentley
Well, we don't know what caused the issue yet, so we can't say for sure. But I
suspect that whatever family of problem deleted the packages causing this
trouble could just as easily apply to the deletion (and illegitimate
reclamation) of usernames.

------
feduzi
I don't get why we don't just use git repos (e.g. GitHub) as the package
registry. If you work in a "strict" environment, you can basically fork all
your dependencies and use your own git repos as the registry.

NPM already allows using git repos, but needs some tweaks to allow better
support:

* allow versioning via git tags

* store git commit in `package-lock.json`.

* maybe something else...

~~~
scottmf
What would you gain by storing the commit in the lock file?

You can reference commits in package.json already.

~~~
feduzi
For the purpose of a reproducible `node_modules` tree.

Ideally, if all packages used commits and the installation algorithm never
changed, there would be no need for lock files.

In reality, some packages will use NPM's existing mechanism, so the "git-based
algorithm" will need to accommodate that by reading the git repo of the NPM
package and referring to a specific commit, which should be stored in
`package-lock.json`.
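npm does already accept git URLs with a `#commit-ish` suffix in package.json, which gets most of the way to the proposal above (the repo path and SHA below are hypothetical placeholders):

```javascript
// npm resolves "github:owner/repo#commit-ish" dependency specifiers.
// Pinning an exact commit makes the dependency immune to registry deletions,
// as long as the git host keeps the repo around.
const pkg = {
  dependencies: {
    "some-lib": "github:some-owner/some-lib#6d2a7c0" // hypothetical repo/SHA
  }
};

// The pinned commit is whatever follows the "#":
console.log(pkg.dependencies["some-lib"].split("#")[1]); // → 6d2a7c0
```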

------
whack
Could someone explain why dependency-management-systems don't enforce
immutable releases? Ie, package owners can publish all the new versions they
want, but they are never able to edit/remove/liberate an already-released
version. It seems like that would solve so many problems, such as the left-pad
fiasco.

------
hrpnk
Would be interesting to see the load curves on github and npm servers due to
people and servers retrying downloads.

------
prh8
As someone unfamiliar with NPM, why does it not lock package names for a
certain period of time? Rubygems has a 90 day period, so if a package is
completely removed, the name can't be used for that long. That seems like it
would help with the security side of these problems.

~~~
allover
> As someone unfamiliar with NPM, why does it not lock package names for a
> certain period of time?

From [1]:

> With the default registry (registry.npmjs.org), unpublish is only allowed
> with versions published in the last 24 hours. If you are trying to unpublish
> a version published longer ago than that, contact support@npmjs.com.

I am kinda assuming that _if_ npm support were to help you unpublish a package
that is depended upon (they might refuse), they _would_ prevent someone else
from re-publishing to that name (they might put up their own placeholder
package, like they did during the left-pad incident), but granted I can't find
this stated anywhere.

I think the reason re-publishing seemed to happen in this case was they
weren't prepared for whatever vector allowed for the deletion of these
packages.

[1]
[https://docs.npmjs.com/cli/unpublish](https://docs.npmjs.com/cli/unpublish)

------
cimnine
When will people learn to host their own private mirror for things on the web
that they depend on?

------
fold_left
this issue is avoidable by using shrinkpack:

HN: Shrinkpack – npm dependencies as tarballs, prevents “left-pad” style
breakage -
[https://news.ycombinator.com/item?id=11353908](https://news.ycombinator.com/item?id=11353908)

------
mifreewil
I made a half-joking comment on that thread that "'Bout time NPM goes
blockchain." Either someone deleted it, or GitHub lost it among all the
traffic to that issue.

Wonder if npm, Inc. would view a decentralized registry as a threat to their
business model?

------
chrisper
Didn't something similar happen last year? I think it was packages with
similar names.

~~~
clowd
Yeah, there have been multiple typo-squatting incidents.

[http://blog.npmjs.org/post/163723642530/crossenv-malware-
on-...](http://blog.npmjs.org/post/163723642530/crossenv-malware-on-the-npm-
registry)

------
sublimino
Apparently the restore is now complete:

[https://github.com/npm/registry/issues/255#issuecomment-3557...](https://github.com/npm/registry/issues/255#issuecomment-355780429)

------
andrethegiant
Can't help but wonder if committing node_modules to your repo is now a good
idea...

~~~
TeMPOraL
Only if you have a file system that automatically deduplicates everything, or
_lots of hard drives_...

------
carlchenet
I wrote "The Github Threat" about this possible issue
[https://carlchenet.com/the-github-threat/](https://carlchenet.com/the-github-
threat/)

------
ben_jones
module.exports = typeof Promise === 'function' ? Promise : require('pinkie');

I can't even install webpack-dev-server. Because this package is missing.

EDIT: it's back

[1]: [https://stackoverflow.com/questions/48131550/nodemon-
install...](https://stackoverflow.com/questions/48131550/nodemon-install-
error-no-valid-versions-available-for-timed-out)

[2]:
[https://github.com/npm/registry/issues/255](https://github.com/npm/registry/issues/255)

~~~
0x0
But it's published by "puradox", not by "floatdrop"!

~~~
clon
This is utter madness...

------
h1d
Is there a possibility that npm turn package names into "author/package"
style, so there would be less confusion on what the users are installing and
less chance of name squatting?

------
failrate
Remember to freeze your packages after installing them as a project
dependency. You should have the packages in your source tree or your own
internal package manager (local nuget, for example).

------
avinium
Is there any chance of something similar happening for Nuget? I rely on it
heavily for project dependencies, and I'd like to know if there's a ticking
timebomb there too.

------
stephenr
I don't understand why npm even _has_ the facility for a user to remove public
packages.

It defeats the entire purpose of using a public repository.

------
JepZ
Wow, they comment faster there than one can read :D

------
aabbcc1241
I'd prefer to store dependencies on a permanent storage service like IPFS, so
they can work even when npm or GitHub are down.

------
bb88
So this is an npmjs issue; from the headline, I thought it was a broader issue
on GitHub.

------
atom-morgan
So _that's_ why my "npm install" on Heroku randomly started failing.

------
pwaai
Is pip also vulnerable to this type of exploit? Or is this unique to npm?

------
solidsnack9000
I would say we should be signing packages but...

------
iMuzz
What exactly happened here?

------
jgalt212
So has npmjs.com been hacked or what?

------
mcguire
Again?!?

------
methyl
Is Yarn with its cache affected?

~~~
theo31
yes it is

------
pmilot
F by xv v

------
shawn
So, funny story: I registered the "nazi" npm package. When you require it, it
says "I did nazi that coming." That's it. (Though it would've been a funny
name for a linter.)

... Or it did. I received a harshly worded letter from npm saying they axed
it. It hit all the talking points about inclusiveness and making sure no one
feels even slightly annoyed.

Meh. No point to this story. Just an interesting situation with an
inconsistently curated package manager. I was surprised there was an
unofficial undocumented banlist.

~~~
shrimp_emoji
Does "stalinist" work?

------
vitaliyf
Don't. Deploy. From. Internet.

~~~
StavrosK
What's the alternative? Have the maintainer snail-mail you the packages?

~~~
vitaliyf
You run a private NPM mirror where you copy dependencies that you rely on,
after auditing them (for code quality and licensing).

~~~
StavrosK
Wouldn't just pinning the hash of a package be a better solution?

~~~
tomjakubowski
That’s probably fine from the security perspective, but the hash won’t make
the package re-appear if it disappears out of nowhere. That’s the other
benefit of a private/on-premises mirror.

~~~
StavrosK
True. I work with PyPI and it's been extremely solid for years, so we tend to
just not consider this a problem at all. Pipenv stores hashes for each package
version as well, so you get the security aspect built in.

Pipenv has pretty much fixed Python packaging/dependencies, in my opinion.
It's the all-in-one tool I've always wanted. If you do any Python work, try
it, it's great.

