
A one-line package broke `npm create-react-app` - tessela
https://github.com/then/is-promise/issues/13
======
gav
Digging into the reason behind breakage, the change is this one:
[https://github.com/then/is-promise/commit/feb90a40501c8ef69b...](https://github.com/then/is-promise/commit/feb90a40501c8ef69b0c65bdf1eb703182214407#diff-b9cfc7f2cdf78a7f4b91a753d10865a2)

Which adds support for ES modules: [https://medium.com/@nodejs/announcing-core-node-js-support-f...](https://medium.com/@nodejs/announcing-core-node-js-support-for-ecmascript-modules-c5d6dc29b663)

However, the exports syntax requires a relative URL, e.g. './index.mjs', not 'index.mjs'. The fix is here: [https://github.com/then/is-promise/pull/15/commits/3b3ea4150...](https://github.com/then/is-promise/pull/15/commits/3b3ea4150d9a0788ed467d0c05a7e67f5070e32a)
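
For illustration (a sketch of valid syntax, not the exact contents of the patched file): Node's "exports" field only accepts paths with the "./" prefix, which is what the broken release omitted:

```json
{
  "main": "./index.js",
  "exports": {
    "import": "./index.mjs",
    "require": "./index.js"
  }
}
```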

~~~
searchableguy
I wonder why people won't use yarn zero installs. They are great for reproducible builds and can work offline. You can have a CI job and a git hook that check your code before deployment or before pushing to git.

Another way is to pin down the specific versions, without ~ or ^ in the package.json, so your updates don't break stuff.

~~~
jrochkind1
What's "yarn zero installs"? Googling did not do it for me.

~~~
acemarke
That might be referring to Yarn's "offline mirror" feature. When enabled, Yarn
will cache package tarballs in the designated folder so that you can commit
them to the repo. When someone else clones the repo and runs `yarn`, it will
look in the offline mirror folder first, and assuming it finds packages
matching the lockfile, use those.

This takes up _far_ less space than trying to commit your `node_modules`
folder, and also works better cross-platform.

I wrote a blog post about setting up an offline mirror cache a couple years
ago:

[https://blog.isquaredsoftware.com/2017/07/practical-redux-pa...](https://blog.isquaredsoftware.com/2017/07/practical-redux-part-9-managing-dependencies/#managing-dependency-packages-for-offline-installation)

Used it on my last couple projects at work, and it worked out quite well for
us.

~~~
freedomben
That's quite interesting, although back in the day we did that for C dependencies that weren't packaged well, and it quickly ballooned the size of our repo since git has to treat tarballs as binaries. Even if you only update a few lines of the dependency for a patch version, you re-commit the entire 43 MB tarball (obviously that depends on the size of your tarball).

~~~
curryst
You could use Git LFS to store anything ending with a tarball extension. It's
pretty well supported by most Git servers (I know GitHub and GitLab support it
off the top of my head). You do need the LFS extension for Git to use it.

------
throwanem
The problems that beset the Javascript ecosystem today are the same problems
that beset the Unix ecosystem, back in the 90s when there still was one of
those. TC39 plays the role now that OSF did then, standardizing good ideas and
seeing them rolled out. That's why Promise is core now. But that process takes
a long time and solutions from the "rough consensus and running code" period
stick around, which is why instanceof Promise isn't enough of a test for
things whose provenance you don't control.

Of course, such a situation can't last forever. If the idea is good enough,
eventually someone will come along and, as Linux did to Unix, kill the parent
and hollow out its corpse for a puppet, leaving the vestiges of the former
ecosystem to carve out whatever insignificant niche they can. Now the major
locus of incompatibility in the "Unix" world is in the differences between
various distributions, and what of that isn't solved by distro packagers will
be finally put to rest when systemd-packaged ships in 2024 amid a flurry of
hot takes about the dangers of monoculture.

Bringing it back at last to the subject at hand, Deno appears to be trying to
become the Linux of Javascript, through the innovative method of abandoning
the concept of "package" entirely and just running code straight from wherever
on the Internet it happens to live today. As a former-life devotee of Stack
Overflow, I of course applaud this plan, and wish them all the luck they're
certainly going to need.

The impetus behind "lol javascript trash amirite" channer takes today is
exactly that behind the UNIX-Haters Handbook of yore. I have a printed copy of
that, and it's still a fun occasional read. But those who enjoy "javascript
trash lol" may do well to remember the Handbook authors' stated goal of
burying worse-is-better Unix in favor of the even then senescent right-thing
also-rans they favored, and to reflect on how well that played out for them.

~~~
morelisp
This analogy doesn't hold up at all.

The UHH is a fun read, yes, but the biggest real-world problem with the Unix
Wars was cross-compatibility. Your Sun code didn't run on Irix didn't run on
BSD and god help you if a customer wanted Xenix. OK, you can draw some
parallel here between React vs. Vue vs. Zeit vs. whatever.

But there was also the possibility, for non-software businesses, to pick a
platform and stick to it. You run Sun, buy Sun machines, etc. That it was
"Unix" didn't matter except to the software business selling you stuff, or
what kind of timelines your in-house developers gave.

There is no equivalent in the JS world. If you pick React, you're not getting
hurt because Vue and React are incompatible, you're getting hurt because the
React shit breaks and churns. Every JavaScript community and subcommunity has
the same problem, they keep punching _themselves_ in the face, for reasons
entirely unrelated to what their "competitors" are doing. Part of this is
because the substrate itself is not good at all (way worse than Unix), part is
community norms, and part is the piles of VC money that caused people to hop
jobs and start greenfield projects every three months for 10 years rather than
face any consequences of technical decisions.

Whatever eventually hollows out the mess of JS tech will be whatever figures
out how to offer a stable developer experience across multiple years without
ossifying. (And it can't also happen until the free money is gone, which maybe
has finally come.)

~~~
gavinray
This is slightly off-tangent, but as someone who has written production
software on the front-end (small part of what I do/have done) in:

Vanilla -> jQuery -> Angular.js -> Angular 2+, React pre-Redux existence ->
modern React -> Vue (and hobby apps in Svelte + bunch of random stuff:
Mithril, Hyperapp, etc)

I have something to say on the topic of:

> _" If you pick React, you're not getting hurt because Vue and React are
> incompatible, you're getting hurt because the React shit breaks and
> churns."_

I find the fact that front-end has a fragmented ecosystem due to different
frameworks completely absurd. We have Webcomponents, which are framework-
agnostic and will run in vanilla JS/HTML and nobody bothers to use them.

Most frameworks support compiling components to Webcomponents out-of-the-box
(React excepted, big surprise).

[https://angular.io/guide/elements](https://angular.io/guide/elements)

[https://cli.vuejs.org/guide/build-targets.html#web-component](https://cli.vuejs.org/guide/build-targets.html#web-component)

[https://svelte.dev/docs#Custom_element_API](https://svelte.dev/docs#Custom_element_API)

If you are the author of a major UI component (or library of components), why would you purposefully choose to restrict your package to your framework's ecosystem? The amount of work it takes to publish a component that works in a static index.html page, with your UI component loaded through a <script> tag, is trivial for most frameworks.

I can't tell people how to live their lives, and not to be a choosy beggar,
but if you build great tooling, don't you want as many people to be able to
use it as possible?

Frameworks don't have to be a limiting factor, we have a spec for agnostic UI
components that are interoperable, just nobody bothers to use them and it's
infuriating.

You shouldn't have to hope that the person who built the best "Component for X" did it in your framework-of-choice (which will probably not be around in 2-3 years anyway, or will have changed so much it doesn't run anymore unless updated).

\---

Footnote: The Ionic team built a framework for the singular purpose of making
framework-agnostic UI elements that work with everything, and it's actually
pretty cool. It's primarily used for design systems in larger organizations
and cross-framework components. They list Apple, Microsoft, and Amazon as some
of the people using it in production:

[https://stenciljs.com/](https://stenciljs.com/)

~~~
throwanem
Web components aren't really there yet. They will be two or three years from
now. Some time between now and then, I expect React will gain the ability to
compile down to them, which shouldn't be too hard since web components are
pretty much what happens when the React model gets pulled into core.

~~~
gavinray
You can compile React to Webcomponents with community tooling, the core
framework just doesn't support them:

[https://github.com/adobe/react-webcomponent](https://github.com/adobe/react-webcomponent)

By "aren't really there yet", what do you mean? If you mean in the sense of public adoption and awareness, I totally agree.

If you mean that they don't work properly, heartily disagree. They function
just as well as custom components in any framework, without the problem of
being vendor-locked.

You may not be able to dig in to the internals of the component as well as you
would a custom build one in your framework-of-choice, but that's largely the
same as using any pre-built UI component. You get access to whatever API the
author decides to surface for interacting with it.

A properly built Webcomponent is generally indistinguishable from consuming any other pre-built UI component in any other framework (Ionic built a multi-million dollar business off of this alone, and a purpose-built framework for it).

------
erulabs
I'm a developer, but I'm also on-call 24/7 for a Node.js application. The
number of people here saying "this is why you don't use dependencies" or "this
is why you vendor your deps" is frustrating to see. No one _but no one_ who
has managed complex enough systems will jump on the bandwagon of enterprise-
ready, monolithic and supported over something like Node.js. I'd trade in my
JavaScript for J2EE about as fast as I'd quit tech and move up into the
mountains.

There are trade-offs, absolutely. Waiting on a vendor to fix a problem _for months_, while sending them hefty checks, is far inferior to waiting 3 hours on a Saturday for a fix, where the actual issue only affects new installations of a CLI tool used by developers and can trivially be sidestepped. If anything, it's a chance to teach my developers about dep management!

I'm positive my stack includes `is-promise` about 10 times. And I have no
problem with that. If you upgrade deps (or don't) in any language, and don't
have robust testing in place, the sysadmin in me hates you - I've seen it in
everything from Go to PHP. There is no silver bullet except pragmatism!

~~~
sosodev
There's no silver bullet, you're absolutely right, but does that mean there isn't room for improvement? Or that you shouldn't try? Dropping all dependencies is extreme for sure, but to argue against something as simple as vendoring is a bit odd.

~~~
erulabs
You're correct - there is room for improvement. The "npx" tool is an easy place to start! And absolutely agreed dropping dependencies is extreme and vendoring not so much - but in my experience vendoring often means "don't ever touch again until a bad security issue shows up". I was being a little bit too snarky in my comment tho, absolutely :)

------
chvid
And the source code of the library is:

    function isPromise(obj) {
      return !!obj && (typeof obj === 'object' || typeof obj === 'function') && typeof obj.then === 'function';
    }

~~~
staticassertion
Here's my off-the-cuff take that will not be popular.

A function like this _should be a package_. Or, really, part of standard js,
maybe.

A) The problem it solves is real. It's dumb, but JS has tons of dumb stuff, so
that changes nothing. Sometimes you want to know "is this thing a promise",
and that's not trivial (for reasons).

B) The problem it solves is not straightforward. If you Google around you'll get people saying "anything with a .then is a promise" or other different ways of testing it. The code being convoluted shows that.

Should this problem be solved elsewhere? Sure; again, JavaScript is bad and no one's on the other side of that argument, but it's what we have. Is "just copy-paste a wrong answer from SO and end up with 50 different functions in your codebase to check something", like in other languages that make package management hard, _so much better_? I don't think so.

~~~
alerighi
Like...

    x instanceof Promise

It works for standard promises; sure, there are non-standard promises, ancient stuff, that to me shouldn't be used (and a library that uses them should be avoided). So why do you need that code in the first place?

Also, that isPromise function will not work well with TypeScript. Imagine you have a function that takes something that can be a promise or not (which is also bad design in the first place), but then you want to check whether the argument is a Promise: with `instanceof` the compiler knows what you are doing, otherwise not.

Also, look at the repo: a ton of files for a one-line function? Really? It takes less time to write that function yourself than to include that library. But you shouldn't have to write that function in the first place.

~~~
exogen
Your implementation is broken even if everything uses native Promises. I don't
know how many times this exact thread needs to happen on HN (as it has many
times before) until people realize their "no duh" implementations of things
are actually worse than the thing they're criticizing.

Make an iframe.

In the iframe:

    > window.p = new Promise(() => {});

From the parent window:

    > window.frames[0].p instanceof Promise
    false

Congrats! Your isPromise function was given a Promise and returned the
incorrect result. The library returns the correct result. Try again!

~~~
stagas
No, it was not given a Promise. It was given a foreign object from another window. If you want to inspect another window, you should not be reusing code that is designed for single-threaded operation. Instead, have a layer that translates, serializes, or explicitly defines an interface acknowledging that the objects we are dealing with are foreign and need to be transformed. Then the implementation details of dealing with multiple windows become a concern of a single layer and not your entire codebase. Implicitly and magically treating a foreign window as this window will fail in many subtle and unknown ways. The "brokenness" you mention is not in that implementation; it is correctly breaking, telling you that what you are doing is wrong, and then you try to bypass the error instead of fixing your approach.

~~~
foepys
GP's comment screams XY problem, which seems to be increasingly common these days.

~~~
exogen
If you think pointing out a bug due to an edge case someone didn't think of is
the XY problem, I'm afraid you don't know what the XY problem is.

~~~
foepys
The problem was to get the promise out of the iframe when you shouldn't do
this directly in the first place.

This literally is an XY problem: "I need to do A but it's giving me bad results, what do I need to add?" - "Don't use A, it's bad practice. Use B instead, and keep using built-in tools instead of hacking something together." In this case, use instanceof instead of is-promise, because is-promise is a hack around the actual problem of getting objects out of a different context that was _explicitly designed_ to behave this way.

I'm afraid that _you_ don't know what an XY problem is.

JavaScript developers always seem to think they are the smart ones after their
6 weeks of some random bootcamp and then you end up with some crap like NPM
where a single line in a package out of hundreds maintained by amateurs can
break everybody's development environment.

------
habosa
I am one of the maintainers of a popular Node-based CLI (the firebase CLI).
This type of thing has happened to us before.

I think the real evil here is that by default npm does not encourage pinned
dependency versions.

If I `npm install is-promise` I'll get something like "^1.2.1" in my package.json, not the exact "1.2.1". This means that the next time someone installs my CLI, I don't know exactly what code they're getting (unless I shrinkwrap, which is uncommon).
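
For what it's worth, npm can be told to record exact versions by default; a minimal `.npmrc` sketch (the version shown is the comment's example):

```ini
# .npmrc - `npm install is-promise` now writes "1.2.1" rather than "^1.2.1"
save-exact=true
```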

In other stacks having your dependency versions float around is considered bad
practice. If I want to go from depending on 1.2.1 to 1.2.2 there should be a
commit in my history showing when I did it and that my CI still passed.

I think we miss the forest for the trees when we get mad about Node devs
taking small dependencies. If they had pinned their version it would have been
fine.

~~~
KMnO4
That's still the fault of the package developer. "^1.2.1" means "any version with a public API compatible with 1.2.1", or in other words "only minor and patch versions".

The whole point of semantic versioning is to guarantee breaking changes are expressed through major versions. If you break your package's compatibility and bump the version to 1.2.2 instead of 2.0.0, then people absolutely should be upset.

~~~
lukevp
Allowing any version drift of dependencies at all means that if you don't check in and restore using the package lock file, you cannot have reproducible builds. The package lock files are themselves dependent on which package restore tool you are using (yarn vs. npm vs. ...). It's also much too ambitious to believe that all packages in an ecosystem will properly implement semver. There may even be times when a change doesn't appear breaking to the maintainer but is in actuality. For example, suppose a UI library has a CSS class called card-invalid-data and wants to rename it to card-data-invalid. This is an internal change, since it is their own CSS, but it could break a library that overrode this style or depended on this class. I would consider this a minor version, but it could still cause a regression for someone.

~~~
rroblak
> Allowing any version drift of dependencies at all means that if you don’t
> check in and restore using the package lock file, you cannot have
> reproducible builds.

This is the germane point in this incident.

The parent comment mentions that SemVer "guarantee[s] breaking changes are
expressed through major versions". This is a common misperception about
SemVer. That "guarantee" is purely hypothetical and doesn't apply to the real
world where humans make mistakes.

The OP `is-promise` issue is an example of the real world intruding on this guarantee. The maintainers clearly didn't intend to break things, but they did, because everybody makes mistakes.

Which points to the actual value proposition of SemVer: by obeying these
rules, consumers of your package will know your _intention_ with a particular
changeset. If the actual behavior of that changeset deviates from the SemVer
guidelines (e.g. breaking behavior in a patch bump), then it's a bug and
should be fixed accordingly.

Back to the parent's point about locking dependency versions: I would add that you should also store a copy of your dependencies in a safe location that you control (aka vendoring) if anything serious depends upon your application being continually up and running.

------
odensc
I feel the real issue here is downstream package consumers not practicing
proper dependency pinning. You can blame the Node ecosystem, the maintainer of
the package, etc. but there are well-known solutions to prevent this kind of
situation.

~~~
thawkins
So you would exchange security for stability. If you use package pinning, then you will end up with fossilized packages in your product, which will have all manner of security issues that have already been fixed.

~~~
rhizome
I get notifications to update my Rails apps from GitHub as a matter of course
when there's a CVE in my dependencies. Does this kind of thing not exist/is
impractical for JS?

~~~
hobofan
From my experience of getting ~30 of those notifications per week for a
handful of JS repos, I can very much assure you that it does exist.

------
jackewiehose
I think these one-line packages aren't the right way to go. Either JS developers should skip the package system in that case and just copy and paste those functions into their own projects, or there should be more commonly used packages that bundle these one-liners. I mean, is_promise() and left_pad() are not worth their own packages. Dependency trees of 10,000 packages for trivial programs are just insane.

Is someone going to fix that?

~~~
krapp
>Is someone going to fix that?

Probably not. There is too much code in the wild, and NPM owns the entire JS
ecosystem, and there has been too much investment in that ecosystem and its
culture at this point for a change in course to be feasible.

The JS universe is stuck with this for the foreseeable future.

~~~
jackewiehose
Does it need much to change? I didn't mean to fix NPM. The problem is the non-existent standard library. Just create one that everybody will use, and everybody could cut their dependencies by thousands.

~~~
krapp
Not everyone would use it, that's my point. The inertia behind the existing
system is too great, especially in enterprise. All that would happen is that
library would become just another Node package, and then you've got the "n+1
standards" problem.

The "nonexistent standard library" wasn't a problem in the days when javascript development meant getting jQuery and some plugins, or some similar library. It only became a problem after the ecosystem got taken over by a set of programming paradigms that make no sense for the language.

Yes, in my mind you'd have to change everything from the ground up, starting
with no longer using javascript outside of the browser.

~~~
jackewiehose
> Not everyone would use it

If the right people would provide the library, it would be used by enough
people.

> Yes, in my mind you'd have to change everything from the ground up, starting
> with no longer using javascript outside of the browser

What's the point of inside or outside of the browser?

~~~
krapp
The point is that different languages are best suited to different tasks.
Javascript is a simple, very loosely typed scripting language with prototypal
inheritance that was developed to be run in the browser. It's a DSL, not a
general purpose programming language. Using it elsewhere for applications
where another language with stronger and more expressive types would be more
appropriate requires hacks like compiling it from another (safer, more
strongly typed) language like Typescript, which still results in code that can
be fragile because it only simulates (to the degree that a JS interpreter
allows) features that the language doesn't actually support.

See the attempt to "detect if something is a Promise" as an example - the
function definition for the package makes it appear as if you're actually
checking a type, but that's not what the package does.

Most of the unnecessary complexity in modern JS, as I see it, comes from the
desire to have it act and behave like a language that it simply isn't.

~~~
jackewiehose
> It's a DSL, not a general purpose programming language

Sorry, but I fear that ship has sailed ;-)

And I've heard JS was developed by someone who wanted to give us Scheme (you can't go more general-purpose than that) but had to resort to a friendlier Java-like syntax. IMHO javascript would be a great general-purpose language if the ecosystem weren't such a mess.

~~~
krapp
>Sorry, but I fear that ship has sailed ;-)

I know, I know. If anyone needs me I'll be in the angry dome.

------
c-smile
Weinberg's Law: if builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization.

~~~
Ididntdothis
This sounds very clever, but the nature of software development is quite different from that of constructing buildings. The rate of innovation is orders of magnitude higher. And as opposed to buildings, software can tolerate a certain amount of failure.

~~~
rhizome
"Pfft. Chickens don't even know what a road _is_!"

------
montroser
Everyone crying about this on the Internet would do better to just take it as
an easy lesson: pin your dependency versions for projects running in
production.

This was an honest oversight, and even somewhat inevitable given how many import/export formats (CJS, MJS, AMD, UMD, etc.) packages are expected to support. It will happen again.

And when it happens the next time, if it ruins your life again, take issue with yourself for not pinning your dependency versions, rather than with the package maintainers trying to make it all happen.

~~~
btilly
And everyone who depends on projects that pin their dependency versions gets
to be victims of security exploits long after they are fixed.

Dependency management is not as simple as you seem to think.

~~~
jessaustin
The "magical security updates" theory has never worked. Breakage from insufficiently pinned dependencies is vastly more common than unnoticed fixes in patch releases. On balance, semver has been good for JavaScript, but to the extent it contributed to the popularization of this dumb theory, it has been bad. Production apps (and by a transitive relation, one supposes, library modules) should be zealously pinned to the fewest possible dependencies, and those dependencies should be regularly monitored for updates. When those updates occur, tests can be run before updating the pins.

~~~
montroser
Yes -- pinning dependency versions does not have to be at odds with security.

In fact, how secure is it, really, to keep dependencies unpinned and welcome
literally /any/ random upstream code into your project, unchecked? This is yet
more irresponsible than letting dependencies age.

But even then, it's not as if you have to choose -- you can pin, then vet
upstream updates when they come, and pin again.

------
choward
Can someone help me understand why a library like this is even necessary? Can't you just wrap everything and treat it like a promise?

    const aPromise = Promise.resolve(1);
    const notAPromise = 2;

    Promise.resolve(aPromise).then((x) => console.log(x));
    Promise.resolve(notAPromise).then((y) => console.log(y));

    // Logs:
    // 1
    // 2

~~~
turnipla
This is not a library. Stop thinking of it as a library. It's a building
block, a module.

> why a library like this is even necessary?

Do you know how to determine whether something is a Promise?

Wrong. Also the first few StackOverflow answers are wrong or incomplete.

You know what's better? Using the same library 3.4 million repos depend on,
that is tested and won't break if you use a package-lock.

> Can't you just wrap everything and treat it like a promise?

Maybe. Maybe not. Treating everything as a Promise means you have to make your
function asynchronous even if not necessary.
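
A sketch (not from the comment) of why that matters: `Promise.resolve` defers even an already-available value to the microtask queue, so a caller that needed a synchronous answer can no longer get one:

```javascript
const order = [];

// Even though 2 is already available synchronously, .then defers it.
Promise.resolve(2).then(() => order.push('then'));
order.push('sync');

// Once the microtask queue drains, the synchronous code has run first:
setTimeout(() => console.log(order), 0); // [ 'sync', 'then' ]
```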

------
gitgud
Dependencies in almost any software system are fundamentally built on trust.

You trust that minor version upgrades won't break the system, or that
malicious code won't be introduced. But we're human... things break.

This can happen in any ecosystem, but npm is particularly vulnerable because of its huge dependency trees, which are only possible due to the low overhead of creating, including, and resolving packages.

That's why npm has the "package-lock" file, which takes a snapshot of the
entire dependency tree, allowing a truly reproducible build. Not using this is
a risk.

------
rubyn00bie
Call me crazy, but... I don't add things to my projects without looking at the
source. Mostly because it saves me from shit like this. If I see something is
small enough, and easy enough to reason about, I'll just copy-pasta that
motherfucker with a comment citing the source and date it was pasta'd (license
permitting).

Things like this are so not worth a package, ever. It's something where, when you see it, you go "oh yeah, that's the obvious, easy way of doing this." It's not a package, it's a pattern. I can _promise_ you, this was only ever added to packages because people wrongly assumed that since it's about "promises" (spooooky) it must be complex and worthy of packaging.

As someone who doesn't do front-end work regularly, but also sank about 3
consecutive weeks (~6-8 hours/day) in the last year into understanding
generators, yielding, and promises... I can tell you, the actually scary part
about all of this, is pretty much no one just reads the fucking docs or the
code they're adding.

Moral of the story, especially in the browser: the reward of reading the code before adding it is enormous; you'd be surprised how often the thing you want is just a simple pattern. Taking that pattern and adapting it to your specific use case, instead of imposing the pattern on your use case, will give you giant wins... Learn the patterns and you're set for life.

~~~
bdcravens
create-react-app contains over 1000 packages. How long would it take to review
all of those?

~~~
metroholografix
This sort of cavalier attitude where 1000 dependencies are yawn-inducing in
their commonality is why I feel vindicated in never having wasted my time with
this kind of ecosystem. Eventually the house of cards will come down. Let's
all pray that it happens sooner rather than later.

~~~
karatestomp
When WebAssembly gets direct DOM access we'll finally have options and we'll
no longer _have to_ tolerate JavaScript. I expect the JS community will settle
down and get somewhat saner after that, too.

------
brown9-2
There is something aggravating about the first comment in an issue like this
posted minutes after the issue was created to say “is this fixed yet?”

~~~
ehsankia
The worst part is that it was posted 10 minutes later... I understand bumping a year-old issue, but c'mon, give it a few minutes at least.

~~~
luckylion
It was posted after _chore: fix is-promise_; they may not have realized that it's just some random pingback that GitHub shows there, and not a message that somebody has committed a fix.

------
andrew_
It's downloaded 11 million times a week. This touches a good majority of the
Node ecosystem, so there's going to be quite a lot that doesn't work until
this is remedied. And I'm not sure that package-lock.json is going to save
folks here because it was a minor version update.

[https://www.npmjs.com/package/is-promise](https://www.npmjs.com/package/is-promise)

~~~
roonyh
package-locks lock down exact versions. Otherwise there's no point to them.

------
russtrotter
Is it idiomatic in the JS world to always express dependencies as "version X.Y or higher", vs. "version X.Y"? Most of my experience is from the java/maven world, where you're playing with fire if you don't just make it "X.Y".

~~~
throwanem
There are a lot of idioms. A very common one, I think the current default, is
to pin only the major version in the dependency list, and also to lock exact
versions in an installer-generated lockfile following a successful install. If
you find a locked version breaks your code, you adjust your dependency list,
nuke the lockfile, and let a reinstall build it again.

The idea is that pinning major versions lets you get non-breaking improvements
from package authors who use semver properly, and pinning exact known-good
versions lets you avoid surprises in your CI builds.

It works pretty well when you start from a known good state and vet your
dependencies reasonably well. The trouble here seems to be largely that CRA is
designed, among other purposes, to serve people just getting into the
ecosystem of which it's a part, and those people are unlikely to be familiar
enough with the details I've described to be able to effectively respond.

The comparison with left-pad is easy, but this isn't at all on the same scale.
It's a bad day for newbies and a minor annoyance for experienced hands. And,
of course, cause for endless spicy takes about how Javascript is awful, but
such things are as inevitable as the sunrise and merit about the same level of
interest.

------
cryptoquick
Does anyone know if there's a way to upgrade a dependency of a dependency of a
dependency of a dependency of a dependency in my yarn.lock without actually
editing the yarn.lock, and also, waiting for five packages to update their
dependencies, especially if they're locked or specified by even one of the
five in semver rules?

For example:

Running `yarn why is-promise` in a CRA app:

`Hoisted from "react-scripts#react-dev-utils#inquirer#run-async#is-promise"`

Currently, running a `yarn upgrade-interactive --latest` doesn't indicate
there are any updates, so presumably, this is still a problem upstream.

Also, if anyone's in a pinch right now: luckily enough, I made this yesterday
for an interview I had only a couple hours ago. I lucked out! Maybe it'll help
someone else:

[https://github.com/cryptoquick/demo-cra-
ts](https://github.com/cryptoquick/demo-cra-ts)

Oh, and, uh, pardon the pun... :/

~~~
acemarke
Per other comments in the thread, this is the primary use case for Yarn's
"resolutions" feature:

[https://classic.yarnpkg.com/en/docs/selective-version-
resolu...](https://classic.yarnpkg.com/en/docs/selective-version-resolutions/)
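
For instance, a hypothetical entry forcing every transitive copy of `is-promise` onto the fixed release (syntax per the Yarn docs linked above; the glob prefix is optional) could look like:

```json
{
  "resolutions": {
    "**/is-promise": "2.2.1"
  }
}
```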

~~~
cryptoquick
Thanks! Tried, it seems to work! Repo updated, too. This seems to be what it
does, for those curious: [https://github.com/cryptoquick/demo-cra-
ts/commit/5c84aa48e9...](https://github.com/cryptoquick/demo-cra-
ts/commit/5c84aa48e9409478a6b19e2f065fa99313d6b10f)

------
antibland
One reason we may have one-line packages is the demand FAANG places on
applicants. On three job screens from three different companies, I was asked,
"Which npm packages have you created that we'd know about?"

Many who are hell-bent on entering these companies, yet have no known
packages under their belt, might very well fire off a one-line package
that actually gets some downloads, to be better "prepared" when screened.

------
biglost
A package just for:

    
    
        return !!obj && (typeof obj === 'object' || typeof obj === 'function') &&
          typeof obj.then === 'function';
    

[https://github.com/then/is-
promise/blob/master/index.js](https://github.com/then/is-
promise/blob/master/index.js)

This is insane
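
For context, here is a runnable sketch of that check (reproduced from the snippet above, not taken from the package's actual file), showing how loose it is: any object with a callable `then` passes, not just real Promises.

```javascript
// The entire check: truthy, object-or-function, and has a callable `then`.
function isPromise(obj) {
  return !!obj && (typeof obj === 'object' || typeof obj === 'function') &&
    typeof obj.then === 'function';
}

console.log(isPromise(Promise.resolve())); // true
console.log(isPromise({ then() {} }));     // true: any thenable passes
console.log(isPromise(null));              // false
console.log(isPromise(42));                // false
```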

------
tholman
Hmm, I had some problems with a nested `is-object` dependency when updating
some deps last week, I wonder if it’s related.

------
bambataa
create-react-app was broken a couple of weeks ago when I tried to use the
Typescript template. Some dependency in Jest had been changed to require a
version of Typescript that had only been out for a few weeks, breaking
everything (including create-react-app) that hadn't updated to the latest tsc.
What an ecosystem.

------
CapriciousCptl
Someone somewhere is going to organize a series of JS packages with the
continuity of a standard library, full of verification and tests. Sane
versioning, consistent interfaces, and so on. The npm ecosystem isn't bad;
it's just unwieldily successful.

------
adrianhel
There should be a Promise.isThenable or Promise.is. Strictly speaking, this
library checks whether a value is a thenable.

With optional chaining I would however use this check:

    
    
        typeof x?.then === 'function'
    

Or if I was code golfin:

    
    
        x?.then?.call
    

The first case does not account for built-in prototype extensions, and the
second has false positives with certain data structures.

So the function in is-promise should be available as Promise.isThenable or
Promise.is.
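
A small sketch of the trade-off described above (the `oddball` value is a contrived example) shows the golfed version's false positive:

```javascript
// Straightforward thenable check via optional chaining.
const isThenable = (x) => typeof x?.then === 'function';

// Golfed version: truthy whenever `x.then` has any `call` property at all.
const golfed = (x) => Boolean(x?.then?.call);

const oddball = { then: { call: 'not a function' } }; // `then` is not callable

console.log(isThenable(Promise.resolve())); // true
console.log(isThenable(oddball));           // false
console.log(golfed(oddball));               // true: false positive
```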

------
SSchick
Probably broke eslint too, since `is-promise` is a sub-dependency of that as
well.

This is much less of an issue when using a lockfile, at least for existing
packages/projects.

------
ahupp
There's a neat tool called crater ([https://github.com/rust-
lang/crater](https://github.com/rust-lang/crater)) for Rust. It can run an
experiment across every Rust package (or every popular one), so you can e.g.
see if some theoretically breaking compiler change actually hits anyone.
Something like that could be interesting for node packages as well.

------
dstaley
This is a prime example of where I think GitHub's acquisition of Dependabot
and npm could really pay off. Imagine being able to publish a prerelease
version of your library and run the CI tests of your consumers, all from
within the GitHub interface. Dependabot already tracks compatibility between
versions, so this would be a natural extension of that.

------
kingbirdy
The broken package just released a fix: [https://github.com/then/is-
promise/releases/tag/2.2.1](https://github.com/then/is-
promise/releases/tag/2.2.1)

------
nerdycap007
I had an issue while running any command of vue-cli, and I even created an
issue for it, thinking that it might be a bug in Vue CLI v4.3.1. But I think
the truth has shown itself!

------
tablethnuser
This seems like an issue with semver. Its idealism is not compatible with
actual human behavior.

The package devs clearly violated semver guidelines and npm puts a lot of
faith in individual packages to take semver seriously. By default it opts
every user into semver.

If you need semver to be explained to you bottom up (lists of 42 things that
require a major bump) then you don't get semver. All you have to do is think:
will releasing this into a world full of "^1.0.0" break everyone's shit?

This and left-pad are extreme examples. But any maintainer with a package.json
who tries to do right by `npm audit` knows that there is an endless parade of
suffering at the hands of semver misuse. Most of it doesn't make the news.

------
hota_mazi
Everywhere around the world, users of Maven Central facepalm.

------
escot
Anyone know of a way that library maintainers can automatically test if
changes like this will break consumers?

------
ng12
Looks like it broke @angular/cli too.

~~~
Noumenon72
Who does this affect? I just did

    
    
        npx @angular/cli new hello-world-project

and that worked. I have remote Angular training on Monday and didn't want to
do a global install.

~~~
ng12
It's already been patched with version 2.2.1.

------
xyst
Now I understand the reasoning behind the Golang suggestion to commit the
/vendor directory.

------
ashtonkem
The fact that this has broken serverless for people is reinforcing my priors
in a big way.

------
keithnoizu
Meanwhile, I can regularly override even major release versions of dependencies
in Elixir without breaking changes. Dependency fickleness has always been a
huge issue for me when working with Node.

------
sneak
I hope that more packaging systems take the go modules approach and
cryptographically and immutably identify their dependencies at time of
addition to the project. This sort of breakage shouldn’t be possible.

~~~
dnautics
I'm sorry if you were unaware, but they absolutely do that, and were doing
it long before Go did.

~~~
sneak
I am aware of package lockfiles.

If deps are immutable, then nothing anyone does in any other package (short of
having the package repository take the code down) should be able to break your
future builds.

If that were true, TFA would not be news.

~~~
n_e
> If deps are immutable, then nothing anyone does in any other package (short
> of having the package repository take the code down) should be able to break
> your future builds.

They are. You're only affected if you don't use a package-lock.json or start a
new project (which will pull the latest versions of the dependencies).
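
This is what npm's lockfile records per dependency. The entry below is illustrative (version and hash are placeholders), but the `integrity` field is the point: a cryptographic hash pinning the exact tarball contents, so a re-published or tampered package fails the install.

```json
"is-promise": {
  "version": "2.1.0",
  "resolved": "https://registry.npmjs.org/is-promise/-/is-promise-2.1.0.tgz",
  "integrity": "sha512-..."
}
```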

------
arrty88
Maybe it should be easier to do npm install latest-1

------
dgellow
As always: vendor your dependencies.

~~~
turnipla
Good luck vendoring node_modules.

Why are these threads filled with people who know nothing about node?

npm and yarn both have lockfiles for this purpose. Vendoring only bloats your
repos.

~~~
dgellow
> Why are these threads filled with people who know nothing about node?

That’s quite a bad assumption on your part, based on almost no information.

I don’t know about the rest of the thread, but I’m personally quite familiar
with Node. A lock file doesn’t fix the same issues vendoring does. The lock
file gives you an explicit list of the versions used; vendoring saves exact
copies of the dependencies with the rest of your code.

With vendoring, anyone working on the project is using the exact same version
of a dependency, AND you don’t have to care about an external provider (the
registry being up, etc.; that’s way easier for your CI too), AND you can
review dependency upgrades via git as if they were your code.

Of course, that’s a mess when the JavaScript ecosystem has an infinite amount
of dependencies for a hello world.

------
janpot
Every package can be a one-line package if you minify it. Lines of code as a
metric for code quality is always relative. The fact that this is a one-line
package has nothing to do with the outcome; a one-line change in a 5000-line
dependency could just as easily have messed up create-react-app. The size
is irrelevant.

~~~
turnipla
This is correct. Many tools are split into multiple packages for — hear me out
— convenience.

I regularly extract features from my apps into new npm packages. This way they
can be reused by other apps.

Troglodytes can keep copy-pasting code between apps while npm users publish
once and update everywhere.

------
worik
Why NPM? I think that is the real question I do not see anybody asking. So I
will...

Why NPM?? What is the point?

~~~
worik
Right.

No point at all.

A waste of time and a yawning security hole

------
jrockway
A little copying is better than a little dependency.

------
unwoundmouse
nvm

~~~
searchableguy
Read the linked post.

~~~
unwoundmouse
FUCK I thought the linked article described the situation

~~~
tomphoolery
way to read the article before commenting

------
zulgan
Chill with the js hate, this happens everywhere.

Maybe not to this extent, but if X (where X is whatever you are thinking
about) had a similar number of people using it (especially junior people),
this would happen there as well.

~~~
andrewzah
No.

Other languages don't publish/import packages that are _one line of code_. I
have never seen an issue like this with any other language that I've worked
with.

Any sane developer that needed a one-liner like this would just manually
implement it.

Not to mention that these sorts of functions are unnecessary in languages with
a good stdlib or statically typed languages like rust, etc.

~~~
exogen
Know what happens every time people like you say this here on HN? They post
the one-liner they would have manually implemented in their code base _and
it's wrong_. The one that comes to mind is the "is-negative-number" package.
Yes, the geniuses of Hacker News, after finding out there was an npm package
for determining whether something was a negative number, _could not correctly
implement that function._

You and everyone here are not as clever as you think you are. This is why
people prefer known-good implementations. The maintainer here did a bad
release, big fucking deal.
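
For what it's worth, here is a naive sketch (not the actual package code) of the kind of edge cases such a one-liner runs into:

```javascript
// A naive negative-number check, plus the edge cases that trip people up.
const isNegative = (n) => typeof n === 'number' && n < 0;

console.log(isNegative(-1));   // true
console.log(isNegative(-0));   // false: -0 < 0 is false, though -0 is arguably negative
console.log(isNegative(NaN));  // false
console.log(isNegative('-5')); // false: rejected as a string, even though '-5' < 0 coerces to true
```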

~~~
vel0city
Maybe it's a failure of the language when it takes a third-party package to
determine if a number is greater than or less than zero?

~~~
krapp
> Maybe it's a failure of the language when it takes a third-party package to
> determine if a number is greater than or less than zero?

It's not a failure of the language. Javascript has comparison operators like
every other language, it's entirely possible to determine if a number is
greater than or less than zero without importing a third-party package.

What it is is a failure of modern JS development culture, because apparently
it's anathema to even write a simple expression on your own rather than import
a dependency tree of arbitrary depth and complexity and call a function that
_does the same thing._

------
TechBro8615
The package referred to in the clickbait title is `is-promise`

~~~
ajross
I don't know how "clickbait" that title can be when it is, in fact, _longer_
than the line of code in question:

    
    
        declare function isPromise<T, S>(obj: Promise<T> | S): obj is Promise<T>;
    

This is, indeed, the only line of exported code in the entire package.

I genuinely don't understand the NPM world.

~~~
redmorphium
Me neither. I can't wait for Deno 1.0 next month.

[https://deno.land/](https://deno.land/)

~~~
freeqaz
What is this exactly? The website is a bit unclear.

~~~
jpangs88
I was wrong about what this was; I have edited this comment.

~~~
eropple
Why would a Rust wrapper around the C++ project that is V8, which implements a
garbage-collected programming language and environment, "use less ram" just by
virtue of some parts of it being written in Rust?

------
0xff00ffee
This is why regression suites are important.

EDIT: I wasn't dissing the developers. They have regression tests; this was
just an accident. I was stating that it's important. My bad (too late to delete).

~~~
nemetroid
Could create-react-app have avoided this through regression suites?

~~~
0xff00ffee
Bumping your comment because I would like to know. I'm following the github
thread.

~~~
jakear
Potentially. If CRA had pinned all their deps, and used a bot to automatically
bump deps contingent on passing a comprehensive regression matrix, this would
have been avoided. GitHub's Dependabot is good for this. In my opinion,
everybody besides libraries should pin deps and use Dependabot.

~~~
gombosg
Exactly. We use Renovatebot for the same purpose. It pins dependencies and
creates PRs for updates. Amazing to see how often the builds break, even
sometimes after minor updates. But at least we fix them before release, and
not after... :)

~~~
jakear
Yep. One of the very nice things about npm/node versus Python or Go or some
others is that package locks and dependency pinning are possible. But few
people seem to use them.

I’ve seen reports of people using a Go library that gets a minor update and
breaks their app, at which point they become SOL as Go always installs the
latest version. I myself have been working in Python projects where the
Dockerfile simply says “pip install blah” and I get different deps than the
working version. No clue why anyone would be okay with working like that.

~~~
jen20
It's not true that Go always installs the latest version of a dependency. `go
get github.com/x/y@v1.3.4` installs v1.3.4 of x/y, assuming there is a tag
matching that.

~~~
jakear
I’m not familiar with go, would this persist to other people attempting to
install the package?

The issue I’ve seen is:

[https://github.com/go-yaml/yaml/issues/558](https://github.com/go-
yaml/yaml/issues/558)

> Please do follow semver as it's a nightmare for us to manage particularly
> using go module (you can't stick to a particular version).

And of course everybody’s idea of a breaking change is different, so this idea
that you can’t install a particular version seems unworkable.

------
esaym
You had one job...

------
ConcernedCoder
mrw you're using a package to do a type-check:

    
    
        return !!obj && (typeof obj === 'object' || typeof obj === 'function') &&
          typeof obj.then === 'function'
    

------
jbverschoor
Is this news? It's happened so many times before. NPM is broken, Yarn 2.0 is
never gonna take off, and these problems were solved elsewhere long ago. Waste
of life.

