Pika: Making it easier to find, publish, install, and use modern packages on npm (pikapkg.com)
123 points by robin_reala 20 days ago | 59 comments

This is really cool - I am a big fan of anything that lets me use less build tools, dot files, and transpilation steps in my side projects. Thanks to native modules I have that mostly down to 0: back to just a bunch of html, css, and javascript files again!

But I usually have to do some re-exporting shenanigans for the one or two third-party libraries I really need. This is a great step away from that - thank you!

While I'm a big fan of native modules, I find the constant use of the adjective "modern" in new JS project descriptions somewhat disturbing. Being modern is not of any value in itself. If it loads faster, or if it obviates the need for a bundler, or if it makes granular upgrades easier, or if it reduces the amount of configuration or tooling, we should put those benefits first.

Ironically I feel like the entire process node/npm is labouring through is anything but modern. None of these problems are new or unique, unless you ignore the last few decades of lessons from engineering concerns in other ecosystems, which unfortunately people seem determined to do.

Problems that Node/npm deals with that are relatively new/unique:

- Distributing modules that work in both the browser as well as on the back-end

- Working with and developing an ecosystem for a language that wasn't really developed for years and years, was still missing quite a bit of functionality, and then suddenly gained a lot of traction

- That language having to catch up on years and years of developments in computer science, and having to do so in a backwards compatible way

These are real problems that have influenced and caused a lot of the perceived idiosyncrasies in the Node/npm module ecosystem, and that do not have solutions that can simply be copy-pasted from other languages or ecosystems.

> Distributing modules that work in both the browser as well as on the back-end

You could say that they are two different runtimes, think CPython / pypy

Sure, but though I'm not that familiar with CPython and pypy, I don't think there are many situations where those runtimes have as varying and sometimes conflicting requirements as Node and the browser. I'd guess the differences are more similar to the differences among the different browsers alone, which already are different runtimes by themselves.

That said, even if those runtimes are as different as Node and the browser are, it's the interplay of that restriction in combination with the other characteristics I mentioned that make this a new challenge in its own right.

The python distribution infrastructure lags far behind that of javascript.

> None of these problems are new or unique, unless you ignore the last few decades of lessons from engineering concerns in other ecosystems, which unfortunately people seem determined to do.

Not intentionally, mind you. This is completely anecdotal, so take it with appropriate grains of salt, but I have observed that the majority of people in the JS ecosystem are not formally trained in computer science/software development. Most are either self-taught or come from fast-paced bootcamps. So they lack the historical knowledge of software engineering and even a passing knowledge of other ecosystems.

Now for most of us who have spent time studying the field, it is easy to identify the problems and at least remember that some solutions exist. Most of the people in the JS community cannot. So they go through the same process others have gone through before and land in the same mess.

There is nothing anyone can do about it.

This is a great insight. This kind of explains my constant surprise when I hear concepts I was taught in university reappear in the JS ecosystem with different names, and with great fanfare. In a way it's great that these developers have the ingenuity to rediscover these things on their own, but I can't help but wonder whether this is a case of those not knowing history being condemned to repeat it.

JS developers sometimes have the opposite impression, that computer scientists have conspired to repeat history in order to make the ecosystem look like something they know.

CS moves forward one generation at a time. The less CS you know, the more likely you are to spot bad practices. And the more naive you are, the more likely you are to attempt a solution. Sure, it might already have been solved 30 years ago, but maybe that wasn't the right time.

Oh please... this is not nearly the case anytime recent. If anything, new CS grads are more prone to reinvent some solution from scratch than to reuse existing solutions.

The main issues are that browser tech has advanced rapidly and is/was a non-homogeneous runtime environment. Also JS eco is built from OSS by an enormous pool of developers.

There is a difference between reinventing a solution with knowledge of past mistakes, which prevents those mistakes from occurring again, and reinventing one without knowledge of past mistakes, which leads to those mistakes being repeated. In my observation, the JS community tends toward the latter.

Could you share some examples of these failures please?

The community as a whole has the kind of immaturity that leads to repeated and terrifying security incidents, for one.

"Modern" is another meaningless software-project marketing adjective like "blazingly fast". I'm pretty sure we'll eventually be able to use these buzzwords to perform carbon-dating of old projects.

"Hmm, uses the term 'modern' but the README has limited image macro memes... seems like an early 2019 release."

Jest, FB's JS testing framework, is described as blazing fast but is actually twice as slow as Jasmine (or was at some point). If twice as slow is blazing fast, I don't know what twice as fast is. I do know 'blazing fast' is meaningless. It's just deceptive.

For reference: https://github.com/facebook/jest/issues/6694

OT, but I was using Jest and switched back to Mocha. Since my tests are nothing special and 100% browser env, they load instantly. Should have done it sooner.

I made liberal use of jest's expectation framework and module mocking. Migrating is 'doable' but at this stage in the project I can't justify the delay on the deadline.

I've got about 200 integration tests too with a package that builds itself into an OS dependent state so I'm stuck running things in a VM. Probably easier to mark up the tests so I can run unit tests most of the time. That should make things bearable.

In this case, I think "modern" means "uses ES modules", which have somewhat recently become standardised, but have not (yet) been adopted by all npm packages for various reasons: https://news.ycombinator.com/item?id=19294189

I think it's simply a way to distinguish between old-style JS ecosystems and new. It encompasses a range of benefits and lessons learned in JS. Of course not everyone is going to agree that the "modern" way is the right reaction to the problems of the past, but most active JS programmers will have a good idea what it means.

C++ has a similar issue. There's a very distinct difference between C++11 and what came before. And C++20 could be another big shift (especially with modules). Modern has a useful meaning, even if it's fuzzy and temporal.

Whenever I see that word, I am reminded of a book I have titled Modern Welding --- published in the late 60s, so I guess it has always been a pretty meaningless adjective; but it does seem the JS (and web development) community in general likes to use that word a lot.

In the sense they're using it here it's perfectly valid. ES Modules are the modern replacement for the old legacy JS module systems.

It's valid, but not interesting. The "old legacy" system was once modern, too. And before that it was <script>. Better to talk about the technical benefits of the solution instead of how new (i.e. modern) it is.

ES6 modules are neat, but they are more verbose, not more compact. And they kind of cripple some of the nifty metaprogramming capabilities that actually made Node.js cool from a Unix perspective.

Excuse my ignorance, but wasn't it already easy to do these operations? Seems like yet another centralized package repository with a supposedly visually appealing interface, where this interface is done just a little bit differently.

Edit: apparently it is not a package repository, my bad.

Could someone please explain to me what exactly it is and its use cases? Is this supposed to replace some of the features of npm?

It doesn't look like a package repository. It's just a UI for the npm repository and some type of build tool.

At least I got the second part right! :D But yeah, I guess the author was not content with https://www.npmjs.com? Is it even supposed to replace it? I have no idea.

Seems pretty clear to me: it's a way to search for npm packages that are published as ES modules. You can't filter by that on npmjs.com, so this does fill a need. The other parts of the project help you publish packages in ESM format, which can be a pain to do manually.

Why not fix npmjs.com instead of scattering things around ad infinitum?

Only npm, Inc. can fix npm. Anyone can create a different website to aggregate its results.

Do they allow people to send pull requests though? I know that doesn't equal accepting them, I just wonder if it has been attempted.

> Anyone can create a different website to aggregate its results.

Yes, I know, and it is fine. Just wondering.

You can't send pull requests if it's not open source.

Oh, for some reason I was under the impression that it is. Sorry!

There is an open source implementation of the repository, but not of the website, as far as I can tell.

These packages are still on npmjs.com. This is just another way to search the repository. The author can't fix npmjs.com because I'm assuming they don't work for npm Inc.

Node.js modules are what make Node.js great. They encourage code sharing and modularization without complexity.

What do you dislike about ESM in comparison?

ESM had no implementation, and there was already an unofficial standard. Why did ESM land in ES2015? Three years later it still has issues and implementations are experimental at best.

They are supported by browsers for about two years now: https://jakearchibald.com/2017/es-modules-in-browsers/

But why is nobody using that?

Specifically, I would think 99% of stuff being shipped today is bundled with tools like webpack.

Because it lacks support for bare specifiers and package lookup. There’s also no dead code elimination or code splitting.

These are still node modules, and they're still on npm. They just use ESM syntax as well as CommonJS.
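For context, a package that supports both module systems typically advertises two entry points in its package.json. A minimal sketch of the de facto bundler convention of that era (the package name and file paths here are hypothetical):

```json
{
  "name": "some-package",
  "main": "dist/index.js",
  "module": "dist/index.esm.js"
}
```

"main" points at the CommonJS build that Node consumes, while bundlers such as webpack and Rollup prefer the ESM build under the non-standard "module" field so they can analyze and tree-shake it.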

Top notch name.

Terrible for Portuguese speakers, as it sounds like a popular alias for penis

So does ‘cock’ in English, yet it also describes the bird. Seems like pikas are still pikas in Portuguese anyway: https://pt.wikipedia.org/wiki/Ochotona

Why call this the same as a very popular AMQP library?


As a user of pika it confused me. I wish the author had named it something other than "pika".

From the news I hear (which is quite slanted, I will admit, since I write very little JavaScript) the issue with npm is that it's too easy to publish things on npm and use them, which leads to a dependency mess and breakage when things are removed or get hacked. Is this something that the JavaScript community needs?

I don't think that npm itself should be blamed for being too easy to use - That's a good thing in most cases. I think that the main problem is that a couple of years ago some very vocal members of the Node.js community had been promoting a hard-line philosophy around publishing and using tiny modules.

The consequence of that is that projects ended up with hundreds of tiny dependencies (and sub-dependencies) which increased the attack surface and introduced their own bugs and/or vulnerabilities.

I think that the Node.js community is wiser now. Vulnerability detection tools like Snyk.io have been useful in encouraging module authors to remove unnecessary dependencies from their modules.

Now the trend seems to be to use fewer modules which offer more functionality that is more closely matched to the use case.

OK, but this behavior can be observed in both the Python and Rust communities, too (maybe other communities as well, but I am not in touch with them). Do they promote "a hard-line philosophy around publishing and using tiny modules", too? I had to cargo build a few projects independently (e.g. parity-ethereum, c2rust), and it took a while because they each had over 300 dependencies. That is a lot. What is the reason for this phenomenon?

On the spectrum, Rust is not as extreme as npm but is closer to it than not. It just really depends.

Smaller dependencies are easier to maintain, test, and understand. Rust also has a relatively small standard library and so you tend to rely on packages (some produced by the rust project itself) for some things you might use the stdlib for in other languages.

Compared to PyPI, it looks like npm packages are much more granular. You'd see the functionality of a popular Python package spread out into multiple npm packages.

Both approaches have their advantages. I'd say that for security and reliability, you really need to know what packages you are running. Often you can delegate the responsibility to bigger upstream projects/groups.

For example if Facebook works with and on React, you can put a good lower bound on the reliability/security of React and the packages it pulls in. I'd be a lot more suspicious of packages which are rarely used by significant other projects.

There is a cost to relying on someone else's project. If instead of relying on 1 library you rely on 10 this is a pure negative in terms of complexity, risk, communication and potential breakage.

Contrary to your statement, this is a pure disadvantage.

"For example if Facebook works with and on React, you can put a good lower bound on the reliability/security of React and the packages it pulls in."

I don't think this is true. You could easily depend on something that react pulls in which they later drop months before it turns into a vector for malware.

I don't see how trust translates down the dependency graph AT ALL.

Bugs and security risk seem to be mostly correlated to the number of lines of code. By splitting a package, keeping the volume constant, the risk shouldn't increase that much. Small packages have less of a chance to cause problems.

Nothing is perfect. npm and PyPI try to mitigate this problem with security audits and notifications. npm checks your project for known vulnerabilities at every install.

If you're paranoid, you just don't upgrade packages unless you really need to and audit stuff yourself. That comes with its own costs. As does writing the software all by yourself. Or buying it from commercial vendors with similar tradeoffs applying.

I'm not sure why you're being downvoted. I've lost days to debugging issues introduced to modules in deep transitive dependency chains. It wouldn't be so bad if package maintainers respected semver or those upstream took care to lock their dependencies to a specific version. In practice neither happens.

I don't know, did you try clicking the link after replying to the title? Or is it time for another generic top-comment thread about npm?

I did, but I'm not really sure I know what this is. It looks like it's performing filtering for "ES modules" on npm?

That's what it does. The reason it does this is because npm modules distributed using the "modern" (as in recently added) module system can be more efficiently distributed to users by allowing you to only import parts of a package that you actually use.

The reason not all packages support this, besides legacy, is that this also requires your runtime environment to support and benefit from this. In other words, this is useful when you're targeting modern browsers. When a package can potentially also be used in Node projects, or projects that require support for relatively widely used browsers such as Internet Explorer, however, supporting this module system might not be possible or worth the effort.

In other words, it has absolutely nothing to do with it being too easy to publish to npm.
