Anecdotally, it seems like a lot of Yarn adoption is still stuck on 1.22.x. Migrating to 2+ appears to be untenable for many teams.
I understand that Yarn 2 is essentially a new tool, as there are certain features you simply can't ship while still being compatible with npm, but this hasn't resonated with developers. The initial appeal of Yarn 1 was that you could pretty much drop it into your node project and get much faster install times. `pnpm`, from what I understand, is faster than yarn, while still having that seamless interop w/ npm.
While I personally enjoy Yarn 2+ a lot and prefer it for private projects (I'll explain a bit about my setup at the end), for a long time both the docs and the implementation pretty much guaranteed that it would fail:
- many basic things were missing from the docs for a long time, or at least were easier to find in GitHub issues (e.g. "what should I gitignore")
- defaulting to PnP meant that, without configuration, things wouldn't properly work on many projects based on frameworks/toolkits/editors
- opinionated defaults made things harder than necessary (e.g. why does `yarn init` create a Git repository?)
You either had to accept/work around these issues, or configure everything to work as e.g. npm or yarn 1 did. I did the latter, and am still happy that I bought in - but I can very much understand anyone who didn't.
My preferred setup with Yarn 2+ projects is:
- use the node-modules linker (I tried pnp every couple of months and always ran into some issue, which I sometimes sent PRs for, but there was always something not working)
- gitignore as described for zero-installs[0] (even though I don't use pnp; this has the advantage that you'll have a ZIP copy of every module cached, leading to very fast local/CI installs and resistance against registry issues)
- create .gitattributes file as described at the end of [0] to hide the binary files from diffs
- add the base SDK[1] for WebStorm support (this rarely worked well)
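For anyone replicating this setup, the gitignore/gitattributes parts look roughly like this (a sketch from memory; the canonical lists are in the docs at [0], so double-check there):

```gitignore
# .gitignore — keep the offline cache and releases, ignore the rest of .yarn
.yarn/*
!.yarn/cache
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/sdks
!.yarn/versions

# .gitattributes — mark the committed archives as binary so they stay out of diffs
/.yarn/cache/** binary
/.yarn/releases/** binary
/.yarn/plugins/** binary
```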
I cannot express how painful yarn pnp has been for us. We've been stuck on a relatively old version of TypeScript (4.8) because upgrading it with yarn is so difficult. You have to contend with Yarn patching TypeScript, and with the obscure IDE integrations required so it all just works. We still can't update, because we can't get TypeScript + Yarn PnP + VS Code on macOS to work together. It has been such a massive time sink for us.
I see no reason for anyone to use Yarn. Either use vanilla npm, or use pnpm, which doesn't require a mountain of hacks to work.
We've been running TS + PnP + VSCode on macOS throughout the entire lifetime of Yarn, through versions 1 through 4.
The plugins system has been extremely valuable to us, and the hoisting / peer dependency behavior has been consistently correct, where other package managers have caused bugs.
On top of the really different API, you always get v1 when you install (which I never understood).
You need to upgrade manually everywhere with a command line.
If I had to bet: most users didn't even bother upgrading because they aren't aware of this behavior.
> you always get v1 when you install (which I never understood)
Yeah, they really wanted you to install yarn's source to your repo, which is why they don't let you `npm install` yarn v2 (or higher). That's maybe great for monorepos, but personally I find it ugly.
Thankfully now you can use `corepack`, which ships with Node. This way you can specify a version of yarn in package.json, in the `packageManager` field; that is, without committing yarn's source to your repo.
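For illustration, the pin is a single field in package.json (the version below is a placeholder; Corepack also accepts an optional hash suffix on the value for integrity checking):

```json
{
  "name": "my-app",
  "packageManager": "yarn@4.0.0"
}
```

With Corepack enabled (`corepack enable`), running `yarn` inside this project transparently fetches and uses that exact version, regardless of what's installed globally.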
Same here. I haven't done frontend in a few years now but I was shocked when the top comment was saying most people don't even use Yarn 2. I had no idea Yarn was past 1.x.
Always having the right version is nice. However, you can get similar benefits with a shorter bootstrap script. Personally I prefer this because there's less third party code duplicated directly to your own repo.
Yeah I followed the same path. I originally switched from npm to Yarn 1 for the speedup, but then Yarn 2 broke enough things that I never managed to upgrade.
pnpm is (at least) as fast as Yarn 1, and they managed to do it with a much better compatibility story (in my experience)
I’m not stuck on Yarn 1, I’ve chosen to keep using it because it’s the best tool for my usage. I tried Yarn 2 and found it unsuitable, because:
- it patches packages to work with its PnP system
- even if you opt out of PnP
- in ways that are opaque for me as a user
- and in ways that break usage, e.g. it broke a beta version of TypeScript because the patch it wanted to apply wasn't available for the beta
I’m sure the PnP feature is brilliant, but I don’t want to use it and I don’t want its behavior to be applied when I don’t use it. So Yarn 2+ is not an option for me. And since Yarn 1 is still supported, and meets my needs well, I continue to use it by choice.
Fully supporting the anecdata, different projects at my current company use all of the package managers — npm, yarn 1.x, pnpm, bun — except for yarn 2+
In personal projects I heavily gravitate towards bun as a package manager now; it's a drop-in replacement for npm but faster (and typing `bun run` is just so cute compared to `npm run`).
I'd love to use Bun for my projects, but it's not integrated into Corepack yet (and therefore you cannot pin the bun version w/ checksum in package.json)
yes, bun as a package manager is great. I wouldn't advise anyone to run a production app with its runtime though. "Drop-in replacement" for node was a lie, given how many bugs and how much unimplemented stuff there is.
I've been chatting with Jarred and it's obvious they care and want to make it better. But yes, the "1.0" release and claims like this hurt the sentiment for Bun. Hope it gets better, it's a seriously impressive (but monumental) undertaking.
Neither Yarn 1 nor pnpm have seamless interop with npm, but pnpm is clearer about this, and you get the benefit of its linked install strategy (which npm is adding, btw).
People really should get off of Yarn 1. It's not maintained, has a poor package locking system, and bugs will simply never be fixed.
I feel a bit scammed by yarn because I started using yarn when it came out and developed a muscle memory and appreciation for its CLI, and they just completely left the Yarn 1 users hanging while they have been messing around with the Yarn 2+ experiment. I don’t want to install Yarn 2 because it’s a weird hack, I just want them to go back to focusing on Yarn 1 goddammit.
I stayed away from yarn@2+ for years, but finally switched to 3.2.1 when I started working on a monorepo with nested workspaces.
I've actually been really happy with 3.2.1. It and wireit together make it super easy to work on a project that spans various libraries/repos. PnP was a compatibility nightmare, but node-modules mode works.
Using the yarn pnpm linker seems like the worst of all options. It's clearly not where the yarn team's focus is; relatively few people use it, so you're more likely to run into an undiscovered problem; and it's still significantly slower than actual pnpm.
Switched over to yarn a while back. I’ve been developing some front-ends without even knowing yarn 2 was out there… The other day I started a new Next.js project and reached for yarn, only to realize that it had jumped two versions (three now, apparently) and that I had to learn a bunch of new shit. I was like, nah, back to npm for me.
For comparison, not long ago I switched from pipenv to poetry and that was pretty much drop-in, with nevertheless markedly incremental improvements.
I love UI development. If only I could get to building the actual thing…
Seconding this. I've had npm aliased to pnpm for two years; it's worked well, it's very speedy, and it's saved lots of space too. I wonder how fast npm will borrow the new ideas from pnpm and yarn.
In the last few places I worked we dropped Yarn for pnpm. Did the job and now dependabot can understand it too, on github.
Corepack is a nice addition to Node to make it easy to install a package manager but it's also an unwelcome layer of abstraction (why isn't there just one?).
This is exactly what I did. I even participated in the Yarn project a bit before it was fully open, but as npm 6 and 7 brought major speed improvements, workspaces, and overrides, I don't see the need for an alternate package manager anymore. As a library maintainer using the same package manager that most of my users do is a pretty big help.
I'll be very happy when npm's linked install strategy is stable so that there's less reason to use pnpm and we get dependency isolation in monorepos.
> I don't see the need for an alternate package manager anymore.
I do see a strong need for lerna though. Lerna's ability to manage concurrency, topological sort-based builds and streaming builds is phenomenal and an absolute necessity in monorepositories. It is unfortunate that this functionality is not available by default in npm in any way.
Wireit's ability to specify actual script dependencies, do caching (and on Github actions), and its long-running service script support make it much more useful and comprehensive than Lerna.
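For context, wireit's script dependencies are declared in package.json, roughly like this (package and path names here are made up; the wireit docs have the real schema):

```json
{
  "scripts": {
    "build": "wireit"
  },
  "wireit": {
    "build": {
      "command": "tsc",
      "dependencies": ["../my-lib:build"],
      "files": ["src/**/*.ts", "tsconfig.json"],
      "output": ["lib/**"]
    }
  }
}
```

`npm run build` then builds `../my-lib` first (topologically) and skips work entirely when the declared input files haven't changed.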
That's me. But for some projects I've had to switch back to yarn, because npm no longer works to install node_modules, and I don't really care why. Yarn works on those projects, npm does not. If npm works for a project, I use that, but there are some where yarn works and npm simply errors and quits.
I am not sure when to use `npm workspaces` instead of lerna now? It seems like the answer (for me at least) was "you don't need lerna anymore given recent (past 2 years) NPM updates"
Yeah, my major foray into Yarn was to create workspaces for a monorepo without adopting other larger libraries with significant buy-in. I was glad I did it, and my team was, but it was a super-hairy experience trying to get all of the tooling happy within workspaces. I'm sure it's all a lot nicer these days.
the reason to try yarn for me was berry pnp to avoid those pesky node_module folders
but alas the experience has been bumpy on personal projects.
for something that has promised more robustness, it has been anything but that.
I still like the yarn cli more than the npm one, but I ended up just defaulting to npm.
Same here. Npm seems to have fixed the lock file issues it had, which was the main reason. With caching, npm is fine. On large projects npm ci can take 2 minutes - we leverage node modules cache on the lock file hash, and it’s typically fast.
npm has served me well, but recently I've been noticing that I don't need it at all. For the frontend, ESM modules from URLs work very well. For the backend, deno can be a good alternative if it means not needing to use npm anymore.
Yarn made the same mistake going from 1 to 2 as Angular did. The newer version was so different. Both should have been named something else. At a certain point of incompatibility and divergence it is a different product, and the benefit you get from using a well-known name is outweighed by the cost of confusion.
Yarn 2 brought a lot of great improvements but since it was named "Yarn" people were expecting it to behave somewhat like the first one.
Sometimes I wish for a SemVer iteration that does away with the major version, and requires a project to create a new library wholesale if they want to increment the major version. Every library would have minor and patch releases, but if they want to substantially change the API, they'll have to create a new library instead and ask people to move to it instead.
The problem with semver is what constitutes "substantially"; because it's been left to subjective interpretation, we end up with these weird situations. What may invoke a feeling of "This should have been a new library" is different for each team. Even a minor change with a deprecated method can invoke such a reaction depending on how a team has been using that library.
No, c'mon, that's taking it too far. React, for example, has been doing a wonderful job. They deprecate during one major version, and then maybe they'll remove in the next major version after that. Usually stuff just keeps working. Maybe you have to add an `UNSTABLE_` prefix or something here and there to maintain backwards-compat, but all the breakages have been very minimal and they give you like a year for you to clean up your usages of any deprecated functions so that that your next upgrade is seamless.
Except that forces people to always try to stay bleeding edge, otherwise upgrading in the future becomes a huge hassle.
Compare this with Clojure where you can almost guarantee that you can move between any stable version without having to change your code, even if you jump 4-5 versions.
Yarn lost the plot when going to v2. It was very different from v1. Even attempting a migration was painful enough to write it off completely in the projects I was working on. I've yet to come across any new projects using Yarn since then. Now, I think it is too late. pnpm has more or less caught up and even surpassed any advantages Yarn had, with a much more pragmatic compatibility across multiple platforms.
Personally, I've moved on to pnpm (with npm as a fallback when I can't use pnpm).
a) pin your dependencies, set save-prefix='' in .npmrc, use a tool like pnpm outdated, Renovate, Dependabot, npm-check-updates to keep them updated. Doesn't need to be done or enforced in the package manager, Git Hooks and CI are sufficient.
b) use syncpack to ensure all your dependencies in different workspaces use the same versions. This doesn't need to be done in the package manager.
pnpm also has pnpm licenses to help keep compliance happy.
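For (a), the pinning bit is one line of config; the rest is tooling run in CI (exact keys from memory; check your package manager's docs):

```ini
# .npmrc — save new dependencies with exact versions (no ^ or ~ prefix)
save-prefix=''
# equivalent and arguably clearer:
save-exact=true
```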
If someone wants to write a package manager in Rust for speed (pnpm is already moving in this direction, see https://github.com/pnpm/pn and https://github.com/pnpm/pacquet ), we'd take a look. Otherwise - not enough benefit to switching away.
I'm just trying to switch to pnpm and I'm a bit lost on how to dockerize packages in my workspace, something that I thought would be a common thing to do.
The example in their docs [1] seems to just be wrong
This of course will not work, since the node_modules folders of packages in a pnpm workspace just contain symlinks to the virtual store at the root of the monorepo, not the actual modules.
I don't see any good way to do what this example pretends to achieve (minimize docker image build size) since even when hoisting is forced, pnpm seems to only ever hoist node_modules in the root of the repo.
Sorry for this somewhat unrelated rant, I was just surprised at hitting such an obstacle immediately after trying to adopt pnpm after I heard so much praise for this tool.
Check out Depot's example Dockerfile for node + pnpm: https://depot.dev/docs/languages/node-pnpm-dockerfile which is best-practices based even if you're not using Depot. They also explain each of the lines in the Dockerfile.
As someone struggling with Docker caching at work at the moment, I think your problem is less with pnpm and more with the difficulty of writing decent Dockerfiles.
This won’t work in the case of the workspace setup since the node_modules folder for my app just contains symlinks, not the actual node_modules, that’s the essential problem I’m facing.
Right, what the Dockerfile is doing is building the store first inside the image using pnpm fetch. Then, when the Dockerfile does pnpm install inside the image, the resulting links in the generated node_modules are pointing to the store path inside the image. When you do COPY . ., either your .gitignore file is already ignoring node_modules or you have a separate .dockerignore file which should also exclude node_modules; thus, when copying files into the image, you're never copying the node_modules folder and thus links inside the node_modules folder to places on your host laptop are irrelevant.
Hm, thank you for the explanation but I'm still not 100% sure how it would apply.
> thus links inside the node_modules folder to places on your host laptop are irrelevant
I'm not talking about the host computer at any point of this, this is about the workspace setup, all of which happens inside the image, since packages within the workspace depend on each other the whole workspace needs to be built together in the image.
I am willing to admit that I am missing something but I'm just not sure what exactly.
Whenever pnpm installs workspace dependencies, it installs them at the root of the workspace, (in the store inside the docker build image of course, not on the host), and those are the dependencies for all 3 packages all together in one virtual store, here:
./node_modules/.pnpm
So, when I want to create my container image for say packages/app1, I don't see how I could copy only my dependencies for that app from the build image like this:
Because while of course the dependencies are installed in the build image, they are at the root of the workspace in virtual store there, and not in /app/packages/app2/node_modules/ – this directory only contains symlinks to the root virtual store.
Of course I can copy all the dependencies from the root virtual store of the build image into my image, but then those are the dependencies for ALL packages in the workspace, not just for app1
I suppose I could try to install only the dependencies for app1, but this is broken with the default pnpm settings at the moment (it still installs dependencies for everything in the workspace)
After talking to one of the contributors on Discord, it seems that they have a special "deploy" command for exactly this (copying files and dependencies for a single workspace package) which I had overlooked since the documentation for it wasn't so self explanatory, they have now updated the docs for this command [1] and opened a PR to update the docs for the Monorepo Docker example to use it instead [2].
I have to say I'm impressed with how responsive the maintainers were to my question, and this `pnpm deploy` workflow does actually make sense to me.
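For anyone landing here with the same problem, the shape of the `pnpm deploy` approach in a multi-stage Dockerfile is roughly this (image tags, the package name `app1`, and paths are placeholders; the updated pnpm docs example is the authoritative version):

```dockerfile
FROM node:20-slim AS build
RUN corepack enable
WORKDIR /repo
COPY . .
RUN pnpm install --frozen-lockfile
# Copies packages/app1 plus a self-contained node_modules
# (real files, not symlinks into the workspace store) into /out
RUN pnpm --filter=app1 deploy --prod /out

FROM node:20-slim
WORKDIR /app
COPY --from=build /out .
CMD ["node", "dist/server.js"]
```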
If you're concerned about copying the entire virtual store into the final production image, then you could also run https://pnpm.io/cli/deploy to get a smaller dist folder suitable for copying into another container layer, did you try that?
I'm considering this path, yes – just find it strange that the docs have this example that clearly can't work due to how the package manager functions.
Adding bundling with another tool like rollup also brings another layer of complexity, now I'm starting to feel like I should just use bun.
pnpm is more than just pnpm install, there's an evolving ecosystem of tools. For a mainstream project looking for tools to depend on, either everything supports bun, or it might as well be nothing.
Same. I was using yarn 1 until a couple years ago. Now I just use npm. What are the downsides? I have no idea. If there are, they must be negligible because the apps work fine, customers are happy, and devs are able to keep building. We must not have any crazy dependencies because I don’t see much performance gains. At most we might link other packages as we develop.
I think if I really went searching for problems I might find something. I also just try to avoid JS these days unless it’s UI, and there I keep things very lean and straightforward.
The sole reason I haven't migrated to npm, to be honest, is the lack of a replacement for "yarn run".
I use "yarn run" incredibly frequently, for things like "yarn run nodemon" and "yarn run tsc" or other executable packages that are local dependencies.
`npx <package>` does run project installed binaries, if they exist (node_modules/.bin/<package>). In that scenario it's equivalent to `yarn <package>`. Otherwise yes it falls back to a global package, and a prompt to install if missing.
That's the thing with tools like this. Eventually whoever you one-upped just incorporates your features and now they have the edge again - your new features and their greater experience.
Yarn 2.0 was ambitious, but in hindsight, it probably would have been better to make regular node_modules the default rather than pushing PnP and zero installs (and alienating most users in the process).
I think Yarn Berry with node_modules linker is strictly better than Yarn 1.0, whereas PnP and zero installs involve tradeoffs that might not be right for everyone. There's a lot of great design decisions in Berry that gets muddled with all the PnP-related discussion. I will say Maël deserves a lot of credit for driving corepack, which makes version control of package management totally seamless. It's always been a total nightmare but now it's shockingly easy.
Yarn's versioning is really confusing. Is 4.0 compatible with 1.x? Is it a successor to Yarn 2 (with the different functionality)? What happened to 3? Haven't really kept up with this since 2 wouldn't work with any of my projects, not sure where to pick up again now...
I mean... Why would it be compatible with 1.x if 2 wasn't? There is an upgrade guide to upgrade from 1 to 2 and later. 3 has been out for a long time. Not sure how their versioning is confusing. It uses the major version number to signal (possibly) breaking changes.
IME most projects use semver to indicate possibly breaking changes that might require a manual fix or three, not a whole different project that requires 100% changes across the board and starting over from scratch. Even node itself doesn't just kill your projects with every major version.
Judging by the other comments in this thread, and the blog post saying that "zero-install is disabled by default", I had kinda hoped 4 would be a reboot of Yarn v1 with modern conveniences. This does not appear to be the case.
I'm still not sure whether it goes like "2.x -> 3.x -> 4.x are all in the same line, and upgrading from 2 to anything beyond that is easy" or if they are each as different from each other as 2 was from 1. That's the confusing part. (Like, did the 2.x line stabilize into 4.x, or are these just four separate projects altogether?)
I think it would've been a lot clearer if they just named Yarn 2+ something entirely different, like Moment did with Luxon
As a Bun convert, it's interesting that they felt the need to have a preemptive response for Bun as part of this major release. The response itself seems unconvincing though: Basically, Bun is a lot faster and simpler, but they think they can catch up.
I'm not convinced they can catch up even on speed, and Bun ergonomics are also a lot nicer from the get go.
Funny enough, I got our work project working with yarn berry (whatever number it was this summer) in 10-15 minutes, but I couldn't manage to do the same with bun. I'll try again now that they've had a few minor updates, just to see if it's better...
Converted existing large NPM workspace-based projects. It depends which part of Bun you're talking about. Do you mean the package manager didn't work or the runtime didn't work? I would be a bit surprised if the package manager was not a drop-in replacement, but I know they recently fixed a lot of bugs too.
Both. This was a while ago (early this year, late last year?). I didn't care about the runtime, but the package manager kept crashing, and I spent a few hours digging through the error logs and trying to fix each error one by one, but never got far enough to actually successfully build my project. (It was just a basic static-file TypeScript app with lots of older dependencies).
I unfortunately don't have the logs anymore. I'd be happy to try Bun again on new personal projects, but I'd be afraid of using it for any real-world work project at this point; the risk of having to spend time debugging it isn't worth the performance improvements (since the packagers are usually just run on CI/CD anyway, and local `next dev` or similar is already fast enough).
Actually, now that you mention CI, I remember that I did also run into non-deterministic problems running type checks on projects with Bun using GitHub's free runner.
Errors looked like this, but it worked fine every 2/3 runs with the exact same code:
`tsconfig.json(5,25): error TS6046: Argument for '--moduleResolution' option must be: 'node', 'classic'.`
It was fixed by just running on my own runners instead, and I never hit those errors building locally. Very weird.
Great release which solves the biggest painpoints with the latest Yarn versions. I still like zero-installs and yarnPath providing Yarn for anyone using the project, but not having zero installs and using corepack to provide Yarn seems like the right choice which reduces friction for adoption.
Also, not having to install plugins such as interactive-tools is sooooo good. I have no idea why these are plugins in the first place, as they are so essential to using Yarn.
I still wish PnP faced better adoption though. In most projects I still go back to using the pnpm or the node-modules linker because of various issues with PnP which are sometimes hard to debug.
In other news, why the hell is corepack in Node still experimental and not enabled by default? I simply don't get it.
As an outsider to the JS world, can someone give me the quick pros and cons between yarn and npm? Can you switch back and forth in the same project? Is the end result of your package structure the same?
The end result is essentially the same, yes - all directly requested dependencies, and all of their transitive dependencies, are extracted and installed on disk into `./node_modules`.
Yarn came out at a time when NPM was particularly slow and buggy. NPM has caught up some since then, but I _personally_ still like Yarn better. I find the CLI output more useful, and it feels like it installs in a more consistent amount of time.
You should only have one of them in use in a project at time.
PNPM is another alternative package manager that installs the same packages, but tries to use symlinks to a single globally-cached copy of each package to save disk space. (I believe recent versions of Yarn have an equivalent option, but it's not on by default.)
> The end result is essentially the same, yes - all directly requested dependencies, and all of their transitive dependencies, are extracted and installed on disk into `./node_modules`.
Well, no. Not if you use Yarn's zero-install or PnP or whatever they call it. It doesn't create node_modules and thus it's not very compatible, and that's the problem everyone is complaining about.
They each generate their own lockfiles that contain the dependency graph so they can’t be used together. They have different utilities in the CLI for example- NPM has npm audit, which yarn doesn’t. They’re mostly the same but you have to keep in mind they have different philosophies in how to manage the dependency graph.
TLDR: Choose one, stick with it, try not to have multiple node versions per machine/VM if you can help it, and update & test your packages in small frequent increments if you can. Otherwise it gets to be a nightmare real quick.
Yarn 1.x came out because NPM was really slow and buggy back in the day. Yarn was fast and buggy. Then Yarn 2 came out, was even faster and less buggy, but not backward compatible. I've never actually seen it used in the wild, even now (it's just yarn and NPM on all the projects I've worked on).
Bigger differences will come trying to upgrade multiple packages at once, especially across major versions (e.g. 2.x to 3.x). It's going to be hard no matter what, but `yarn upgrade-interactive` makes it a bit simpler, while npm has third party plugins that do similar things. There's also complexities involved if you need to switch the underlying `node` version across projects, since npm and yarn both need different node versions depending on their own version.
Their final outputs are similar (a node_modules folder where all your JS dependencies and sub-dependencies live), but their internal workings are different, and you shouldn't use both together because they will conflict with each other (sometimes).
Generally, switching back and forth isn't hard on smaller projects: just delete the lock files and delete the node_modules folder and reinstall everything from scratch.
But you really don't want to get into the habit of doing that. There's no real benefit (just choose one and stick with it; either is fine these days), and you may introduce hard-to-catch bugs related to one of your thousands of dependencies. Especially on bigger projects, erasing the lockfile means that it will try to use the designated SemVer in your package.json to get the desired version, but that may be different than the actual version that was being used, which is stored in the lockfile as a manual override. I'm not sure why these package managers had two sources of truth, but it's a nightmare.
0. JS world is full of people captivated by shiny things.
1. yarn was a shiny version of npm
2. They did a Python 3 stunt (well...much worse) with yarn 2 so nobody uses it and everyone stuck with 1.x
3. Bugs aren't fixed in 1.x
4. So back to npm, or the next shiny thing.
Edit: no you can't switch back and forth. The tools end up storing internal state in your project repo. I mean, yes you can change, but it'll mean some hassle and possibly re-writing build files.
> 0. JS world is full of people captivated by shiny things.
I mostly write C code these days, so I decided to check out node to see what it was all about, and the shiny-thing comment is spot on. A number of things that I looked at were described as "old" by people because they had been replaced by the new hotness a few months back.
Don't let that dissuade you too much. Yes, the community likes shiny, but you're not forced to upgrade if you don't like shiny. Just pin your versions and be happy.
And it was so much better that, if they hadn't pulled the Python 3 (but worse) stunt, I suspect a significant number of projects would have continued using yarn even after npm had caught up in terms of features and performance.
Yarn 2 was just a colossal mistake. In nearly any real project it causes constant churn and is just a perfect example of what happens when you let some computer scientist do whatever they feel is nice from a theoretical perspective rather than based in reality.
There isn't even any arguing with it being an abject failure. It's just abundantly obvious when you look at how many people keep using yarn 1. These patch notes even prove it since they realized how insane it is to make their zero install thing the default.
The new JavaScript powered constraint engine looks amazingly simple & elegant.
We definitely suffer badly trying to make the very simple example shown happen across our projects: how do we make sure everyone is using the same version of React (for example)?
I'd be curious to know how this rules engine is usable outside of Yarn. The post talks about it being used at Datadog, seemingly on their app. But I didn't see or missed info on the package itself & how it might be embedded elsewhere.
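Based on the snippets in the release post, the engine is configured via a `yarn.config.cjs` file at the repo root and run with `yarn constraints` (the React pin below is an example, not a recommendation):

```javascript
// yarn.config.cjs — Yarn 4's JavaScript-based constraints engine
module.exports = {
  async constraints({ Yarn }) {
    // Force every workspace that depends on react onto the same version
    for (const dep of Yarn.dependencies({ ident: 'react' })) {
      dep.update('18.2.0');
    }
  },
};
```

`yarn constraints` reports violations and `yarn constraints --fix` applies them; as far as I can tell the engine is tied to Yarn's own workspace model rather than being an embeddable standalone package.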
Also super notable from this release, turning off zero-install by default. Still such a neat idea. Even though most package managers are reasonably fast now, there's still often multiple hundreds of MB of files copied out of cache, for each project, and the idea of just directly using the cache without copying stuff in feels like a real nice to have, a potential big saver of disk space.
Kinda surprised by all the "stuck on Yarn v1" comments so far...
We went through the same "~2015 everyone uses yarn", "~2020 everyone goes back to npm" cycle, but in the last ~year or so are back on yarn v3 for...reasons that I forget (oh yeah, the ability to download multiple platform libs for when you're running in docker [1])...but :shrug: it's been great.
My only complaint is that `yarn dedupe` should be automatic (as it was in yarn v1), or at least via a flag. :-D
But otherwise yarn v3 (with the node_modules linker) has been great, and look forward to using v4.
Wow so much negativity here, I just can’t imagine not having Yarn PnP for my project with 15+ packages all in a monorepo. It’s super easy to patch 3rd party packages, fast at installing and adding new dependencies, and it plays so nicely with Nix which is great for CI/CD, as with a bit of tooling all of your packages will be cached between builds.
Even some “heavyweight” frameworks which I didn’t expect to work, work fine, such as NextJS.
I'm also in the "still using Yarn 1.x" camp. The problem is that yarn is installed to the global path by a package manager from outside the Javascript ecosystem, which means that the same install of yarn is shared between all the projects I might run on my system, including projects I don't maintain myself and including projects that aren't maintained at all. I'm open to other JS package managers if they offer benefits, but the one that runs when I type "yarn" in my shell has to be backwards-compatible with 1.x, forever. Repositories like Debian/apt-get and homebrew operate on a similar philosophy, and still offer only 1.x.
So... why didn't they just put a package-manager version number in package.json? I would have no problem with projects requiring Yarn 2+, if upgrading to Yarn 2+ didn't create an obstacle to continuing to run everything else that still uses Yarn 1. As it is now, I'm not willing to even _try_ the newer versions, because I expect to have to revert and I'm worried that I'll lose a day to the mess that the uninstall/revert will create.
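For what it's worth, this mechanism does exist now: the `packageManager` field in `package.json`, which Corepack (shipped with recent Node versions) reads so each project pins its own Yarn version independently of the global install. A minimal example (name and version are placeholders):

```json
{
  "name": "my-project",
  "packageManager": "yarn@4.0.0"
}
```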
I find there is less room for yarn in the ecosystem than back when npm was crazy slow.
In my testing, npm has gotten a lot faster than it used to be, and it is often faster than yarn 3.x when doing installs in a CI system. npm also has decent workspace support, though it's missing the important ability to do topological sort-based builds (which I still use lerna for.)
I also found that `bun install` works great, is fast, and is a drop-in replacement for npm.
I came here to complain about how there are two JS package managers that are virtually identical, and learned there is a third. Fellas, sort this out. This is a waste of human potential.
(I also still think package managers should be language and ecosystem independent. We just need one way of getting named and versioned folders of files from the internet.)
My own experience is that Yarn 2/3 have worked great, as long as you stick with the "`node_modules` linker" option rather than the "PnP/Zero Install" option.
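Opting out of PnP is a one-line setting in `.yarnrc.yml`:

```yaml
# .yarnrc.yml -- use a classic node_modules tree instead of PnP
nodeLinker: node-modules
```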
I looked into switching to yarn for the first time earlier this year, since I was having minor multi-platform issues with npm and was pretty sure yarn could fix them. Since the docs seemed to make PnP/Zero Install the "blessed path," I tried to go with that. I quickly became confused at the correct way to configure certain aspects for my org in conjunction with our build pipelines, decided it was way too much unnecessary complexity to teach to my org, and kept our npm setup.
Getting it to auth properly with GCP artifact repository in GitHub actions was a bit of a pain. Conflicting documentation between Google, Yarn and GitHub didn’t help.
But isn't that nearly defeating the entire point of the newer versions? They basically tried to go around the entire ecosystem, and having to not use the main feature that drove the development of yarn berry just seems like proof that it was a mistake.
There's other improvements too. Anecdotally I've seen Yarn 2+ install faster than Yarn 1. The UI output is more informative. The workspace behavior has been pretty solid. The "install Yarn 1 globally, Yarn 2+ per-repo" bootstrapping behavior is a bit quirky, but it does make it nice that everyone using the repo is using the same Yarn version for actual execution.
I like the _idea_ of PnP conceptually, but my experience was that there's too many other tools that depend on having `node_modules` on disk for things to work out okay.
I was a yarn 2+ convert because of how much faster installs were (and the corresponding reduced disk bloat). It unfortunately doesn't play well with Angular, and lots of people fall back to the "node_modules" strategy which does away with much of the benefits.
Is there any good comparison out there of the JS package managers and the trade-offs they make? I happen to really like the disk/time vs compatibility trade-off yarn 2 makes, but not the zero-install (which had been the default for a time). But hearing that Bun is faster, with better ergonomics? And other people mentioning pnpm? That makes me want an in-depth package manager comparison.
I want to take a minute to rant about the insanity of using node_modules as a caching location. That's the big reason for the incompatibility between npm and any package manager that attempts to clean up the node_modules mess (though the impact varies by solution). Maybe I'm just spoiled, coming from Java, where you can expect your libraries to live in read-only archives, and tools keep their caches separate from the runtime.
Just use pnpm. It uses the same node_modules so you get 100% compatibility with all sorts of frameworks/build systems/IDEs, but stores all files in a single global content-addressable directory, and hardlinks everything from there (or reflinks if your filesystem supports it). It also hides indirect dependencies so you don't import them accidentally (you can still use a flattened directory if you need it).
If you install libfoo@1.0.0 and libfoo@1.0.1 and the only difference between them is a single file, the second copy will only add that file to disk.
If you then install that libfoo@1.0.0 into a hundred projects spread across your disk, no files will be copied (both reflinks and hardlinks only add a bit of filesystem metadata).
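(A toy illustration of that last point, independent of pnpm itself: hardlinks share a single inode, so "installing" the same file into many projects adds only directory-entry metadata, never a second copy of the data.)

```shell
# Hypothetical store + two projects, standing in for pnpm's layout.
mkdir -p /tmp/pnpm-demo/store /tmp/pnpm-demo/project-a /tmp/pnpm-demo/project-b
echo 'module.exports = 42;' > /tmp/pnpm-demo/store/index.js

# "Install" into both projects via hardlinks, as pnpm does from its store.
ln -f /tmp/pnpm-demo/store/index.js /tmp/pnpm-demo/project-a/index.js
ln -f /tmp/pnpm-demo/store/index.js /tmp/pnpm-demo/project-b/index.js

# All three paths report the same inode, i.e. the same on-disk data.
stat -c %i /tmp/pnpm-demo/store/index.js \
           /tmp/pnpm-demo/project-a/index.js \
           /tmp/pnpm-demo/project-b/index.js
```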
I’ve been using the new yarn (with workspaces and Plug’n’Play) on a reasonably complex JS project for about 2 years and I think it works great. Congrats to the yarn team on this big release.
I've been waiting for this for the .env support in .yarnrc.yml. We can now keep our GitHub PATs in a .env file instead of having to add them to the shell environment, or risk leaking them by putting the PAT directly in the .yarnrc.yml.
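For reference, `.yarnrc.yml` supports `${VAR}` interpolation, so the token can live in the environment (now loadable from `.env` per this release). A sketch with an illustrative registry URL:

```yaml
# .yarnrc.yml -- the auth token comes from the environment, not the file.
# GITHUB_PAT is assumed to be set (e.g. via .env); the URL is illustrative.
npmRegistries:
  "https://npm.pkg.github.com":
    npmAuthToken: "${GITHUB_PAT}"
```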
Why can't the JS ecosystem simply accept its non-complexity, deal with it and be done?
I know my share of frontend technology, React, Angular, AngularJS, MUI, Bootstrap and so on, Vue, Next, Nuxt, Svelte. Others.
None of it is really complex; there are competing but increasingly convergent patterns at play, just fighting for views, stars and clicks.
In the end, it boils down to the ever same boilerplate JS feeding a browser.
Please don't misunderstand me: JS can be as complex as any "turing-complete"¹ language. But the things which seem to bother us have been solved over and over again, and there are no new solutions in sight, only reformulations.
¹ there is no real turing-complete language out there, since memory is limited. Flamewar inc.
The history of yarn is fascinating. Dependency management in our monorepo is super smooth since we use it. We are on version 3.6 right now.
I admire that arcanis tried something very bold with plug and play and zero installs in versions 2 and 3, but is willing to default back to node_modules since it didn't stick. It must have been hard to come up with something this good, and then see nearly everyone reject it.