
Snowpack 2.0 - pspeter3
https://www.snowpack.dev/posts/2020-05-26-snowpack-2-0-release/
======
orf
Some of this wording confuses me and should probably be reworked:

> Snowpack is a O(1) build system… Every file goes through a linear input ->
> build -> output build pipeline

Seems like O(n) to me?

> Snowpack starts up in less than 50ms. That’s no typo: 50 milliseconds or
> less. On your very first page load, Snowpack builds your first requested
> files and then caches them for future use

So you can open a socket in 50ms? Seems disingenuous to imply anything takes
50ms when really you’re just waiting until the first request to do anything.

Looks like an interesting project though.

~~~
dandelany
Maybe I'm misunderstanding, but it seems clear to me: Other bundlers = change
one file, `n` files are rebuilt/bundled. Snowpack = change one file, only that
one file is rebuilt. Building "from scratch" will necessarily be O(n), but
incremental rebuilds can be O(1), no?

~~~
chipperyman573
An incremental rebuild will still be O(n), but in this case n=1. That isn't
the same as O(1), where the number of operations remains the same regardless
of input size. This isn't just being pedantic: O(1) implies they have a
separate algorithm whose cost doesn't grow as the input grows, which is
totally different from intelligently running an algorithm whose runtime grows
linearly. While in this case the resulting runtime will be the same, the
nuances implied when they say something is O(1) will not hold.

~~~
dandelany
Surely that depends on what you call `n`, they are using `n` to refer to the
number of files in the project.

I would agree that the term is a bit of "shorthand" that doesn't perfectly map
to the idea of big-O complexity - for exactly the reason you mention, build
time still depends on the size of the file. But for me, it was shorthand that
helped me understand the idea. They had two other options on how to write this
- either leave out the big-O stuff entirely, or explain it more deeply eg.
"most bundlers are O(lmn) where l is the number of files, m is the number of
files changed, n is the number of JS tokens per file" or something. Both of
these options may have been more "technically correct" but would've taken me
longer to grok the idea than the way they have it written. Maybe they should
just have a footnote for the pedants explaining that O(1) is more of an idiom
than a technical claim in this case :P

~~~
chipperyman573
Yes, exactly. O(n) means that as n (the number of files in the projects)
grows, so does the runtime complexity. O(1) means that as n grows, the runtime
complexity remains constant. In this case, they're always using an input of
size n=1, but this doesn't make the algorithm itself O(1). By calling this an
O(1) operation, they imply that you could rebuild your entire project at the
same rate you can rebuild a project with just one file changed. This is
misleading and untrue, which is why it's not pedantic.

It wouldn't be a problem if it wasn't on a technical page like this where this
distinction can have large implications (such as the one above). When I read
that it was O(1) I drew conclusions that were both very favorable to snowpack
and also completely untrue. It'd be like if you said you drove a truck but you
actually drove an accord. It's probably fine to say that 99% of the time, but
it could cause issues if you say it to your mechanic because they'll conclude
things from what you said that might not be accurate.

~~~
msandford
If you have a project with 1000 files in it, how often are you editing all
1000 of them at the same time? Virtually never from my experience.

The way they've structured "rebuilds" is to only "build" (or really, probably
do very little if anything to) just that one file you edited and saved.

Yes if you edit all 1000 files it's going to take longer.

"In theory there's no difference between theory and practice. In practice,
there is." I think this is one of those cases where in practice it's awfully
close to O(1) so while they're technically incorrect practically it's
difficult to tell the difference.

This is especially true compared with something like Angular, where you're
looking at many, many seconds of build time to get started. I think it's
laudable.

~~~
outworlder
> If you have a project with 1000 files in it, how often are you editing all
> 1000 of them at the same time? Virtually never from my experience.

For the purposes of Big-O notation, this does not matter. If they didn't need
its semantics, they should not have used the notation. Simple as that.

It's still O(n), regardless of the fact that they have optimized constants and
in the best case it may be less than that. Irrelevant. Big O establishes an
upper bound. Bubble sort is still bubble sort, we don't care if it is quick
when you are ordering two elements.

Maybe they meant to advertise Ω(1) on the best case, and compared to other
build systems?

EDIT: another poster says that "n" refers to the number of files in the
project. Still misleading. Usually we are interested in how the complexity
grows as the number of inputs grow. Big O purposely discards constants.

They could say that other build systems are O(n) where n is the total number
of files, while this one is O(n) where n is the number of modified files. It's
immediately clear then how this is better for the build use-case, while still
making it clear how efficient it is as the input size grows.

~~~
spinningslate
> They could say that other build systems are O(n) where n is the total number
> of files, while this one is O(n) where n is the number of modified files.
> It's immediately clear then how this is better for the build use-case, while
> still making it clear how the efficient it is as the input size grows.

That's a great, concise and clear articulation. The project would do well to
quote you on that!

If anyone's still struggling with the difference between O(1) and O(n),
there's a common example with lists that might help:

1. Getting the head of a list is usually O(1). It doesn't matter how long the
list is, getting the head element always takes the same amount of time.

2. Getting the length of a list is usually O(n). The time taken to count the
list entries grows in proportion with the length of the list.

As an aside, note also that a list with a single entry doesn't make the length
function O(1).
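
That list example can be sketched in a few lines of plain JavaScript (a toy
linked list of my own, nothing to do with Snowpack's code):

```javascript
// Toy singly-linked list cell: a value plus a pointer to the rest of the list.
function cons(head, tail) {
  return { head, tail };
}

const list = cons(1, cons(2, cons(3, null))); // the list [1, 2, 3]

// O(1): reading the head is a single field access, however long the list is.
function head(l) {
  return l.head;
}

// O(n): counting entries has to walk every cell, so time grows with length.
function length(l) {
  let n = 0;
  for (let cell = l; cell !== null; cell = cell.tail) n++;
  return n;
}

console.log(head(list));   // 1
console.log(length(list)); // 3
```

Prepending one more cell leaves `head` just as cheap, but makes `length` walk
one more step - which is the whole point of the distinction.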

------
elpool2
I started using Snowpack just last week, but I'm not even using the dev server
or the bundler part. All I really needed was its ability to convert npm
packages into single-file ES modules. Once everything is an ES module you can
just let the browser load them all, no bundler or dev server needed at all in
your dev cycle. The only dev-time conversion needed is the compilation from
TypeScript to JS, which my IDE already does instantly whenever I save.
Previously this worked fine for all our own code but not for dependencies, so
I'm pretty happy Snowpack was able to solve that problem.

~~~
gavinray
Snowpack's web_modules build step produces a single-file ESM bundle for each
NPM lib?

I wasn't aware of this, that's actually a pretty cool feature and incredibly
useful.

A bit unlearned on ESM modules, how are they different from the isomorphic
browser/Node single-file bundles produced by Webpack/Rollup?

~~~
elpool2
Yeah, that's basically what it does.

An ES module is just a js file that can be imported by other js files with the
"import X from 'module.js'" syntax. This is different from CommonJS modules
which use the "var x = require("module")" syntax. Modern browsers (Chrome and
Firefox) know how to load ES modules, so if all your js code is written as ES
modules you can just load your main.js file and then the browser will go fetch
all the other modules it depends on. It used to be that you needed a loader
like require.js to handle loading all the module dependencies, but that's no
longer the case.

But you run into an issue when you want to use 3rd party libraries that you've
installed using NPM. Most libraries still use the CommonJS syntax, because
that's what Node uses. Since they're not ES modules the browser can't fetch
them natively and you need to have either a loader like requirejs or a bundler
like Webpack. Snowpack will convert each NPM library into an ES module for
you. So you just run "Snowpack install" once, and then add "import X from
'/web_modules/module.js'" to your code, and you're set.

You still need to bundle before you ship your code, because not all browsers
speak ES module and there are still performance benefits from
bundling/minifying/etc. But having everything just load in the browser
natively when you're developing is quite nice.
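
A minimal sketch of what that dev-time setup can look like (paths and the
preact package here are illustrative, not from the parent comment):

```html
<!-- Dev-time page, no bundler: the browser fetches the modules itself
     by following the import statements. -->
<script type="module">
  // An npm package converted to an ES module by `snowpack install`:
  import { h, render } from '/web_modules/preact.js';
  // One of your own ES modules, compiled from TypeScript on save:
  import { App } from '/js/app.js';
  render(h(App), document.body);
</script>
```

Editing `app.js` then only requires re-saving that one file; the browser
re-fetches it on reload without any bundling step in between.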

~~~
gavinray
> Since they're not ES modules the browser can't fetch them natively and
> you need to have either a loader like requirejs or a bundler like Webpack.
> Snowpack will convert each NPM library into an ES module for you. So you
> just run "Snowpack install" once, and then add "import X from
> '/web_modules/module.js'" to your code, and you're set.

Ahh understood!

------
renewiltord
Okay, this is really cool but I don't want to "create a snowpack app". I just
want a "If you're using webpack + babel and want more speed, do this" thing.
With the webpack dev server builds aren't _too_ bad for the size of thing I'm
working on.

~~~
cactus2093
That’s basically what the rest of the docs are for. I’ve been playing with it
recently, and there is a learning curve but probably less than learning
webpack from scratch.

I also found it useful to look through the code in create-snowpack-app; it's
not very dynamic or complex, and the config files are written in a simple way
and get copied over or extended by the app that the tool creates for you.

------
k__
For everyone who was as confused as me:

It's basically a tool that allows you to develop without bundling, but it
still bundles for production via Parcel.

So it's not a Webpack/Parcel/Rollup killer.

~~~
jnwr
FYI, Snowpack uses Rollup for its production bundling.

~~~
keb_
Can you elaborate on that or provide a link to where you learned this?
According to their site, they only maintain two official plugins for
production builds (Webpack & Parcel). I'm coming from using Rollup, so I would
prefer to use Rollup instead.

[https://www.snowpack.dev/#snowpack-build](https://www.snowpack.dev/#snowpack-build)

------
mgoetzke
I just tried it in a @microsoft/rush project of mine.

Added a new project with 1 dependency (which contains a single one-liner
function to return a test string). No other dependencies.

Takes about 30s to start. Not sure whether the fact that my dependency is a
link with many siblings due to rush and pnpm is an issue, but it is a far cry
from 50ms.

Also, I did not get it to reliably pick up when the dependency changed (cache
invalidation most likely has a strategy that's incompatible with `npm
link`/`pnpm`).

Snowpack in principle looks nice, but I think I need something else.

------
flanbiscuit
I really don't get a sense of what Snowpack exactly does from their website,
but I found this blog post useful:
[https://blog.logrocket.com/snowpack-vs-webpack/](https://blog.logrocket.com/snowpack-vs-webpack/)

~~~
gavinray
Webpack/Parcel: Save file changes, app bundle gets regenerated to hot-reload,
this takes a bit of time.

Snowpack: Run a module build script once (or again when adding new NPM
libraries) on a project to generate some assets, but no re-bundling time
between changes.

~~~
mekster
> takes a bit of time.

Parcel's incremental build is < 100ms, so I'm not sure how Snowpack would feel
any better for me.

------
0az
How does Snowpack compare to Rollup? I use Rollup because it's light-weight
and dependency-free.

~~~
k__
AFAICT it allows you to develop without bundling, but for production it still
bundles with Parcel.

------
mavsman
Shameless plug for those of you who prefer video tutorials to written
[https://youtu.be/nbwt3A9RzNw](https://youtu.be/nbwt3A9RzNw) It's an intro to
Snowpack v1 but it'll still give you a good idea of what Snowpack does and how
it differs from Webpack. I would agree that Snowpack isn't quite there for
production projects, mostly due to the fact that many projects still don't
ship their modules as ES modules.

------
orra
I find this interesting. As a mainly desktop developer now doing web frontend
work, the JS ecosystem has been so frustrating.

Bundlers struck me as unnecessary given JS now has native module support, and
that is the premise of this project.

Some out-of-memory issues when bundling certain dependencies, and slow "npm
start" times with React, have only strengthened my initial impressions. So
again, this could be a welcome improvement.

~~~
atrilumen
You'll likely still want to bundle for production, though, for the
optimizations like minification, dead code elimination, module splitting, etc.

But yeah, JS is Crazy Town. It can be very frustrating.

( Be wary of dependencies. )

------
stefan_
I don't think they understand what O(1) even means.

~~~
adtac
Not just that:

> Some bundlers may even have O(n^2) complexity: as your project grows, your
> dev environment gets exponentially slower

They seem to not understand the difference between exponential and quadratic
either. This is appalling.

~~~
phist_mcgee
Eh, so what, they're not using the correct mathematical term for complexity.
Do most developers care whether it's quadratic or exponential? I don't,
because they're both varying degrees of bad; it's just that one gets worse
faster than the other.

~~~
tpxl
One gets bad at 10, the other at 10000. There's quite a difference.

------
julius
From other comments I understand Snowpack as:

Development: Creates many ESM-Files. Firefox/Chrome can load them.

Production: Bundles&Minimizes these ESM-Files.

One question: there is a JS error occurring only in IE11, "t._x is undefined".
How do I debug that?

~~~
chrismatheson
I would assume the flow to be the same for debugging post bundled Production
output from most tooling, use source maps and hope that the bundler produced
accurate ones :)

------
XCSme
Sounds interesting. It's a bit unclear to me what the "runs in 15ms" means. I
think in my projects, the TypeScript compilation is what takes the longest, so
although I use parcel and it's pretty fast, I still have to wait 1-2 seconds
for TypeScript to compile changes. If it does not bundle, and still uses all
the external transformers (TypeScript, Babel, etc.), what exactly does it do?
Does it somehow optimize the execution of those transformers/transpilers?

~~~
genuine_smiles
> I think in my projects, the TypeScript compilation is what takes the
> longest, so although I use parcel and it's pretty fast, I still have to wait
> 1-2 seconds for TypeScript to compile changes.

The build result doesn’t need to wait on the results of the type checking.
TypeScript or Babel transpiling can happen even if there is a type error.

> If it does not bundle, and still uses all the external transformers
> (TypeScript, Babel, etc.), what exactly does it do? Does it somehow optimize
> the execution of those transformers/transpilers?

It skips the bundling step, and does aggressive caching.
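
A toy sketch of that per-file caching idea (illustrative logic of my own, not
Snowpack's actual implementation):

```javascript
// Per-file build cache keyed by file contents: unchanged files are never
// re-transformed, so an edit to one file costs one transform, not n.
const cache = new Map();

function buildFile(path, contents, transform) {
  const cached = cache.get(path);
  if (cached && cached.input === contents) {
    return cached.output; // cache hit: skip the expensive transform
  }
  const output = transform(contents); // e.g. a TypeScript/Babel transpile
  cache.set(path, { input: contents, output });
  return output;
}

// Simulate two dev-server requests for the same unchanged file:
let calls = 0;
const fakeTransform = (src) => { calls++; return src.toUpperCase(); };

buildFile('app.ts', 'let x = 1', fakeTransform);
buildFile('app.ts', 'let x = 1', fakeTransform); // served from cache
console.log(calls); // 1
```

With a bundler, the edit would additionally trigger re-bundling of everything
that reaches the changed file; here the second request never re-runs the
transform at all.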

~~~
XCSme
> The build result doesn’t need to wait on the results of the type checking.
> TypeScript or Babel transpiling can happen even if there is a type error.

I do run TypeScript async with Parcel, but I still wait for it to finish
before I start working on a different task as I do want to know if I have any
TS errors before proceeding.

> It doesn’t optimize the transformers/transpilers; but it does only run them
> against the modules that have changed.

But isn't this how other bundlers work too? They cache results and only run
transformers on the changed files?

~~~
genuine_smiles
I tweaked my comment a little. I'm not sure exactly how webpack does its work,
but I think you're right.

I think the big optimization is skipping the bundling. If you want to wait on
type checking results, and that’s the slowest part, then I don’t see how this
could speed up your builds.

------
nojvek
Having the browser make one request per npm package sounds awful. It's great
if the client has fast internet and the server is close by, or mostly
localhost, but latency will play a far bigger role than the 50ms startup time.
That's not a good metric to look at.

The metric that corresponds to user experience is cold compile + page reload
time and incremental compile + page reload time, i.e. how long after I press
enter on a command before I see something usable in a browser to dev-loop on.

If you let the browser load the first file, parse and figure out the next file
to load, a large project could have hundreds of roundtrips. That's why JS
bundlers were created in the first place: to avoid the cost of a long critical
chain.

Using a device from Africa (Uganda) to connect to US servers, one feels how
bad an experience latency can make. More and more development is done on cloud
machines or remote hosts, so this isn't a rare use case.

What I do hope for is if there is a new bundler, it can use the webpack plugin
ecosystem. It’s massive and anything new has to foster a similar ecosystem of
tooling.

Or please just make webpack fast with incremental disk compiles. I would pay
money for that.

~~~
WorldMaker
HTTP/2 and HTTP/3 both go a long way toward mitigating the costs of multiple
requests versus single large bundled requests. It's still early days in HTTP/2
and HTTP/3 adoption, of course, but we're almost to the point where HTTP
itself takes care of many of the reasons bundling used to be needed.
(Especially as you get into more advanced features like Server Push.)

Also, several of the restrictions baked into the ESM module format are
specifically designed so that browsers don't need the full file to load, and
can use an optimized import parser that doesn't need to wait for the full JS
parser run to find the next modules to load. (I've seen benchmarks where
modern browsers have discovered/loaded the entire module graph before the HTML
parser has even finished building the DOM and signaled DOM Ready.)
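
One concrete lever here: pages can also hint the module graph up front with
`modulepreload`, so the browser doesn't have to discover deep dependencies one
import hop at a time (the paths below are illustrative):

```html
<!-- Preload hints let the browser fetch deep module dependencies in
     parallel instead of discovering them one import at a time. -->
<link rel="modulepreload" href="/web_modules/preact.js">
<link rel="modulepreload" href="/js/app.js">
<script type="module" src="/js/app.js"></script>
```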

That said, reading the site, Snowpack's focus on one ES module per npm package
is primarily just for the dev experience, where you are on localhost and
latency isn't an issue. It takes several approaches to further bundling for
production-intended builds, including directly supporting webpack as an option
(and thus webpack's plugin ecosystem).

------
matthewhartmans
Congrats on V2 and everyone involved!

------
dang
Related from 4 months ago:
[https://news.ycombinator.com/item?id=21989967](https://news.ycombinator.com/item?id=21989967)

------
MatekCopatek
Is anyone using this in combination with a plain ole' server rendered app? All
the examples seem to build on a SPA example where you have a single index.js
entrypoint for your entire app. What about a Rails/Django project where each
page loads a few scripts it needs?

That use case has been stuck with the "global jQuery plugins" approach for
ages and it feels like <script type="module"> + something like Snowpack would
really improve it.

~~~
Kovah
I actually tried this, and failed. Gave up after like three hours of trying to
wrap my own non-React scripts with that Snowpack stuff. It seems the tool is
not capable of handling those simple use cases, which is quite sad.

------
ecmascript
Is it just me, or is the build time pretty much never an issue? Usually when I
develop stuff builds/recompiles faster than I can switch to my browser to try
it out.

How is this such a big problem that people need to write yet another build
tool, instead of improving the one everyone already uses?

~~~
cactus2093
I promise you this is a very real problem; every company I've worked at with
even a moderately sized codebase has had to battle webpack at various points
and try to hack in various types of only semi-functional 3rd-party caching
tools and such to make development more manageable.

If you’re a solo dev working on mostly new codebases I imagine it’s not a
problem for you though.

------
dreen
I guess I'm different to most JS developers, because I prefer to work with HMR
off about 95% of the time. It's good for UI prototyping (which I don't do much,
tbf), but it tends to get in my way when doing anything else. Maybe in total
it makes me lose a minute or two, but that's not an issue.

~~~
Etheryte
This is interesting - what's the upside of working without HMR? There are
changes where HMR fails to figure things out and you have to hard reload, but
other than those it has served me very well. Interested in hearing the other
side of the story, if there is one.

~~~
dreen
Hard reload takes marginally longer, but has no potential to fail - it's peace
of mind. I have had cases where small changes to dev tooling, or maybe
something in state management, caused a failure during HMR reload. And if you
don't realise this quickly enough, you might waste a lot more time than HMR
saves you. As I said, it's still quite useful for working just on UI, so it's
not all bad, but ideally I prefer to work on UI components separately from the
app anyway, eg. using something like Storybook.

------
koolba
In my experience using webpack, once you’ve configured incremental builds, the
only slow part is TypeScript type checking. That’s solved by doing it async
and having the dev build be compile only. Even a huge project builds after a
single file change faster than you can notice.

~~~
john_miller
Could you share a link or keyword about async type checking and 'compile
only'?

~~~
koolba
See the sibling comment.

------
simonebrunozzi
Tell me what it is exactly, before starting with a list of features, 50ms
start, etc.

------
it
I had been avoiding bundling due to its effect on development, but this looks
well worth a shot.

I do wonder though if it would be enough to turn on CloudFlare's minification
for prod.

------
iddan
Once Create React App uses it by default, it will be fun.

------
pvg
Recently:
[https://news.ycombinator.com/item?id=21989967](https://news.ycombinator.com/item?id=21989967)

------
taneq
Thank you for putting, nice and prominently at the top, what Snowpack actually
is! (“Snowpack 2.0: A build system for the modern web.”)

------
KaoruAoiShiho
Is Svelte really now a tier 1 library compelling enough to put in
advertisements like this?

~~~
tylerchilds
I'd say it's S Tier, but the casuals aren't on board with the pro meta.

------
sktrdie
If you're a bit confused by what this is (as I was), here's a simple TLDR of a
conversation I had with them on Twitter [1]:

> Me: Would you say Snowpack is mainly about generating ESM files (and their
> common code) for each import statement? Curious how that is different from
> webpack's code splitting strategy perhaps together with an ESM plugin

> Snowpack: Snowpack's dependency installation is a form of bundling + code-
> spliting: your entire dependency tree is bundled together and then split
> into one-file-per top-level package.

In other words: they're a code-splitting strategy where they "don't touch your
code", they only look at it to find the dependencies and then they generate
files (ESM modules) from the dependencies information. Then they serve that
and let the (modern) browser do the rest.

A really simple idea, but effective.

1. [https://twitter.com/lmatteis/status/1262126825427415044](https://twitter.com/lmatteis/status/1262126825427415044)

------
m00dy
What if a file depends on another file? Then I think it is O(n).

~~~
fwip
If one file is changed, one file is reprocessed, no matter how many other
files depend on it.

~~~
m00dy
Your argument is not valid, because a file can expose many functions in a
JavaScript context. Other files depending on those functions need to be
rebuilt as well.

~~~
fwip
They do not.

------
PunksATawnyFill
Which is... ?

------
yesion
So Vite is already dead? Geez!

~~~
gavinray
Vite and Snowpack are a little different, though definitely similar in many
regards.

There's some good info here:
[https://github.com/vitejs/vite#how-is-this-different-from-snowpack](https://github.com/vitejs/vite#how-is-this-different-from-snowpack)

The salient points seem to be:

1. "Vite is more opinionated and supports more opt-in features by default -
for example, features listed above like TypeScript transpilation, CSS import,
CSS modules and PostCSS support all work out of the box without the need for
configuration."

2. "Both solutions can also bundle the app for production, but Vite uses
Rollup while Snowpack delegates it to Parcel/webpack. This isn't a significant
difference, but worth being aware of if you intend to customize the build."

~~~
yesion
Thanks, that's useful!

------
CyberDildonics
Would it have killed them to actually say what it is in the title of their
self promotion?

------
malandrew
I came here hoping this was related to figuring out avalanche conditions when
backcountry skiing.

~~~
gremlinsinc
Reddit is that way --------------------->

~~~
malandrew
Snow science is fascinating and absolutely par for the course for HN. Your
comment is far more HN worthy than mine.

