Parcel – A fast, zero configuration web application bundler (parceljs.org)
435 points by KeitIG 77 days ago | 163 comments

Came here to scoff at "yet another bundler".

Then read all of the docs in 15 minutes and realized that I might like this.

Then I used it and I know that I'll keep using it, unless there are any showstoppers.

Great job, Devon!

Browserify, Webpack, Gulp, Grunt: everything nice and shiny, but the one JS lib that stands out for me is Rollup.js[1]

Why? Because Rollup.js makes it possible to structure code into modules in a way that is backwards compatible and future proof. Yeah, it takes some getting used to, and it is not yet as simple as it could be, because integrating libs which are not yet available in the module format takes some manual configuration, but once you know how to use it you won't want to go back.

I can highly recommend it, not because it is the hottest tool out there, but because I think it has a much higher long-term value than most JS libs.

Parcel seems to have similar functionality, but due to the wide range of other functions it comes with, I think there is a much higher chance that it will be rendered obsolete in the near future. Rollup.js on the other hand _focuses_ on combining modules into a single bundle, which will continue to be required for the foreseeable future.

[1] https://rollupjs.org

Rollup is a good tool. You can get the same benefits in browserify using [rollupify](https://www.npmjs.com/package/rollupify) or using [tinyify](https://github.com/goto-bus-stop/tinyify) which can do tree shaking on commonjs as well as ES modules.
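For reference, both of those are used straight from the browserify CLI; a minimal invocation might look like this (package names as published on npm, file paths assumed):

```shell
# rollupify as a browserify transform (rollup-style module hoisting)
browserify -t rollupify src/index.js > bundle.js

# tinyify as a browserify plugin (tree shaking, minification, etc.)
browserify -p tinyify src/index.js > bundle.js
```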

Webpack actually does the same (it's a module bundler), browserify too (but in a more limited way). Gulp and Grunt are not module bundlers at all, but simply task runners - absolutely different things. Btw there is also the FuseBox module bundler, which also seems to focus on performance and configuration simplicity.

You are right, they are not exactly the same type of programs. Nevertheless, most of the time they are used for the very same thing: to transform your development source into the format best suited for delivery to the browser (bundling).

Actually, I still use Gulp to call Rollup.js, as it is more flexible and allows me to use features which are outside the scope of Rollup.js (e.g. automatic browser reload). My point is that the exact thing Rollup.js focuses on is very important for the long-term development of JS: modularization. Yeah, I know that's an upcoming part of the JS standard. But with Rollup.js you can use it today, as it has various adapters for legacy formats, and therefore you have a tool which allows you a smooth transition into the future ;-)

Sure, but just don't limit yourself to Rollup.js only. Again, Webpack is a module bundler (modularization is there), and it was a module bundler before Rollup.js went public. So you can use all the ES6/7/TypeScript stuff (including modularization) using Webpack, Rollup, Fuse-box, and other bundlers; the difference is in the configurability, performance, configuration syntax, etc. Though yes, Rollup provides some specifics which might be more suited for releasing standalone libraries, while Webpack would be a better choice for complex app development (that is, something not supposed to be released as a library, but deployed somewhere as a complete app).

Webpack is anything but future proof when you start importing css or any non js resource. If you do that you are effectively stuck with webpack and it becomes a framework rather than a tool.

You do the same in rollup, what do you even mean? Importing assets is the whole point of bundlers, and they all import css as they should.

Rollup is great. Smaller bundle sizes and no mystery meat. It takes a little more time to configure but it's worth it in some cases.

Rollup never worked for me; all kinds of strange errors crop up when I run the bundle it generates

As a person who is not well-versed in front-end technology, it would be great if the author(s) could provide more comparison on the website against some other tools that serve a similar purpose, in aspects other than performance. For instance, by comparing the scope of the problem each solves, the ideology behind each project, or even a side-by-side configuration file.
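Roughly the kind of side-by-side the parent asks for, sketched from memory (loader names are the conventional webpack ones; treat the details as illustrative, not as either project's official docs):

```javascript
// webpack: an explicit config file (webpack.config.js)
module.exports = {
  entry: './src/index.js',
  output: { filename: 'bundle.js' },
  module: {
    rules: [
      // each file type needs a loader chain declared by hand
      { test: /\.scss$/, use: ['style-loader', 'css-loader', 'sass-loader'] },
    ],
  },
};

// Parcel: no config file at all -- the equivalent build is just:
//   parcel build src/index.html
// and the handling of each file type is inferred from what it encounters.
```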

The eternal cycle of developer tool bullshit:

1. "Ugh, OldTool™ v3.0.0 is overcomplicated, bloated, slow and unnecessarily configurable!"

2. "Introducing SuperTool™ v1.0.0, which just does the stuff you really need. No bloated configuration. No huge ecosystem of extensions. Simple, straightforward and fast."

3. "SuperTool™ v1.0.0 is great! But our setup really needs it to do something a bit different, so we've hacked in this extension."

...repeat for a while...

10. "Introducing SuperTool™ v2.0.0, which now has a more flexible API and configuration, allowing many former hacks to be done in a straightforward way."

11. "SuperTool™ v2.0.0 is great! But our setup really needs it to do something a bit different, so we've hacked in this extension."

...repeat for a while...

20. "Introducing SuperTool™ v3.0.0, with a whole new, flexible, pipeline based API. Everything is a plugin! There's even a plugin to send email!"

21. "Ugh, SuperTool™ v3.0.0 is overcomplicated, bloated, slow and unnecessarily configurable!"

22. "Introducing HyperTool™ v1.0.0, which just does the stuff you really need. No bloated configuration. No huge ecosystem of extensions. Simple, straightforward and fast."

The cycle described here is perfectly acceptable in principle, not to mention unavoidable in practice. Our developer tools will stop changing when humans stop changing.

Not really. Fragmentation is only inevitable when there exist feature tradeoffs. If Parcel's advantage is parallelism and caching, it's possible those features could have been added to Webpack.

Having one tool that is configurable for all problems often becomes a mess and even harder to use.

But the solution to this is to make more tools with different tradeoffs. Not to add features to the tool that was supposed to be simple at the beginning.

You're correct, of course. I consider "simplicity" to be a feature, which is almost always at odds with other features.

You have never used TeX....

God forbid we improve existing tools. What exactly is your problem here? Do you think we should all be using the Borland C++ compiler too?

I think the problem is less that there are new tools, and more that each new tool comes promising to do very much what the old tool did, just with much less configuration and smart defaults. But as soon as you've used that new tool for a while, the once-smart defaults are not so smart anymore (as requirements have slightly changed), and you start configuring it more and more, and after a while you have the same mess of customization as with the old tool, until someone comes up with a newer tool that has newer smart defaults, and so on...

It's not necessarily bad, as most new tools are slightly better than their predecessors; it just takes some getting used to, and you have to learn how to structure your code base in a way that you do not have to completely refactor it every time a new build tool is adopted ;-)

Change != Improvement.

Does the release of parcel mean you can no longer use webpack, or gulp, or grunt or whatever you’re using now?

If you don’t like a new tool, don’t use it. I can’t stand this kind of meritless complaint. If you’ve got a problem with the implementation of a new tool, that’s understandable. But to take issue with its existence? That’s just petty and a waste of everyone’s time.

>Does the release of parcel mean you can no longer use webpack, or gulp, or grunt or whatever you’re using now?

If it takes away from their community and momentum, then yes.

Attention in FOSS is close to a zero-sum game.

Not to mention when the devs themselves get bored with tool X they've created and jump to creating some newer tool Y, dropping the whole community...

> Not to mention when the devs themselves get bored with tool X they've created and jump to creating some newer tool Y, dropping the whole community...

Given that you don’t pay for the product I don’t see anything wrong here. People work on what they want then change if they’re bored; you can’t do anything about that but pay them to continue supporting the tool you decided to use.

>Given that you don’t pay for the product I don’t see anything wrong here

Things might not be wrong and still be a problem. E.g. your neighbor practicing tuba (badly) during the whole day, in perfectly legal hours...

In fact someone can gift stuff with the best intentions, and still cause a problem.

> you can’t do anything about that

You can. You can encourage developers to contribute to a single project, rather than create fragmentation, which is exactly what's happening in this forum.

Webpack doesn't seem to have a shortage of contributions. What is your real concern?

1) A fragmentation of targets for complementary products. Any tool that must integrate with bundlers now has one more target.

2) A fragmentation of development efforts across competing projects. Webpack has many contributors, but they'd have 1 more if the author of Parcel built parallelism into Webpack instead of a new bundler.

Yes, you can. But you can’t force them if they don’t want it.

K. Who's forcing anyone?

Nobody’s forcing anyone but a lot of people including the parent comment complain when someone doesn’t want to work for free on the tool they use. Encourage people if you want, but don’t complain if they don’t listen.

By all means, do. People complain about things that make their lives worse, especially when it's avoidable, as it is in this case. There's nothing wrong with that.

You can encourage them to instead work on improvements to existing tools.

Oh, so if you don't pay for a tool you can't control the development of it? How unexpected!

First, nobody said it's "unexpected". We said that such a change is inconvenient for the user of the tool, which is trivially true -- whether they can/desire to control the development or not.

Second, even the premise of it being "unexpected" is not 100% true. Non-dev communities influence the development of projects all the time -- projects even have special voting boards for people to vote on what features they want next, etc.

Besides, it's very possible to not pay for a tool and still have a voice in the development of it. It just takes the development team to be willing to hear their userbase.

Last, but not least, direct payment is not the only/ultimate influence a user can exert on a project. Some FOSS projects are made by companies that want them to see wide adoption, so user complaints (of non-coding users) matter very much -- things like Mongo, for example.

> If it takes away from their community and momentum

In my experience, what is more common is that people who would otherwise not contribute to an existing project for whatever reasons (e.g. complexity, bad experience w/ maintainer, etc) start contributing to a new project because these younger projects are generally more approachable.

People don't really jump ship from something they've invested time into unless it's very clear that something else is taking over.

Agreed. It's more like Competition == Improvement

No, fragmentation can be destructive.

Can you give an example? In my experience that isn't really the case. For example, there are bazillions vdom implementations and they are only where they are now in terms of perf because of the intense competition.

This was also the case with jQuery/Sizzle and friends back in the day. There's js-joda/moment/date-fns, underscore/lodash/ramda/just, css modules/styled-component/styletron, etc etc etc.

Lack of competition tends to lead to stagnation, and since js has such a large (and growing) community, you're never really going to be at a situation where major libraries will run out of community support due to competition of equal caliber.


Read Coverity's account of building static analysis tools for C: https://cacm.acm.org/magazines/2010/2/69354-a-few-billion-li...

"The C language does not exist; neither does Java, C++, and C#. While a language may exist as an abstract idea, and even have a pile of paper (a standard) purporting to define it, a standard is not a compiler."

The immense fragmentation of C compilers makes it difficult to build static analyzers.

You're correct that the cost of fragmentation is seldom that there's a lack of resources around any single project. It's more that complementary products are much harder to build, because there's no standard for the product category.

Surely you can understand the downside of diffusing attention?

But also:

Change != !Improvement

So I'm not seeing the point here.

Change + Time == Improvement

I do miss Borland Turbo C++ v3.0...

In my imperfect memory, it was perfect.. (I couldn't go back though...)

turbo c 2.0 forever!

Welcome to developing in 2016-201X, where you spend 60% of your time configuring tooling, 35% developing tests and 5% developing your app.

Sounds good to me. If I could spend 60 hours on developing the initial tooling, and if I could spend just 5 hours writing minimal code to get things working as I want, and spend the rest of the 35 hours on testing and making everything watertight, I'd call it a job well done.

It sounds awful, both for you and for the employer/customer who pays you, with only 5% of your time actually spent implementing features.

Features that break other features (or introduce vulnerabilities) because they haven't been through a basic testing are anti-features.

Sure. I don't write any tests (luckily I don't work on a business-critical system), and 10% of my time is dedicated to fixing features that were broken accidentally. And now? 90% of my time is still dedicated to developing new features.

Don't bother feeding the trolls, r00fus. They're a lego developer: if their framework doesn't support something, they'll rebuild everything from scratch or mark it as wont-fix. It's thanks to devs like them that we can have nice things.

Or stick to something that works for you and keep using the same tool over and over again.

No one is forcing you to use the latest bundler / tool / programming language. Let others experiment. You’ll be happy to reap the benefits of their work when something mature comes out of it.

> No one is forcing you to use the latest bundler / tool / programming language.

Tell me that when you get turned down in a job interview because you don't know <JS framework of the year>.

Try working on your positioning so that you are not hitched to the <JS framework of the year>. You will forever be competing with the low end of the market if you cannot market yourself.

Your advice basically tells them to get out of JS development because businesses continually hitch themselves to <JS framework of the year>.

This stuff comes up in interviews. Trying to avoid answering direct questions about a framework and answering with "Well I can learn it really quickly because I'm a good vanilla JS Dev" doesn't fly all the time. If you can't answer direct questions about a framework, they will pass on you if they get someone who can answer those questions.

It has nothing to do with marketing yourself.

What you described is a hiring problem. Marketing is knowing who your buyer is and finding creative ways to get their attention.

Found the MBA/project-lead guy. The moment you start working on management is the moment you stop being a "developer".

I don't think that's the only interpretation. You can also work on moving into different parts of development. This problem exists everywhere, but is currently the most extreme in front-end web-app dev. Move to the back end, ops, etc.

Isn't that to your benefit though?

There are tons of companies that are going to do certain things you aren't going to be a fan of. Isn't it better to find out they're not that into mature platforms and prefer the latest trend during the interview and not on the job?

> Or stick to something that works for you and keep using the same tool over and over again.

(in JS ecosystem)

1. It's really hard to find something that works

2. It's really hard to figure out all the dependencies, and configs, and plugins, and conventions, and ..., and ... for something that works

3. Once you have it figured out, each and every moving part, including the tool itself, will get updated with deprecations and breaking changes.

I agree in a lot of environments, but web/js build tools seem to have a high frequency of fragmenting and becoming abandoned before any real long-term maturity is established.

It sure looks that way if you follow the news, but my webpack config has barely changed for two years now and I'm still getting work done.

My gulp configs have barely changed in 4 years. But we have to convince people to work together and stop jumping ship rather than ignoring the process. I don't think Node has this figured out yet unfortunately. Too much hubris.

> No one is forcing you to use the latest bundler / tool / programming language

Except everyone is using them and you quickly get outdated, especially if you are looking for a job with heavy experience in python, php and asp.net web development. The market has already shifted to "3 years' worth of experience with Parcel".


I used to love writing code and building stuff from scratch. Nowadays it's exactly what you said. I spend more time configuring stuff and finding tools that work well together than actually writing the underlying code that makes the app or the website work.

I don't agree at all. Building things from scratch has never been easier. The difference is, today you can build highly complex stuff "from scratch" as easily as simple things. Components are ready to use; javascript and node can steer robots and machines, drive mobile and desktop applications, etc. All of this is set up like nothing.

In the time it costs you to boot up notepad.exe and create an HTML wireframe, create-react-app/electron/etc. has already set up an application that is modular, knows its dependencies and resources, is debuggable, can hot reload, etc. The time you would spend to get this manually ... it wouldn't even be fair to compare.

> In the time it costs you to boot up notepad.exe and create an HTML wireframe, create-react-app/electron/etc. has already set up an application that is modular, knows its dependencies and resources, is debuggable, can hot reload, etc. The time you would spend to get this manually ... it wouldn't even be fair to compare.

Most of the time you don't need any of those things: you don't need modularization, you don't need a dependency manager, you don't need a debuggable version (because you are working on plain versions without minifiers or module bundlers that compress your code), you don't need hot-reload (because you aren't using a transpiler/compiler-to-js), you don't need the hot-and-untested framework; sometimes you don't even need source control.

The big difference between that and going from scratch is that you choose what you want based on your needs, not because everyone or a framework tells you "this is the right path". The current state of web development encourages "lego developers" to rise and shine as if they were experts.

If you don't need it, what forces you to use any of it? Has the browser stopped accepting .js and .html files? It hasn't. Will we ever go back to where that was all that was needed? I sure hope that will never happen, because there was nothing glorious or great about that time. It's a trick of memory when someone thinks back and goes "oh, how easy it was." It just wasn't. It used to be the most limited, restricted, crippled, tough and volatile environment possible. Nothing meant manual labour as much as the old web: chasing dependencies by hand, fixing issues across browsers, nothing reusable. It was plain horrible.

How about you don't do that, then? What is stopping you from developing things like you used to? Those older tools still work.

Go try to find a job that doesn't require any of the new tooling experience, or look at the other extreme: postings asking for 15 years' proven experience with Cobol.

35% developing tests? What great institution do you work for?

They probably code using Barliman[1] ;-) </joke>

[1] https://github.com/webyrd/Barliman

90% developer tests here

Depends if they're talking about adding tests or battling tests/test frameworks to just get something out.

Oh man, this is my life. Spurious integration tests that are critical, but very complex and often times have timeouts in our test runner depending on how many builds are going in there.

I might take 1 day on the feature, and then 3 more days just getting it past the testing gauntlet

Slack /s

Cute that you think this started in 2016...

To be fair, i originally wanted to put 2010 but there are so many butthurt nodejs fans that will get easily offended.

Take it down another decade. Or two. There are versions of Old/Super/Hyper in libraries and frameworks for Java, Perl, Python... Once you get the Internet, the hype cycle keeps speeding up

Hey dude, it looks like you've been shadowbanned for years and I don't immediately see why. Just vouched for this one. Thanks for your comments hopefully it all works out this holiday.

Have you ever tried a tool like https://github.com/facebookincubator/create-react-app ? It makes your comment totally irrelevant.

Unfortunately the whole world isn't developing a react app. And the fact that create-react-app even exists is definitely proving the point!

It's not about React, it's about tools that make our lives easier. If you want you can stick to ES5, no frameworks, no tools, but good luck scaling your app to work with big teams.

Yea, as a user of CRNA (create react native app) with a moderately simple app, dependency hell is something we're already in. We don't see a choice, for reasons, but CRNA is definitely not somehow above these problems.

Node/JS in general has a pretty terrible track record for me with version hell.

Strong technical leadership and defining clear code boundaries is the secret sauce to scaling successfully. The shiniest tooling will not save your team.

IMHO, unless your company is also working on a tool like Bazel or Buck, you don't really have a team scalability problem.

True, but Old, Super, and Hyper together represent a deep and productively adversarial exploration of the problem space -- each trying to best its predecessor. The overall problem space benefits, even as each package individually ages into senescent complexity.

I use create-react-app and make sure I don't need to eject so (I guess) I use webpack under the hood, which I never had to learn. If CRA switches to parceljs and makes the build twice as fast, I'm all for it!

I normally share the same sentiment unless the tooling truly is much better than its predecessors. This looks pretty easy to use and fast. If it delivers on those two promises, it's worthy of further consideration by the community.

I share your frustration.

However, this tool actually does useful work, based on a quick glance. So maybe don't discount it just yet.

But again, I share your frustration.

This sounds good to me. Encapsulate and simplify existing tooling based on the community's experience with it, then extend and add complexity as needed. As the development environment changes, we will constantly have to update our tools to keep things simple for the basic 80% of cases.

AKA Evolution

I think it's super cool.

I've been configuring Webpack projects for almost 2 years now and I'd rather use Parcel tbh. My only problem is the lack of plugins such as loading .vue files.

Why would you use Parcel if it's not yet ready to provide all the needed functionality? There is also the fuse-box bundler; I didn't try it on complex projects though (and I'm not going to in the near future), but in some respects it might be a rival to Parcel.

+1 for Vue support

And the over complexity begins...

It would be nice though

With transitive dependencies, the total install footprint for this is 689 packages weighing in at 77MB.

This isn't reducing complexity, it's just taking all the junk you'd normally have in your bloated boilerplate and putting it directly into the build tool. I'm not convinced that's a good idea.

Mmmm, so... sub 100MB for a bundler + compiler + etc?

  - Xcode, for iOS/Mac development, is usually around 5GB (and no, even if you remove every single simulator, it's still over 100-200MB).

  - Android Studio is roughly the same deal as Xcode.

  - I've not used Windows so I can't say for sure, but I'd be surprised if the various tech there doesn't follow this pattern.

77MB is nothing, and if you care about this, you're caring about the wrong things.

689 packages is enormous surface area for a leftpad-like (or worse) failure. Given that nobody is auditing those packages and many of the people who maintain them aren't adhering to best security practices[1], I don't think taking nearly 700 dependencies that you are going to treat like a black box is a safe or responsible choice.

Xcode and the Android SDK each come from a single vendor who provides an integrated development stack and presumably vets the contents. Does anybody do that with these npm packages?

It seems like every time I run the du command in my node_modules folder, I find some crap that shouldn't be there. Earlier this year, I submitted a PR that reduced the size of the babel react preset install footprint by about a third, just by removing a totally unused dependency that nobody noticed. I don't think the contents of npm packages get nearly enough scrutiny and it scares the hell out of me.

[1] https://www.bleepingcomputer.com/news/security/52-percent-of...
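For what it's worth, the kind of check described above can be as simple as this one-liner (run from a project root; the path is assumed):

```shell
# List the heaviest entries in node_modules, largest first
du -sh node_modules/* | sort -rh | head -20
```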

How are you using du to find things that shouldn't be in node_modules? Wouldn't you need to build up a dependency tree, based on your app's package.json and recursing into every module's peer dependencies?

Would you mind linking to your PR? Would love to take a look.

It literally just removes a line from the package.json file of one of the sub-dependencies: https://github.com/babel/babel/pull/5484

It's crazy to me that a package that is downloaded from npm over 4 million times a month gets so little scrutiny that nobody but me noticed a 5MB dependency on all of lodash that wasn't even being used.

Build tool. Not app code. How is security relevant here?

And you don't care if a malicious party compromises the development machine on which it runs? I can think of a whole lot of really damaging things that somebody could do running arbitrary JavaScript code with user-level privileges on thousands of developer workstations.

With various CI setups and some server-side rendering configurations, there are potentially scenarios where build tooling actually does run in environments where there are higher risks, though it's not as applicable in this specific case.

Not really about security, but I've seen some deps that even track their usage (aggressively). So I totally agree about the "not having the same scrutiny" compared with other tooling methods.


One of the things I like about OSS and things like Stack Overflow is that in the old days, if I solved a hard problem, I solved it for a couple dozen people I worked with and maybe some folks at my next job. In some cases it made it hard to justify the work because I saved everybody 10 minutes a month and spent a week working on it. Which may be bad math for another reason: if I saved you the most aggravating ten minutes of your month, it may be worth many multiples of that ten minutes to you.

If I put the answer on the internet, then thousands of people benefit. If my work only saves you five minutes one time, it still may be worth the bother in the grand scheme of things.

Which is a long way around to my point: this group is curating a bunch of build related modules which means I don't have to pay as much attention. That's not automatically a win (especially if they do a poor job of keeping up, which happens a lot in this space), but it has potential at least to be pretty useful.

See it as a silly but efficient cache.

Well, npm may implement some kind of content-addressable storage to transparently deduplicate the dependencies tree on disk. It's not quite easy to do well.

As long as your bundle size is small, and the build times are fast, it's a clear improvement, at the expense of spending like 0.15% of even the cheapest 60GB SSD.

Sure, it's shifting complexity around, but it's hard to argue that the API for this is more complex than the bloated boilerplate you're talking about. For some people the simplified API is worth the 77 extra MB.

I don't know about you, but that's exactly where and how I want tedious, error-prone work handled: out of my way, and by the computer.

If you have some simpler, better alternative to html, css, and JavaScript for websites, more power to you. Over here in the real world, though, we've got stuff to do.

What kind of argument is this? This won't be part of your bundle — it won't increase the size of your prod build but can save your precious time.

The two coolest things about Parcel for me are 1) multi-core support, and 2) a simpler, ES6-esque API. Without these two distinct elements, I think Parcel is nothing more than a simple Webpack boilerplate which could easily have been generated by yeoman.

I see its potential regarding Parcel's competitiveness against Webpack, but it also worsens the JavaScript fatigue, which is a nightmare for many newcomer webdevs because of the huge fragmentation and thus bigotry in the industry.

Is my career as a Webpack whisperer at an end?

cries tears of joy


Go, you're free now.

It would be nice if these bundlers could help publish npm packages that vend non-JS assets (html, stylesheets, fonts, images) in a way that's easily consumed by other packages.

For example, if one of the JS modules in a library package says "import classes from './styles.scss'" and an application package imports that JS module, the application should somehow end up with the CSS output from styles.scss without having to configure Sass itself, and without having to include the entire stylesheet from the library package.
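Concretely, the wished-for flow might look like this (package and file names here are hypothetical, just to make the shape of the problem visible):

```javascript
// Inside the library package (ui-lib/button.js):
import classes from './styles.scss';   // scoped class names from Sass

export function button(label) {
  return `<button class="${classes.btn}">${label}</button>`;
}

// In the consuming app, ideally just:
//   import { button } from 'ui-lib';
// with the bundler compiling styles.scss from inside node_modules and
// emitting only the CSS the app actually imports -- which is the missing
// piece the comment describes.
```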

Actually, you can do that with rollup.js and riot.js. Components written with riot.js contain all their HTML, CSS and JS (preprocessors can be used too). Rollup.js can import those components like normal js modules, and at runtime riot.js injects the HTML and CSS.

The only downside is that you either deliver one large bundle with mixed HTML/CSS/JS content to the browser, or have to use some complicated configuration to split at least the CSS into a separate file.

It's already a common approach to pack CSS and some other stuff into a single JS bundle, so it's feasible and many devs already do that. It should be possible with any modern bundler, including Webpack of course.

I've been using http://brunch.io and enjoying it. Far less configuration than Webpack.

I love brunch, but I can't find a community for it. What do you do when you need support on a plugin or help understanding something?

I have a few github issues open that haven't been touched in months!


100% agree that support is a major problem. I had to figure it all out for myself, through trial and error.

> I've been using http://brunch.io and enjoying it. Far less configuration than Webpack.

I used brunch with a rails app (the Breakfast gem). Not as fun to use as it seems, since doing anything out of the norm is almost impossible to find documentation on.

Honestly, Webpack was easier to use, simply because it is easier to find information on the web about it.

Definitely an overlooked gem. Phoenix (the Elixir web framework) is using it, though.

Brunch is so 2011 bro /s

I had my eye on http://fuse-box.org/ for a while and tried a few sample projects with it. It makes a lot of the same promises that Parcel appears to make. Multicore builds, which FuseBox doesn't have, are exciting for me. So... many... choices...

It would be great if the author took any of the HNPWA projects and replaced Webpack with Parcel, so you could compare the setup and the actual configuration needed. Webpack is currently the standard and the community is great, so I'm not seeing this replacing it in any way.

I have no problem with ditching webpack for another tool if it's faster and of similar or better quality. Is anyone using this in production and want to share their experiences?

Should main.css be included in the index.html example?

This just confused me for some reason but it's more me focusing on the wrong thing, haha

It's injected into the main.js file which is added via index.js to the page

From the webpack team to Devon:

Thank you so much for this work! Not only are we happy to see a real contender in the bundling space, but we also look forward to working together and learning from Parcel about ways we can make webpack more flexible and easy to use for those who want "zero configuration".

Welcome to the bundler family!

Sean, webpack team

Really happy that besides being fast they made a priority to keep the configuration simple.

I love webpack to death, but there are just so many parts that keep confusing me, esp. with things like "!" in require, and always copy-pasting the same boilerplate code over and over for Sass, ES6, etc. I know it may be more flexible, but sometimes you just want things to work and can't be bothered with too many details. Laravel did this with Elixir, where it wrote a wrapper just to do it (which lets you just get on with your work without having to learn another tool). This too seems to be focused on that. Great job!

Didn't see any info that would give me an impression of the likelihood of this getting any traction, which I think would be even more important than any specific benefits.

This looks awesome!

After reading the docs though I couldn't find any mention of tree-shaking or other things that would shrink the final bundle size. Any word on that front?

This is what I was wondering as well.

Especially since Webpack's tree-shaking doesn't _really_ work for a lot of third party libraries:


I imagine Rollup might do a bit better on that front, since it's the bundler that popularized tree-shaking in the first place. Can anyone with Rollup experience comment on this?

Looks like a well-thought-out approach to a clientside JS bundler.

Is this officially backed by one of the author's employers or any other organization? I've seen too many projects (including major parts of bundlers like browserify and gulp) die early when a sole author is forced to move on. Beyond the technical advantages, if I knew a company had already invested resources, it would give me a solid business reason to switch.

I am not the target audience for this.

However, my first reaction was "yet another build tool, huh?"

My next thought was "...I must be getting old."

Wait for another 6 months, and we may have "yet another build tool" again

I'm the target audience and thought the exact same thing.

You might be getting old, but you are still of sound mind :)

Is anybody running a project with webpack that has a 20s rebuild time?

Or if somebody has switched from webpack to parcel, can they give an idea of the performance improvement?

We've got a moderate sized app, and webpack rebuild (typescript, server-side react, client side js bundle, css export) is less than 3 seconds.

We have a React app with roughly 100k LOC and a lot of SASS that takes about 4 minutes now to create the bundle on my MacBook Pro 2017.

That's probably the initial build, though, right? If you've already got dev-server running and it has to rebuild, it shouldn't have to rebuild the entire bundle again.

Also every time you add/remove a file or make similar breaking changes.

Yes, that is just the time for the initial (or production) build. The dev server is not as fast as it was initially, but it is fast enough.

Is that less than 3 seconds a production bundle (npm run build)?

> Bundle all your assets

Can someone ELI5 what is bundling and what problem does it solve?

Sure. Programmers usually group logically-related functions/classes/components into smaller files.

The logic for web "applications" these days can be complicated and involve loading hundreds of such files. If loaded individually, this requires a lot of HTTP requests and results in a slow page load time.

Bundling is similar to static linking in that it produces a single file out of all the dependencies. This can be loaded with a single HTTP request, cached, zipped, whatever.

How does this handle existing third-party jQuery plugins? The single biggest problem and configuration nightmare I always face is bundling something like CKEditor.

So far from comments I have collected following bundling tools:

- Webpack

- Rollup

- Fuse-Box

- Brunch

- Browserify

Which bundler does this use?

It is a bundler...

Explain to me: why do we need bundlers for web apps?

I long for the day I can use browser-native import to retrieve all my modules and deps, but at this point it doesn't resolve node_modules, and that makes it unusable to me — and is the reason I am using a bundler.

2 reasons:

1. Browsers don't support JS modules.

2. It's more efficient for browsers to request a single file than a bunch of smaller modules. So even if you use something like browserify, which dynamically loads modules in the browser, you end up with much longer load times because of all the overhead of each HTTP request.

why are we not just using bash scripts ???

- you can log output
- perfectly capable of handling your unique build style
- no setup (pretty much)
- just call the command lines of other tools (lessc, yui-compressor.jar, etc.) for compilation tasks

Talk about over-engineering. The thing is, the reason we don't have one to rule them all now is that no one has ever objectively argued why one is better than another.

If we could just get command-line equivalents for things like ember-cli that didn't require npm, then we would be good. Npm is the thing holding us back; we don't need it to do command-line processing. It's a bloated middleman. Please, command-line tool gods, offer your tools outside of the JavaScript ecosystem. Build them in Java and export a jar, or in C, and allow us to worry about the stitching.

Question: what is the thing npm/grunt/gulp gives us that cannot be done with bash, if all the tools we used were command-line executables?

I think you may be confusing npm with something else? npm is a package manager, and it does run as a command line tool. Similar to how apt-get is the command-line frontend for a package ecosystem. npm also includes npx, which allows you to directly run any command line utility in the npm ecosystem without installing the package first.

Try running `npx ember-cli` directly in your terminal.

All you say is correct, but I'm not confused. Node and npm are required to run many command-line tools (grunt, gulp, etc.). I think this is unnecessary overhead: if we could get these tools built as standalone command-line tools, then we could just use bash. I'm asking for a particular feature that comes from this environment that I don't have in bash. Why do I need this environment?

I have a sense that webpack is the same; it's a bloated environment. I don't need an all-in-one solution for a build process. Just give me the tools; I'll use bash and call it a day. So I'm just asking: is there something I'm missing? What is the feature from these environments that I can't directly perform by calling a script from bash? Why can't we just pull that script out into a command-line tool, and then devs can use it to their liking: pipe its output to a folder, cat that file to something else, call another tool to minify the results, move all the contents to the dist folder.

Node just doesn't seem to give me any more power as a dev (build-wise), so why is it so important to the space?

Hats off to frontend engineers; the space has evolved into something respectable, but I feel like over-engineering is a pursuit of the community. It's like they keep making new tools to encapsulate more features and then post on HN and say, "Look at this cool new way to do things that fixes that problem you really didn't have."

No one's webpack was bottlenecking their work; I'm guessing you can choose what to build and when, so you don't have to do a full build every time. No one is going to choose a build system because of its blazing speed. I would rather not have people work on these problems that don't really need to be solved.

Node is a runtime, npm is a package manager. Together they let you download code (js or otherwise) and run it.

This is the same as Java (JVM), .NET (CLR), and plenty of other languages. You can compile to a native assembly if you like (similar to C++ or Go) but it's not necessary when you have fast runtimes and JIT compilation.

Everything you listed for functionality is possible today - while also letting people write and ship code to multiple platforms quickly.

> npm is required to run many command line tools (grunt, gulp, etc.)

You seem to be looking for the ability to run JS tooling (builders, transpilers, minifiers) as common command-line utilities. That is entirely possible, and I'm not sure why you're making the distinction. I can pipe input between all of them, include them in bash scripts, makefiles, whatever else you're used to.

There's no meaningful distinction between a "standalone command line tool" and a command line tool that is written in node. If I added webpack (for example) to an ubuntu package repository, you could do `apt-get install webpack`, and have a "standalone" command line utility without ever downloading it from npm. It would have to pull other ubuntu packages as dependencies, but that's standard practice for installing something these days.

Unless you're wondering about packaging it into linux. In which case I'm just as confused as you are.

> Why cant we just pull that script out into a command line tool and then devs can use it to their liking, pipe its output to a folder, cat that file to soemthing else, call another tool to minify the results, move all the contents to the dist folder.

I guess because everyone needs to do those steps, so why not have a single command that does it for you. Sure, you could write your own and have everyone maintain thousands of different random shell scripts all eventually doing the same thing, but why?

You can totally do all of this. There are projects that just pipe the output of Babel into uglifyjs into a file and call it a day. You still “need” to use npm to install those packages though. npm puts command line utilities in node_modules/.bin to not pollute your machine. npm just stores packages as tar.gz files though, so if you wanted to you could wget them yourself.
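A minimal sketch of that pipe-and-go approach, as a plain bash build script. This assumes `babel-cli` and `uglify-js` have been installed locally (which is why npm is still involved); the source and output paths are illustrative, not from any particular project:

```shell
#!/usr/bin/env bash
set -euo pipefail

# npm drops each package's CLI into node_modules/.bin,
# so putting that on PATH makes them behave like ordinary tools.
PATH="./node_modules/.bin:$PATH"

mkdir -p dist

# Transpile with Babel to stdout, pipe straight into UglifyJS,
# write the minified result to the dist folder.
babel src/index.js | uglifyjs --compress --mangle -o dist/bundle.min.js
```

This is exactly the "just call command-line tools from bash" style the parent asks for — the only thing npm contributes here is downloading the tools in the first place.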

React and Node bundling and compilation?

When one deploys something like this in production, how do you interact with an internal API? Surrendering control of my url routing always causes problems for me.
