
Huge no. of files for Angular 2 - ishener
http://stackoverflow.com/questions/38718690/huge-no-of-files-for-angular-2
======
paradite
Interesting fact that I recently came across:

bower and many other npm packages have dependencies that eventually
depend on a package called "wordwrap".

And this "wordwrap" package somehow has its test folder exposed in npm.

The result:

Every single person using bower would have one or more copies of _In
Praise of Idleness_ by Bertrand Russell on their local machine, depending on
how many of their projects have an npm dependency on this package:

[https://github.com/substack/node-wordwrap/blob/master/test/i...](https://github.com/substack/node-wordwrap/blob/master/test/idleness.txt)

Don't believe me? Try searching for "In Praise of Idleness" in Spotlight.

Edit: Someone had already sent a PR about this on GitHub:
[https://github.com/substack/node-wordwrap/pull/14](https://github.com/substack/node-wordwrap/pull/14)

~~~
zuck9
I used to `touch node_modules/.metadata_never_index` to prevent Spotlight from
wasting disk cycles by indexing that stupid folder. After searching "In Praise
of Idleness", it doesn't seem to prevent it. :/

Does anyone know how to prevent all node_modules folders from getting indexed
by Spotlight?

~~~
sotojuan
Just exclude your whole dev/code folder from Spotlight. It's not a good way to
get to it anyway.

~~~
douche
Is there an indexer in any operating system that's worth the aggravation?
Windows Explorer's is hot garbage, and I've never been really impressed with
any of the ones in my various Linux distributions.

Old-school grepping or using Notepad++'s find in files feature is far and away
the best method I've come across, which is... kinda sad.

~~~
austinsharp
The indexing itself is great - I use Everything [1] heavily.

The client on the other hand? It's baaad.

[1] [http://www.voidtools.com/](http://www.voidtools.com/)

------
justinsaccount
This person is counting the node_modules directory. While JS is a bit insane
and this directory will have a ridiculous number of files, their concern is:

"because my deployment (google app engine) allows only 10K files"

meaning, they don't realize that node_modules is for development and not
related to the application they would actually deploy.

~~~
sotojuan
Hope this comment stays at the top before all the "wow JS sucks!!!" people
arrive :-) Though to be fair a "modern" JS dev environment does use a ton of
stuff!

IIRC Angular 2 production builds are actually pretty efficient.

~~~
Svenskunganka
IIRC the Angular2 production builds are the largest among all the frameworks.
If you add the router it sits just shy of a meg - and that's just the
dependencies. Not sure if they've started working on bringing the size down
yet, but there are 1KB alternatives[1] for those who care about their users.

[1] - [http://monkberry.js.org/](http://monkberry.js.org/)

~~~
vbernat
This website froze my Firefox running on an i7 CPU due to the animated
background.

------
agconti
Evaluating a framework by the number of files its dependencies are broken into
is a pretty poor measure of quality.

~~~
huuu
Why? If framework X does roughly the same as framework Y, which is 50% of its
size, then you can estimate that the code of Y is more efficient.

~~~
fuzzy2
Instead of "50% in size" I'll assume "50% fewer dependencies", because I think
that's the point here.

I believe that creating a module without relying on other modules will likely
lead to reinventing the wheel. Well, lots of wheels.

However, that might still be fine. But what about that one corner case you
missed? It might already be solved in a third-party module that focuses on one
thing only.

It's really not that bad to try and use specialized modules as much as you
can. You can benefit from other people's cleverness and focus on more relevant
work.

Yes, there will probably be a lot of on-disk overhead. But is that really
relevant today?

~~~
Klathmon
This is the major part of the whole "left-pad" fiasco I don't get.

If there is a well written, well tested, and widely used micro-library out
there that does one thing and does it very well, why not use it?

Even if you think you can re-implement it in 5 minutes, will yours be as fast?
Will yours be as well tested? Will yours have an interface that many other
developers already know and use?

Sometimes reinventing the wheel is needed, but most of the time using a well
working wheel that someone else made is the best choice.
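
To make the "5 minutes" point concrete, a naive reimplementation might look
like the sketch below (a hypothetical `leftPad` helper, not the actual
left-pad package's code; the real package deals with details a quick version
can get subtly wrong):

```javascript
// Naive left-pad: prepend a pad character until the string reaches `len`.
// Coerces the input to a string; pad character defaults to a space.
function leftPad(str, len, ch) {
  str = String(str);
  ch = ch || ' ';
  while (str.length < len) {
    str = ch + str;
  }
  return str;
}
```

Even something this small has an interface (argument order, coercion rules,
default pad char) that every hand-rolled copy would reinvent slightly
differently.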

~~~
Xylakant
> If there is a well written, well tested, and widely used micro-library out
> there that does one thing and does it very well, why not use it?

Because every dependency comes with a cost. First of all, it needs to be
available and the author might decide to pull it - maybe not from npm, but
from github. Second is a matter of trust: someone just needs to take over the
left-pad author's npm account and all of a sudden they can inject arbitrary
code into all projects using the dependency. I'd bet that 90% of folks don't
even bother to check the left-pad code. So basically you need to trust each
and every author of your dependencies to be benevolent and competent, that is:
they don't drop the ball, get hacked, lose access, ... And that task gets
harder and harder the more dependencies you have to vet. In a lot of instances
just inlining the code would be better. A larger stdlib that can be
selectively included would be better. It's a tough problem and npm just sits
on an extreme end of the scale.

~~~
Klathmon
I still maintain that those problems can be solved with better tooling and
package management rather than "bundling" dependencies.

Bundling to me is such a sledgehammer solution. Yeah, it can somewhat prevent
many of those issues, but it also comes at a pretty large cost.

* it leads to code duplication

* it can ruin the performance of tree-shaking and minification systems

* it prevents you from swapping out a small module with another globally

* it makes it harder to inspect and debug the code that you have installed in the node_modules directory

* it makes it harder to verify that the code on your machine is the same as the source code

* the bundler can introduce bugs into the source code

* The package now needs to maintain a build step and needs to maintain a separate source and "binary"

And more. Plus, in the end you might not even be helping anything. A big repo
like lodash can have just as many contributors as tons of little dependencies,
and big repos aren't immune to the org running it going belly up.

I guess I see those problems as more of a "large amount of code" problem
instead of a "large amount of dependencies" problem.

~~~
Xylakant
I wasn't talking about bundling but rather about something like C's glibc or
Rust's stdlib. Having a solid stdlib that covers for example string padding can
at the same time minimize code duplication and number of dependencies.

Neither did I deny that inlining everything comes at a cost as well, so the
goal is to find a good point on the scale. I was just pointing out that having
tons of small dependencies is not free of cost.

~~~
Klathmon
Fair enough, but with a language like Javascript a large standard library will
never be a reality.

There are too many implementations (by design) and the language is such a
"mutt" of designs that it will never happen.

I personally don't think that's a bad thing, but it is different to how many
languages work.

------
royka118
I did a bit of investigation into angular2 dependencies. It has a lot more
than angular1 and react.

[http://royka.github.io/2016/05/03/front-end-deps.html](http://royka.github.io/2016/05/03/front-end-deps.html)

~~~
k__
It seems to me that angular2 has a second-system problem.

~~~
royka118
I think it's generally a problem with npm, tbh. So many hidden deps.

------
keeganjw
I started working with Laravel not long ago and found my project folder had
24,000+ files in it. And those don't compile down before you deploy... it
makes me feel like I'm working on the tip of an unstable iceberg. Who the hell
knows what's going on down there. No one person could possibly hope to know
what it all actually does.

------
ishener
32,000 files for a hello world... jeez, i'm going back to java.....

~~~
j_jochem
Not a lot better:

    
    
      $ unzip -l /usr/java/jdk1.8.0_77/jre/lib/rt.jar | wc -l
      20138
    

It's just less of a burden on the host filesystem because those files are
usually loaded straight from the jar (i.e. zip file).

~~~
ivan_gammel
JRE is a platform, so if you're counting platform files, why not also count
NodeJS or browser sources too?

I'd rather look at a comparable thing, like a JEE server + Spring + some
server-side renderer like Thymeleaf.

~~~
marpstar
I'd argue that Angular2 is a platform as well.

------
moderndeveloper
babel 6 with jsx transformer used to install a comparable number of files due
to module duplication. At one point it was a 100M install with some modules
being duplicated 45 times. Much of this was the fault of npm 2. But with
latest babel and npm 3 it's now a 35M install with 5700 files over 814
directories. I guess that's considered lean by modern standards.

~~~
maaaats
Just because of the way npm2 ordered the dependencies, the runtime of babel
got incredibly slow [1]. Npm3 fixed that drastically, but I wonder how much is
still wasted just because of navigating the file tree to the dependencies.

[1]:
[https://github.com/babel/babelify/issues/206](https://github.com/babel/babelify/issues/206)

~~~
STRML
The opposite is true - directory traversals themselves are effectively free,
this is not going to be something that slows down your app - but loading the
JS & creating the IR will be much slower. The 4 second startup time with npm2
& babel6 is almost certainly due to the duplicated dependencies, which means
literally hundreds of megs of JS have to be parsed & warmed up. With npm3, the
same files are (correctly) reused which significantly speeds up start time.

------
currywurst
Is there a "distribution" bundle convention for npm? Analogous to static
linking, it would be one .js file that binds all dependencies into a bundle
(e.g. gulp.dist.js). In that case you would end up with a much smaller number
of dependency files to manage.

~~~
Klathmon
There is for your final output (meaning the stuff you would upload to the
server and serve to the user), but not for development.

IMO it's a pretty big anti-pattern to do that. It just hides the problem of
managing dependencies (see, it's not 10,000 files, it's just one!), but
doesn't fix any of the issues associated with it.

Keeping each dependency small, and having tons of them means that
deduplication can work better, tree shaking works better, and it lets you do
things like swapping out one package for another with the same API.

~~~
currywurst
In this case, most of the files pulled down are the development dependencies
which are not going to be exposed to the production code.

I would be perfectly fine if npm pulled down "distribution" versions for tools
like gulp, typescript, et al.

~~~
Klathmon
It might just be me living in a bubble, but I'd much rather have the full
version downloaded to my machine in its "raw" form than a "compiled" version.

Even just for the ability to dive into the source I'm using when debugging
something, or to look at the actual code I'm running if I want to understand
how a tool works.

This is one of the reasons why I like how lodash handles their library. You
can install the "regular" version of lodash and require it like "normal", or
you can install a single big compiled lodash file, or you can install one that
exports as ES6 modules, or you can install a single function at a time...

Obviously every package can't afford to spend that much time on packaging, but
a framework similar to that, along with some changes to NPM to allow tagging a
package as an "alias" of another (so lodash-as-one-big-file will be treated as
lodash by other packages), would go a long way toward making everyone happy.

------
jbverschoor
Why isn't npm managing packages like ruby gems?

SHARED_DIR/npm_modules/NAME/VERSION

~~~
Klathmon
Because package-lookup and the package manager are two completely separate
systems in javascript.

When you `require` or `import` a file in node.js, it looks for a node_modules
directory and looks for that name in there. If it can't find it there, it
starts walking up the directory tree until it finds something it can use (to a
point).

This is hardcoded and will be extremely difficult to change without a crazy
amount of breaking.

The package manager is free to install however, but it needs to put things
where the package-lookup can find them.

~~~
sah2ed
But it is still possible to have the best of both worlds.

Essentially, all they need to do is:

1. leave the current behavior for backwards compatibility; then

2. provide a flag like _npm -G_ that exposes the correct behavior, as
suggested in the grandparent, of using the same path like
_SHARED_DIR/node_modules/NAME/VERSION_ for package imports and package
management.

With time, newer npm versions will default to the correct behavior. For folks
that need backwards compatibility, this would require explicitly setting an
_npm --compat_ flag or similar.

~~~
Klathmon
The problem isn't in the "package manager", it's in node.js.

node loads modules in a given pattern. Changing that pattern would be global
to your project, and would cause issues with tons of 3rd party tools.

The best possible scenario would be to introduce a "new_node_modules" type
directory and change to the new system, then look in "new_node_modules" first,
then the legacy "node_modules" next. But that's a ton of work, a ton of 3rd
party tool breakage, and a lot of possibility for new bugs and breakage for
not all that much benefit.

That's not to say it shouldn't be done at some point, just that there are much
bigger areas that need to be addressed sooner in the node ecosystem.

~~~
moderndeveloper
node's module resolution will likely never be fixed. Too many modules depend
on its undocumented implementation details, and there isn't the will to
improve it. A major source of problems is node's symlink resolution scheme
that depends on fully resolved paths, counter to how other UNIX programs use
files. Because many module developers know how the resolution scheme works
they often hard code behaviors and paths into their code that would basically
prevent any alternative module resolution scheme from working.

------
inaccessible
Reminds me of this talk that I just watched:
[https://www.youtube.com/watch?v=k56wra39lwA](https://www.youtube.com/watch?v=k56wra39lwA)

------
fernandopj
At first glance, I thought the title was "Huge NO: on files for Angular 2" -
a vulnerability report on filesystem capabilities of Angular 2 and why it
should be abandoned.

------
douche
Has npm finally figured out how to de-dupe dependencies? In one project, I
have something like 47 copies of the same version of the same library,
distributed at all levels of the node_modules hierarchy.

I try not to think about that JS tooling too hard, lest I start pulling my
hair out and devolve into a screaming crazy person.

~~~
Touche
It's actually not possible due to the way Node looks up dependencies. You
could have a situation like:

    
    
      node_modules/
        a/
          node_modules/
            c/ (1.0.0)
        b/
          node_modules/
            c/ (1.0.0)
        c/ (2.0.0)
    

Both a and b depend on c version 1.0.0, but since there's a version 2.0.0 in
the root node_modules folder c can't be placed there, and has to be duplicated
in a and b's own node_modules folder, otherwise Node couldn't find it for each
of them.

~~~
RossM
I've built a fairly hacky solution to this before (for a different package
manager) - it can be pretty simple:

    
    
        node_modules/
            versions/
                a@1.0.0
                    node_modules/
                        c -> ../../c@1.0.0
                b@1.0.0
                    node_modules/
                        c -> ../../c@1.0.0
                c@1.0.0/
                c@2.0.0/
            a -> versions/a@1.0.0
            b -> versions/b@1.0.0
            c -> versions/c@2.0.0

~~~
Touche
Some other people have linked to npm alternatives ied and pnpm which do
essentially this.

------
andrewclunn
Okay, but when you run the build process, how big is the resulting
distributable?

~~~
talmand
My results for the hello world tutorial in Angular2 were 53 requests,
4933.49KB, loaded in 1.74s on local dev, according to browser dev tools. All
for one html file that had one h1 element.

Plus, it started out broken. I had to search elsewhere to find the solution to
the error the tutorial produced.

~~~
aredherring
That's not a production build.

Minify it, run dead code elimination. Exactly as you would with any other
language with a compiler.

Also, worth noting that the tutorial being broken isn't symptomatic of JS,
that's a problem with Angular (which has a history of sucking, and IMO Angular
2 just takes all of the problems with Angular 1 and adds more baggage to it).

Be aware as well that Angular 2 is a full-fledged Web Framework. Even after
all of this compression and such, it is not going to be as lightweight as
you'd expect simply due to the nature of what you've installed.

If you want something really lightweight, go with Rivets or React.

~~~
talmand
I'm confused by your response. I haven't claimed anything along those lines,
nor am I implying anything about Angular or JS. I simply communicated my
results from completing the tutorial.

