
The other kind of JavaScript fatigue - moneymakersucks
http://chrismm.com/blog/the-other-kind-of-javascript-fatigue/
======
vivin
Realized this some time ago, when I was wondering why the Java ecosystem
doesn't have this problem. It boils down to the barrier to entry. In JS it's
so low as to be almost nonexistent, which is why you end up with junk like
isPositiveInteger. Slap some shit together and ship it.

You can't do that easily in Java. You need to spend some time understanding
the language, the ecosystem, how to bundle your code into an artifact, and
then how to release and publish it; the barrier to entry is much higher
because there is a lot more to learn and understand.

The article mentions Android - I haven't worked much with Android and so I'm
not familiar with the libraries there. Is there the same sort of problem in
the Android ecosystem that you see in Node? I wonder if he is talking about
the fragmentation of Android _implementations_ (flavors from different
providers), which is a different thing entirely. Seeing as Android uses Java,
I'm thinking the ecosystem is not like Node.

~~~
cm2187
I think it's rather that the Java and .NET frameworks are pretty good out of
the box. So there is only a need for additional frameworks in certain corner
cases. And then some get bought by Microsoft and integrated out of the box,
like the data-visualization libraries or Xamarin.

JavaScript, on the other hand, is a language that only its creator could love.
All these frameworks are more or less required just to be barely productive,
which is why you end up with these big frameworks to abstract over the various
browsers, introduce data binding, or add static typing.

~~~
douche
JavaScript has a terrible embarrassment of a standard library. Things that
should just be baked into the language are not, which has led to attempts to
rectify that shortcoming, like jQuery, Underscore and Lodash. Because this
standard library isn't implemented in the browser itself, where it belongs,
people have gotten upset about downloading a few hundred KB of a general-
purpose utility library, and have instead reimplemented different bits of it
in a thousand and one ways, nearly all of them more or less subtly broken. (I
don't understand the obsession with JS library size; it's completely
irrelevant in 95% of cases, compared to the megabytes of other garbage most
sites will pull in for trackers and images.)

~~~
cm2187
And not just in javascript. HTML is largely the culprit too. A lot is done in
javascript that HTML should really handle.

For instance, why can't we just add a URL attribute to an input to provide
autocomplete and validation? It is such a common scenario that there shouldn't
be a need for a JavaScript framework to provide it.

Same thing with responsive design.

But these technologies are stuck in the 90s and barely evolve anymore.

~~~
krapp
>For instance why can't we just add a URL attribute to an input to provide
autocomplete and validation.

You can. <input type="url" /> is part of HTML5[0].

Of course, support may not be universal[1], which is why you will probably
always have to use javascript if cross-compatibility matters.

[0][https://www.w3.org/TR/html-markup/input.url.html](https://www.w3.org/TR/html-markup/input.url.html)

[1][http://caniuse.com/#feat=input-email-tel-url](http://caniuse.com/#feat=input-email-tel-url)
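A plain-JS fallback in that spirit is roughly what sites end up shipping when `<input type="url">` isn't supported. A minimal sketch (the helper name is made up, and it leans on the WHATWG URL constructor, which has its own browser-support caveats):

```javascript
// Hypothetical fallback validator for browsers without <input type="url">.
// new URL() throws on strings it cannot parse as absolute URLs, and it
// accepts any scheme, so we additionally restrict to http(s).
function isValidHttpUrl(value) {
  try {
    const url = new URL(value);
    return url.protocol === 'http:' || url.protocol === 'https:';
  } catch (e) {
    return false; // not parseable as a URL at all
  }
}

isValidHttpUrl('https://example.com/path'); // → true
isValidHttpUrl('not a url');               // → false
```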

~~~
cm2187
No, by URL I meant the URL of the service that would provide the autocomplete
or the validation. Like:

    <input type="text" validation="url1" autocomplete="url2" ...

~~~
telotortium
I'm not sure how either of these would help:

- validation: Fundamentally, to be able to trust the input, the server needs
to validate it in its final state. The only reason validation should be done
on the client is to reduce the latency from the time the user inputs the data
to the time the validation result is reported, so the user can fix their input
faster if it fails validation. Sending it to a URL before form submit doesn't
make sense when the input will have to be validated after form submit anyway.

- autocomplete: It's unlikely that implementing this in native code would be
faster than in JavaScript. It's not significantly faster to make an HTTP
request in native code; native code is more advantageous for heavy
computation, graphics, and multithreading. And yet, it's easy enough to make
HTTP requests from JavaScript that an autocomplete attribute wouldn't add much
value.

~~~
cm2187
Being faster is not the point. Having to rely less on JavaScript libraries and
more on the built-in framework is the point.

And client-side validation is not a substitute for server-side validation,
merely a better UX.

------
jaequery
"Embrace change, and it will make you a better developer."

I don't agree with this at all. I think a good coder has complete mastery of
their code and tools, which is only possible by putting significant time into
one thing instead of jumping from one thing to the next every so often.

Imo the best way to become a better developer is not by using what others have
created, but by trying to create libraries and frameworks yourself, to truly
understand, or at the very least think about, the ins and outs of the code,
think thoroughly, and produce a better architecture. It will open up your eyes
to differentiate good from bad code rather quickly.

~~~
kowdermeister
And you waste huge amounts of time rewriting stuff that's already out there,
tested by thousands of people.

Also, collaborating with others becomes much more challenging because you
invented everything yourself.

With experience I can minimize self-written code and skip plugins and
libraries, but I feel better if I can rely on something solid.

This does not apply to every field of programming per se, but web development
is a perfect example.

~~~
SixSigma
"Building a framework to understand frameworks" is not the same as "using your
own frameworks on your professional projects".

I have written my own Forth, my own filesystem, my own MVC, etc., and I would
say they have improved my skills more than learning a new Algol-derived
language would have, but I would never advocate their use in production.

~~~
kowdermeister
For learning, it's a good idea to make something from scratch; way better than
writing todo apps :) It can even turn out better than the original. For
productivity, however, that's not the case.

------
cyberpanther
This is nothing new; JS just evolved the problem to the next level. Perl and
CPAN had notoriously bad modules, and each subsequent language has made
packaging easier. Now JS is available with easy packaging, or just by
including a URL to the package. This problem is inevitable; JS is just the
most widely distributed and used language at the moment.

I think we have to learn to live with crap code and find ways to surface the
diamonds. A rating system for packages would be great. Maybe based on
developer ratings, the amount of contributor/community interaction, and the
number of unresolved issues?

~~~
vivin
I guess my experience with CPAN has been different. Probably the most
frustrating thing is the paucity of documentation for certain Perl modules.
But the modules I've found and used have been of pretty good quality and
pretty stable.

Then again, I've never built anything "enterprisey" in Perl (just little tools
and scripts here and there), so that could be it.

~~~
cyberpanther
I think you might have hit on something here. Because Perl is definitely hard
to read and the documentation bare, the modules might have been fine but we
just never knew how to use them.

~~~
felixgallo
Perl can be hard to read, but it has incredible documentation and CPAN set
(and apparently still keeps) the diamond standard for language packages and
distribution systems.

------
n0us
One factor that leads to fragmentation is a theme that I seem to see over and
over: when a project is popular, the creator gets to be the GitHub maintainer
and take credit for its success as a large and popular project, which by
itself is a good thing.

The bad part is that when the maintainers make unpopular decisions (like
disabling all functionality by default and relying on submodules, not wanting
to fix some obviously horrible bug because it's a "feature", or abandoning the
project for months at a time), there is often a cop-out mentality that goes
along the lines of "well, it's my project and I didn't guarantee anything when
you decided to use it, so why don't you go make your own." Thereby they take
credit for the project's successes but not responsibility for its failures. So
then people do go and write their own implementations, and we end up with half
a dozen half-baked libraries.

An example: Underscore and lodash are both very good, so they are not examples
of half-baked libraries, but do we really need both of them? People will just
say "well you only have to use one in your app... yada yada yada", but the
problem is that if I want to rely on any other npm packages in my app, are
they using lodash or underscore? I'll probably just end up with both of them
shipping in the bundle, because for whatever reason there couldn't just be one
popular utility library; there had to be two that do basically the same thing.
People will respond to this and say "well why don't you fork those modules
that rely on underscore and make them use lodash?" My answer is no. I don't
want to fork stuff. I just want to be able to find modules that I can rely on
with a reasonable expectation of quality and maintenance. I'll even help
maintain if it doesn't seem completely futile.

There may be some advantages of lodash over underscore, but those advantages
are minor in comparison to what would be gained by having a single utility
library that everyone is on board with.

Don't even get me started with routing libraries for React.

~~~
Bahamut
Most open source maintainers are developers who work full time. I am a
maintainer of a major project, but there are times my activity is light, such
as the past two months. In that time, I travelled to Salt Lake City, Reno,
Portland, San Diego, and Seattle, and am now en route to Chicago; I ran a half
and a full marathon in that span, and am about to run another half tomorrow
morning. I have been in crunch time at my current job for the past two weeks,
and I am about to leave for a well-earned 2 1/2 week vacation to England,
France, Switzerland, and Italy. I don't plan on coding much, if at all, while
in Europe.

I very much enjoy working on open source, and have implemented some tricky but
highly useful features over the past almost 1 1/2 years of my stewardship that
users greatly appreciate. However, I also have a rich life outside of
development, and I believe that each person's choices about how to use their
time should be respected. If you are not paying, or contributing your own time
developing features or assisting with maintenance, complaining about
maintainer absence is really poor form. We are not on-demand tech support.

~~~
StevePerkins
In the Java world, most major open source projects are sponsored by companies.
Someone(s) is getting paid to maintain and evolve that thing as part of their
_job_. Companies might do this to drive traffic toward the commercial
"enterprise" version, for which they charge money. More commonly though, it's
just something that they use internally... and they open it up for
marketing/prestige/recruitment purposes.

In most other language ecosystems, most open source projects tend to be driven
by individuals as unpaid side projects. That's great in a certain sense, and a
large part of the reason why Java is less "cool" among young people who are
eager to plant their own flag on an open source thing. But sadly, it's just
really difficult to keep a major side project alive and healthy over the long-
term without sponsorship. So those ecosystems tend to be chaotic and flaky.

~~~
n0us
Couldn't have said it better. In the webdev world this turns into a problem
because I need to rely on these OSS projects, and while some of them might be
high quality, there is significant overhead involved in determining which ones
are worth using and which ones I can rely on. For personal projects it's not a
big deal, but for work I want to be as efficient and reliable as possible, and
the chaotic/flaky ecosystem gets in the way.

edit: fixed to *not a big deal

------
appleflaxen
> Remember Google’s Polymer? Angular 1? Express? Perhaps the organizations and
> individuals which cause these abrupt termination events should carry a
> stigma.

Why is Polymer on this list?

~~~
havefunwiththat
Same question. As far as I know, it's a pretty active project. Did the author
mean the jump from v. 0.5 to v. 1.0? That had some breaking changes, but that
was expected.

------
thesmart
I think the convenience of GitHub and NPM is a good feature, and it's easy to
blame ease-of-use as the culprit. I have often pondered how remarkably poor
these tools are at representing repo quality. There are some basic questions
to ask before accepting any unknown source as a module. What are the core
problem and requirements the source addresses? Does it verify the solution,
and how? What is the reputation of the core contributors? Is it used by any
serious institutions? (Aside: download count and star count are a measure of
herd mentality.) What are the open issues? What are the closed "won't fix"
issues? How often do issues regress? What is a safe version? How is the
versioning managed, and how should I pin it? How many dependencies do I assume
from this library? How severable are those dependencies?

It would be useful if package-management tools facilitated the process of
understanding the liability imposed by a package, but the opposite is
encouraged. Authors too often put up shiny marketing materials and make bold
statements about the utility and vision of their software. It would be
refreshing if authors were as open about the flaws, trade-offs, alternatives,
etc., but few are.

Perhaps some kind of rating or feedback system that is qualitative in nature
would help mitigate the salesmanship? I'd love to know who has been burned by
a project and anecdotes about how packages are used by others. Ratings around
issue resolutions would also be helpful, how often have we all had major bugs
dismissively closed by maintainers?

TL;DR there are ways to develop a comprehensive assessment of a repo, but our
tools are lacking and need much improvement in this regard.

~~~
extrapickles
I would like this idea extended to versioning, where you get a simple release
number and a risk number. Even among projects following semver, they all have
a differing idea of how big a change is, and semver doesn't cover bugfixes
that make breaking API changes.

If a new version simply incremented the release number and estimated the risk
of issues relative to the previous version, it would be easier to tell whether
you wanted to update. Also, after a project has a few releases under its belt,
you can normalize its risk against other projects' risk levels (e.g.: project
x always underestimates).

For open source projects, it enables tooling to look at the functions touched
and the functions used by your code, and add risk to the update.

An example:

You are 3 versions behind; the pending updates are versions 15/42 (minor
bugfix), 16/500 (new minor feature) and 17/32000 (major API change). The risk
of updating all the way is 42+500+32000. If you peg the risk budget for an
automatic update at 1000, then you would only get the first two.

The same thing in semver: 1.0.1, 1.1.0, 2.0.0

While it's somewhat easy to gate on semver, it doesn't lend itself to
automated risk assessment, as it would be much harder for a tool to tell the
difference between a bug fix and a new feature (too many bug trackers out
there).
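The gating rule described above is simple enough to sketch (the function name, field names and risk scores are invented for illustration):

```javascript
// Walk pending releases in order, accumulating each release's
// author-estimated risk score; stop before the cumulative risk
// would exceed the auto-update budget.
function selectUpdates(pendingReleases, riskBudget) {
  const accepted = [];
  let totalRisk = 0;
  for (const release of pendingReleases) {
    if (totalRisk + release.risk > riskBudget) break; // too risky from here on
    totalRisk += release.risk;
    accepted.push(release.version);
  }
  return accepted;
}

// The example from the comment: a budget of 1000 takes the first two updates.
selectUpdates(
  [{ version: 15, risk: 42 },
   { version: 16, risk: 500 },
   { version: 17, risk: 32000 }],
  1000
); // → [15, 16]
```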

------
john2x
<rant>

I blame Github, in particular its "stars" feature. It makes putting out code a
popularity contest.

"Why should I make someone else's project more popular? I'd rather spend my
free time making myself popular."

</rant>

~~~
krapp
Github seems to want to be a social media site first and a git host second.
Which is why I exclusively use Bitbucket now.

~~~
kowdermeister
Why is socializing with other programmers a bad thing? They manage the core
repo hosting quite well. They also act like a solid CDN.

~~~
krapp
>Why is socializing with other programmers a bad thing?

It's not - socializing with programmers is why I'm on Hacker News. But,
socializing isn't what Github should be about. Github should be about
development, first and foremost. Having issues and comments support pointless
features like emoji and voting and images (and image memes) dilutes the focus
of the site and moves it towards being just another forum.

Also, having Github act as a social network injects trolling, politics and
drama into the development environment, and I believe development should be
apolitical and acultural. I dread the (extremely unlikely) prospect of
publishing a project to Github that actually becomes popular and accepting a
PR from someone whose political views I may personally despise, but whose code
is acceptable, or the backlash if I don't abide by a particular Code of
Conduct. Not everyone wants to deal with the cesspool that is modern "social
coding" or to build a "community" or to have the number of stars on their
account validate their resume.

I just want a place to host code and maybe take pull requests and not have to
pay for the privilege of having a private repo.

------
EugeneOZ
"JavaScript programmer" is a term similar to "vegan" - nobody cares if they
only use one programming language for all tasks, but they constantly tell
others about it.

What is this article about? Snippets from SO answers are not as good as
battle-tested code? That's obvious, and it's the same in all languages. You
can't find a library for any given task in the tracker you use? Same thing.

~~~
vivin
> What is this article about? Snippets from SO answers are not as good as
> battle-tested code? That's obvious, and it's the same in all languages.

It's much harder to ship SO copypasta in, say, Java or C# than in Node.
JavaScript is a much easier language to grasp, and it is also a lot more
forgiving and loose. Combine that with NPM and you can suddenly be immensely
"productive". But the ability to write and push a lot of code very fast also
means it is very easy to write and push a lot of _bad_ code very fast.

~~~
EugeneOZ
It only depends on the programmer. Some copy-paste from SO to production; some
write tests and run them before deployment. There are a lot of dynamic
("forgiving") languages, so JS is not unique here.

~~~
Illniyar
I've seen people change Java code in production to fix a bug and recompile it.

This is hardly a behavior that exists only in dynamic languages.

~~~
vivin
That's a different issue entirely. The article is talking about the quality of
frameworks and libraries in JS.

~~~
Illniyar
Perhaps, but I was replying to the parent comment, which did talk about
dynamic languages and copy paste code to production.

------
lwf
> Richard Stallman envisioned open source

No, he envisioned Free Software.

[https://www.gnu.org/philosophy/open-source-misses-the-point....](https://www.gnu.org/philosophy/open-source-misses-the-point.en.html)

------
spacehunt
> The fragmented nature of large open source ecosystems has become evident
> since the advent of Android.

Is the author talking about the fragmentation of libraries and tools for
Android, or the general "Android fragmentation"? If it's the former then I
haven't experienced it, certainly not to the extent of the Node ecosystem.
Yet...

> [...] because languages with smaller communities such as Go [...] don’t yet
> suffer from it

... I disagree, take for example the tons of different projects trying to
"solve" the web framework problem, or the dependency problem, etc.

------
oever
_Instead of nurturing narcissistic language ambassadors that drop their
projects like they change fedora hats every time they get a new idea, let’s
create more tools to improve code quality and foster a sense of community.
Human progress is not going to happen by default. That was not the case with
clean energy, or with quality education, and it will not be different with
open source fragmentation. These problems require active monitoring and
organized effort._

This is why I'm part of KDE. It's a large, diverse and productive community of
people who want to bring Free Software forward. KDE turns 20 this year and
it's still growing and evolving. New developments come in, but they are
reviewed and nurtured in the community before being released under the KDE
flag.

JavaScript could use a community like that, where there is a common set of
tools and values. The JavaScript that I see out there usually has very little
quality control. It's easy to make something that looks nice and does not
crash, but scaling up to an application that is complex and stable is hard. I
learned this when developing the (now resting) library WebODF. JavaScript
comes with great tools like JSLint, Closure Compiler, and Jasmine, but these
are rarely used strictly.

Very few JavaScript developers have read 'JavaScript, the good parts' which is
essential reading when writing non-trivial JS.

Node.js promises the ability to reuse code on the server and in the browser,
but does not provide a module solution that makes that possible and works with
the tools mentioned above.

Competition between KDE, GNOME and others on the Linux desktop exists because
there is only one desktop on your computer. Javascript lacks such a focal
point and JS framework developers can start new projects because it's easy to
have a different half-baked framework in each browser tab.

You cannot build a cathedral out of market stalls. (KDE is the Sagrada Familia
in this simile)

------
acjohnson55
This sounds pretty much the same as the original concept of JavaScript
fatigue: too many libraries.

That's an interesting example cited by the author considering the behavior he
desires isn't standard. There are multiple possible approaches to serializing
an object tree to a query string (and reasons why, conceptually, you might not
want to do this in the first place). Incidentally, when I searched for this on
StackOverflow, the Q&A I found has a top answer that does deal with nested
query strings [1].

[1] [http://stackoverflow.com/questions/1714786/querystring-encod...](http://stackoverflow.com/questions/1714786/querystring-encoding-of-a-javascript-object)
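For concreteness, one of those approaches (jQuery-style bracket notation) fits in a dozen lines. This is only a sketch of the convention; it makes no claim about what any particular server will decode:

```javascript
// Recursively flatten nested objects/arrays into bracket-keyed pairs,
// e.g. {a: {b: 1}} becomes a[b]=1 (percent-encoded on the wire).
function toQueryString(obj, prefix) {
  const pairs = [];
  for (const key of Object.keys(obj)) {
    const name = prefix ? `${prefix}[${key}]` : key;
    const value = obj[key];
    if (value !== null && typeof value === 'object') {
      pairs.push(toQueryString(value, name)); // recurse into nested values
    } else {
      pairs.push(`${encodeURIComponent(name)}=${encodeURIComponent(value)}`);
    }
  }
  return pairs.join('&');
}

toQueryString({ a: { b: 1 }, c: [2, 3] });
// → 'a%5Bb%5D=1&c%5B0%5D=2&c%5B1%5D=3', i.e. a[b]=1&c[0]=2&c[1]=3
```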

------
nzoschke
I see his point and I think another dimension of the tooling is a root cause
too...

> Recently I needed a library to build query strings, just a small one so that
> I wouldn’t have to include jQuery just for that. After a couple of hours of
> research...

jQuery clearly is the best library to look to for a mature implementation. So
why not just use it? Hours spent looking for alternatives are actually really
expensive.

The justification is that jQuery is too big.

First, it's probably not too big. CDNs and browser caching exist to optimize
the problem upstream.

Next, the dynamic nature of JavaScript is to blame. You can't easily extract
and compile just the functions you need.

Ideally we could all leverage bits of jQuery instead of poorly rewriting parts
of it in the name of minimalism.

~~~
scotttrinh
> Next, the dynamic nature of JavaScript is to blame. You can't easily extract
> and compile just the functions you need.

Definitely check out rollup, if you haven't heard of it. We (the JavaScript
community) are working towards solving this!

~~~
nzoschke
Thanks for the tip, rollup looks really powerful.

This and related comments remind me that the latest ECMA stuff offers a
brighter future. The challenge is getting there...

Can/will jQuery work with rollup and all the other new module work? Or does it
have to be effectively rewritten?

~~~
scotttrinh
It needs to be rewritten, unfortunately. However this isn't just rewriting it
so it works with one tool, since this is the new module syntax. I imagine that
a future version of jQuery, and probably most libraries going forward, will
_want_ to take advantage of ES2015 modules, even if it's just to aid in their
own internal development. Since we have module bundlers like rollup, it would
be easy to still provide an ES5 bundle for environments that do not use ES2015
modules yet.

------
keeringplastik
My JavaScript fatigue led me to disable it on my phone.

Seriously. It's been about nine months.

I use the exceptions feature to add in common sites where it is required.

I have been surprised over the course of this experiment by how little it is
really needed for casual browsing.

And it has led to a decline in my consumption of garbage Internet, by forcing
me to take the time to add the exception, which leads me to question whether
the content is really worth the effort.

I have thought about building a feature that can do quick, single-instance
JavaScript exceptions.

However, I fear it would undo the good done by the natural filter on my
surfing.

And as a bonus, my data usage decreased considerably.

------
Cozumel
Not knocking the guy at all but..

'Recently I needed a library to build query strings, just a small one so that
I wouldn’t have to include jQuery just for that. After a couple of hours of
research, I had found several candidates'

That right there is the problem: learn your language. You don't need to spend
two hours looking through other people's code to solve your (string!) problem.
Learn the language and write it yourself; it's not 'rolling your own' when
it's something so basic.
------
scotttrinh
The basis of the complaint raised in this article is that the author didn't
want to include jQuery just for the query-string builder. I think rollup and
ES2015 modules are taking us in the right direction, allowing large projects
like jQuery and lodash to be kitchen sinks but be used piecemeal. I think that
will get us a long way toward the less-filesize-sensitive ecosystems the
article compares against.

------
rosalinekarr
Maybe what we need is a better system for assessing and comparing JavaScript
projects. In the Ruby world, RubyToolbox has a great system that tells you
about a gem's popularity, update frequency, age and a ton of other stuff.

Is there a similar project for the javascript world?

~~~
sebcat
Or just review the code. It's not hard, and if you add a dependency to your
code base you should review it. If you lack the ability, time or <insert
random excuse here> then you should not add it as a dependency.

DRY only goes so far. If you can do the same thing with a couple of hours of
work, it's probably not worth adding an external dependency that you need to
track over time.

------
g8oz
_Remember Google’s Polymer? Angular 1? Express? Perhaps the organizations and
individuals which cause these abrupt termination events should carry a
stigma._

A fantastic idea. There should be some social consequences for these sort of
faithless and feckless types.

~~~
Illniyar
Angular 1 is a shame, and it's absolutely right that it shouldn't have been
abruptly replaced (though it was not terminated).

But Polymer was and still is barely an alpha, and was practically never used
for anything real (beyond showcases).

Express is 7 years old and still going strong; it is continuously updated and
was never terminated or abandoned. Koa.js is just an extension of Express.js
with new ideas, accommodating the changes the node.js ecosystem has undergone
over those 7 years.

~~~
angersock
Yeah, I still use Express all the time. I wouldn't put it--or Angular 1--on
that list.

------
woah
This doesn't really add much to the discussion. Sure, write better code. We
all want to do that.

To the author: I believe that the node standard library has the query string
tools you are looking for.

~~~
endergen
My takeaway was that we need to work towards converging efforts rather than
diverging and causing constant fragmentation and lost effort. Being easy to
publish is a good thing, but people need to try to reduce noise and fix
existing good-enough projects to make them great and robust.

Contrary to my above points, I'm super into Redux and React and feel they were
large leaps in design over what came before. I've been happier and happier
with React over the last two years. Redux is also consistently making me
smile.

~~~
zzzcpan
> My take away was that we need to work towards converging efforts rather than
> diverging and causing constant fragmentation and lost effort.

Fragmentation and lost effort are essential for progress. Things that you use
today were created because someone decided not to work towards converging
efforts.

------
andrewclunn
Wait, Angular 1 is dead? That's news to me.

~~~
throwanem
It's soon to be no longer developed, because the Angular team saw a React talk
and decided to build their own version that nobody seems to want very much.

------
Illniyar
There is another upside to modules being easy to make and publish - they are
also easy to change.

The query parser you want isn't able to handle nested objects and arrays
easily? Fork it, add your small piece of code (to what is hopefully an
already-small module), use it with a direct reference in your package.json,
and send a pull request upstream.

The fact that libraries don't handle your specific edge case or special need
is not unique to JavaScript; at least you have an easy way out in node's
mentality of small modules. Try getting a small change into Spring or
Hibernate or any of the other massive frameworks that require days just to get
familiar with their API and lifecycle hooks.

Consider this: I can easily go into the most popular framework in node
(express.js) and change any part of the code within a few hours to match what
I want, including adding tests, and there is a non-zero chance that my changes
will be pushed upstream. If I ever needed to do this to Spring MVC or even
Racket, it would take me weeks, would most likely end up breaking a dozen
other edge cases, and would never be accepted by the foundations managing
those projects.

~~~
vivin
That hasn't been my experience. I recently submitted a bug fix to Qt WebKit
and a feature to Swagger. Neither of them broke anything and both were
accepted. The submission processes for both projects were diligent enough to
ensure quality contributions, yet simple enough that they were not byzantine.

