

JavaScript’s Dark Side - johnr8201
http://huffduffer.com/adactio/68511

======
hrktb
I am still not finished with the talk, but the description doesn't do it
justice. The presentation text's tone is off and stereotyped; it's just plain
wrong.

Now, the talk itself is more sensible and balanced. I wouldn't say it's
brilliant or brings many new ideas to the table; it's really about describing
different points of view and different approaches to graceful degradation.

My main gripe is being given only an audio feed of a presentation that
comments on live demos of sites and tools. Obviously video would be more
appropriate.

------
davej
Here are the slides:
<http://speakerdeck.com/u/philhawksworth/p/excessive-enhancement-sxsw2012>

------
mattmanser
_Good developers understand about graceful degradation, progressive
enhancement, unobtrusive JavaScript and the like_.

I think you were looking for the word pedantic, not good. It's as stupid an
idea as semantic HTML. Thank god that one's dying off before HTML5 really
takes hold; that could have been a right mess.

You are sitting back in the past and calling it progress, while scorning
others for trying to find the future.

~~~
jaffathecake
Do you have any evidence against progressive enhancement or are you just
trolling?

We built <http://m.lanyrd.com> using progressive enhancement. It didn't hold
us back; quite the opposite, we were able to support devices old and new. In
fact, for older devices, we use a few tricks to stop them parsing most of the
JS, keeping the performance snappy. Newer devices get offline capabilities.
Yet an old device can copy and paste a deep link to a friend with a newer
device; the URLs are the same and meaningful.
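
For anyone wondering what such a trick can look like: a common approach (a
sketch of the general pattern, not necessarily what Lanyrd actually ships) is
to gate the main script behind a small inline capability test, so older
browsers never download or parse it at all:

    <script>
    // Only fetch the main application JS on browsers that pass a
    // basic capability test; older devices skip the download and
    // parse cost entirely. "app.js" is a placeholder name.
    if (document.querySelector && window.addEventListener) {
        var s = document.createElement('script');
        s.src = '/static/app.js';
        document.getElementsByTagName('head')[0].appendChild(s);
    }
    </script>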

In terms of maintenance, I know my content's in the HTML, my design is in the
CSS, and my behaviour's in the JS. Calls to the JS that do per-page
enhancement are made at the bottom of each HTML page, so it's easy for
developers to follow the path of execution. It's explicit rather than
implicit.

I'm not saying all sites should work without JS; for instance,
<http://www.spritecow.com> depends on it. But content-driven sites should
avoid JS dependency. Ever followed a link to a tweet? E.g.
<https://twitter.com/stopsatgreen/status/181686984371208192>: a redirect to a
meaningless URL, 40+ HTTP requests & over 950k to display a 140-character
tweet? I wouldn't call that progress.

Twitter could be showing me the content within 100k of data, and that's giving
them room for a massive css file. Instead I'm waiting for 500k of JS before I
get anything.
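
To make the contrast concrete, a content-first tweet permalink could put the
tweet itself in the initial HTML response, with JS layered on top for the
interactive bits (a sketch; the markup and file names are made up):

    <!-- served as plain HTML: readable before any JS arrives -->
    <article class="tweet">
        <h2><a href="/someuser">@someuser</a></h2>
        <p>...the 140 characters of the tweet...</p>
    </article>
    <!-- optional enhancement: reply/retweet buttons, etc. -->
    <script src="/static/enhance.js"></script>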

~~~
kamjam
Not sure I quite understood his rant either, except perhaps that he has no
idea how a browser should behave, or how to accomplish certain aspects of
development without JavaScript...

Progressive enhancement is not difficult. Just build your site as if there
were no JavaScript, then add the JS bits. Easy. With all the JS libraries
around, it's even easier these days; use something like yepnope.js and it'll
even let you load scripts based on what capabilities the browser supports.
Progressive enhancement becomes difficult when it's an afterthought, maybe
when you're going into testing and suddenly realise the developer hasn't
completely read the specs!
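
For example, yepnope's test/yep/nope form loads a script only when the
browser actually needs it (the file names here are made up):

    // Load a polyfill only for browsers lacking native geolocation;
    // capable browsers fetch the enhancement script instead.
    yepnope({
        test: !!navigator.geolocation,
        yep:  'js/geo-enhancements.js',
        nope: 'js/geo-polyfill.js'
    });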

I totally understand that for some sites it is completely necessary, but for
sites like Twitter and Facebook it should not be. Sure, the JS adds a gloss
and finish to make the user experience slicker, but there is nothing on there
which could not be achieved with plain old HTML and CSS...

Also, some of us still have to support sites with JS turned off and even
(unfortunately) support IE6! It's a client requirement, and although most
likely none of their customers will have JS turned off or run IE6, the
corporate bigwigs in head office need to give final approval!

------
Tim-Boss
I couldn't get past "Phil Hawksworth is a Technical Director at R/GA and
enjoys talking about himself in the third person." without cringing!

~~~
sk3tch
That, dear sir, would be a joke.

