I find more and more sites using JS for layout, and it's an incredibly frustrating experience on my phone. The whole site loads, I'm reading the article for 4 or 5 seconds, and then the JS kicks in, scrolling locks, and everything stops while the page redraws. On some sites, including Android Central of all things, this can stop me from reading text I ALREADY HAVE for upwards of 20 seconds while it reflows. Simply unacceptable.
ALWAYS keep in mind that people are coming to your site for the text almost exclusively. If your stupid JavaScript interferes with the consumption of said text you have failed as a developer.
At this point my mobile experience is: go to website, read a couple of sentences, watch JavaScript muck with the layout as you describe, then a full-screen popup wanting me to register or download the app, followed by a banner at the top of the screen with an app store link for the app, and then I get to read the article - maybe.
I think we need something that is to websites as markdown is to word processor files.
It's not exactly what you described, but in my experience "Clearly" (https://evernote.com/clearly/) does a very nice job of delivering a clean, readable, stripped-down "just the text, please" version of a given page.
On the desktop you can get that with Opera 12. It has a toggle to switch off CSS (and of course JS), and you can make it a button next to the address bar.
I find myself hitting the view source button to read (mangled) text in JS source way too often.
Or worse, unmangled text in JS source. Or text that's set to display:none until the JS gets to it.
If you're going to compile things clientside (looking at you, markdown), at least put the raw text in something the browser can display before the JS compiles it.
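To make the idea concrete, here is a minimal sketch of that pattern (the renderer here is a trivial stand-in I made up, not any real markdown library): ship the raw text in the HTML so the browser can display it immediately, and let the script merely upgrade it once, and if, it runs.

```javascript
// The raw text ships inside a visible element, e.g.:
//
//   <pre id="post">First paragraph.
//
//   Second paragraph.</pre>
//
// A trivial stand-in for a real markdown renderer: split the raw text
// on blank lines and wrap each chunk in <p> tags.
function renderParagraphs(raw) {
  return raw
    .trim()
    .split(/\n\s*\n/)
    .map(function (chunk) { return "<p>" + chunk.trim() + "</p>"; })
    .join("\n");
}

// In the browser you would then run something like:
//   var el = document.getElementById("post");
//   el.outerHTML = '<div id="post">' + renderParagraphs(el.textContent) + '</div>';
// If the script never runs, the reader still gets the <pre> text.
```

The point is that the no-JS case degrades to plain readable text, not to a blank page.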
A corollary of that is the "have to support old browsers" nonsense. Perhaps if everyone stopped supporting older browsers, the world would be a better place.
Modern browsers are free, and if someone has a particularly old, underpowered computer, there are lightweight ones available (e.g. Midori - I used to use it on a Raspberry Pi), though admittedly those have less mindshare.
Right, but these problems are not specific to JavaScript requests. Any request can fail, be blocked, interfered with, or partially received. Users can switch off CSS, too. An HTML page may not load completely. So what?
Some of these issues are fixed with HTTPS or HTTP/2. Some with CDN fallbacks or just sane coding. Some are not developers' problems - such as clueless users installing malicious extensions or luddite users switching off js.
The point being made is at the bottom - progressive enhancement. If at least the HTML makes it across the wire, a progressively enhanced website will still "work": you can still read content, click links, and submit forms.
The web is no longer a collection of text content with links and forms. Many, if not most, web apps cannot be represented using HTML alone as progressive enhancement suggests. The core functionality just isn't there.
Bullshit. Every day I see blogs and static pages that deliver the content for articles with JavaScript. I'll get to a blank page, disable NoScript, and can suddenly read the article. Why aren't sites like those using progressive enhancement?
That's exceptionally common with news sites, I've found. My theory is that there is so much tracking, advertising and "optimization" happening that it destroys the reason that brought you there in the first place.
Re-read the linked comment. There is a distinction to be made between content-based web and application-based web. The latter absolutely needs javascript (or your application language). If your site is part of the former category though, you should make sure that you keep your focus on the content; all else is secondary.
Also, I'd argue that the application-based web is still a minority compared to the content-based web, and you should check what side your site is really on.
Most sites aren't web apps--they're content-based, with text, pictures, and links. Progressive enhancement makes the most sense for the majority of content on the web.
Web applications are certainly useful in some contexts, and progressive enhancement may not be the best approach if your application requires a lot of extended interaction. But building a web app to display primarily text- and picture-based articles and documents is misguided, and it does web users a disservice.
Initial server-side rendering is still an option. Users will at least be able to read the page. And if URLs use a sane scheme, navigation would work too.
Should every chat app have a non-JS fallback option? Do we need to implement all color scheme, layout and templating tools in pure CSS?
Facetiousness aside, the web is most definitely not just text-based anymore. Whether we like it or not, the web is now in part visual (and aural) and is becoming increasingly so. This is not a bad thing. It provides a platform for rapidly developing and deploying applications that cover the full gamut of uses we have thought up for the personal computer.
Using JS simply to display static content is overkill and not the best use of the technology, and it is absolutely correct to say that this is better served by plain old HTML. But that is not the entirety of the web anymore, and the more I see the argument that every web app must support a text-only interface, the more it sounds to me like an argument for every television show and movie to be produced as a radio drama with pictures.
I'm not arguing against providing a decent notice to people whose browser setup is not compatible with the technology being used by the application. I believe that, where possible, elements should provide support for people whose interaction with the web may be limited, whether by physical disability or some other limitation. I'm not advocating dropping semantic tags and serving all text by inserting it into the DOM; I just believe that the web is moving beyond being simply a platform for disseminating static content, and while that remains important, we should not let it hamper efforts to evolve the web as a platform for new technologies and uses.
I suppose I should have more verbosely indicated the facetiousness of the first few rhetorical questions but they do serve a point.
Not an option. Concrete examples: anything WebGL, image editing, games, even a simple curl app from yesterday's post (curlbuilder.com). Implementing curl app server-side would require double work which is against principles of progressive enhancement (or programming in general).
Implementing http://www.curlbuilder.com/ with progressive enhancement (without duplicate code) is trivial - in fact it would make a great tutorial on how to do progressive enhancement.
Just implement it as a regular form-based application, then add some JavaScript which Ajax-submits the form data and loads in the generated curl command on each keystroke.
(Or you could use server-side JS and share code on the client and server)
> Implementing curl app server-side would require double work which is against principles of progressive enhancement (or programming in general).
That's exactly the use case for server-side JavaScript! Write a small library to build curl commands; use it in both the web page's JS and on the server, with a small shim on each. 1¼ times the work.
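A sketch of what that shared library could look like (the function name and option shape are made up for illustration; curlbuilder.com's real code isn't public). The same module runs in Node to answer a plain form POST, and in the browser to update a live preview on each keystroke - one implementation, two shims.

```javascript
// Build a curl command string from a plain options object.
// Works identically in Node (form fallback) and the browser (live preview).
function buildCurl(opts) {
  var parts = ["curl"];
  if (opts.method && opts.method !== "GET") parts.push("-X " + opts.method);
  (opts.headers || []).forEach(function (h) {
    parts.push("-H '" + h + "'");
  });
  if (opts.data) parts.push("--data '" + opts.data + "'");
  parts.push("'" + opts.url + "'");
  return parts.join(" ");
}

// Server shim: handle the non-JS form submission and render the result
// into the page. Client shim: listen for input events, call buildCurl,
// and update a preview element. No duplicated logic.
```

(A real version would also need to shell-escape quotes inside header and data values; this sketch skips that for brevity.)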
Almost a case for the jQuery style of web design. Even though cleaner, more solid JS frameworks allow for better apps, they render the web's goal dead, moving toward some kind of prototypical Self/Smalltalk environment: too dynamic, with not enough core state that can be watched without a lot of specific machinery.
> such as clueless users installing malicious extensions or luddite users switching off js.
So if I run arbitrary, potentially-malicious code from the Web, in the form of a browser extension, I'm "clueless"?
Yet if I don't run arbitrary, potentially-malicious code from the Web, in the form of JS, I'm a "luddite"?
Considering that many browser extensions are implemented in JS, the only difference is in control: users can pick and choose which browser extensions they want to use (I use a few; I've skimmed the source of a couple and written a couple myself), whereas enabling Javascript turns a machine into a third-party free-for-all.
Extensions have access to browser APIs not available to webpages and run on your every request. Thus they are inherently more dangerous. So installing unsigned extensions from questionable sources makes you clueless, yes.
Vanilla JS running on webpages should be considered relatively safe. It's sandboxed and not able to do anything - malicious or not - outside its restricted environment. It can't even get your location or lock the cursor without asking for permission first.
If you could somehow disable all logic from executing in your OS native apps, would you do this by default?
> Vanilla JS running on webpages should be considered relatively safe. It's sandboxed and not able to do anything - malicious or not - outside its restricted environment.
The number of exploits requiring JS versus the number of exploits not requiring JS seriously disagrees with that opinion.
JS is a dynamic language that browsers put a lot of effort into JITting to make fast while limiting memory usage. And as such, the engines are complex. And as such, they are inherently hard to keep safe.
The v8 engine has literally over a million lines of code.
> If you could somehow disable all logic from executing in your OS native apps, would you do this by default?
Yes. And I actually do this on a regular basis. There are a lot of things I'll only run in a VM, or run in a VM the first time to see what it's attempting to communicate with / do (and occasionally block said communication for later). Not perfect, but better than nothing.
>It's sandboxed and not able to do anything - malicious or not - outside its restricted environment.
That argument is like the one that Linux is more secure because a malicious program will only have user rights.
Malicious scripts can do plenty of damage inside their restricted environment. There are whole books on how to securely write web apps due to the danger of "sandboxed" malicious JS.
Personally I don't want to be tracked and profiled yet nearly every major website is running multiple tracking and profiling scripts.
>If you could somehow disable all logic from executing in your OS native apps, would you do this by default?
I do disable native apps' ability to communicate with the network unless they actually need it.
P.S. I also find that those who take a pure "webapp" approach introduce design flaws into their apps. In Toggl, for example, when browsing summaries you can't open multiple tasks in detail, because instead of using a hyperlink they use JS to fetch and display the task details. Any content link should be able to be opened in a new window/tab.
> If you could somehow disable all logic from executing in your OS native apps, would you do this by default?
I've certainly recompiled native applications to throw out 'features' I don't want, but my "default" is to just not install such programs to begin with. Especially these days, since I tend to use minimalist "do one job" programs, rather than giant "desktop environment" stuff like Gnome/KDE. That applies to my phone too, since it runs a stripped-down install of Debian.
I'm not sure I entirely agree that JavaScript should be developed in a Progressive Enhancement style, for two reasons:
If you can live without it (the "progressive" part of progressive enhancement implies that the site is still usable without the JS), then I don't think you should be using JS at all. Semantic HTML and CSS are pretty amazing at adapting content fluidly and dynamically to a wide range of users and devices. Search engines expect documents to be laid out in a certain way. Web scrapers can add a whole pile of usefulness even to sites that don't create explicit public data APIs. You're going to have a serious hell of a time making it any better, and can easily make it a lot worse.
And on the other hand, content sites aren't the be-all, end-all of the Web anymore. The Web is increasingly a deployment platform for applications, and there are a whole host of applications that just can't be done in the browser without JS. Luddites who complain about "but I got JS disabled" aren't the target market for an image-editor-in-the-browser. Failing to load JS at that point is just plain failure to load the application and no amount of Progresso Soup is going to help you.
I am one of the "luddites". While it's becoming impossible now with all of that JS Overload, I would love to be able to go to a website with JS disabled, and read the text explaining why I should spend more time on the website and trust the website enough to enable JS or some other unnecessary (in the context of content consumption) plugin.
The web is trying to do two things at once right now: Be a platform for applications and be a medium for textual, document-like content.
The former obviously needs a proper scripting language. The latter should definitely be built with progressive enhancement in mind because it's a good thing.
Why this urge, on either side, to immediately apply their issues to an area which doesn't warrant it? Apps can't be progressively-enhanced without making things stupid. Documents shouldn't rely on javascript to be readable (in most cases).
> I wonder, have there been any experiments in major browsers to execute non-javascript code?
Chrome currently supports NaCl, which allows running native-compiled non-JS code in the browser, but doesn't provide direct DOM access. A number of technologies have been used to provide non-JS code in browsers through plug-in APIs. And IE has directly supported, at different times, a number of non-JS languages in the browser as well.
IE was able to use whatever was installed on Windows, basically from the very beginning, but VBScript was the only one that was commonly used (though I think there was even a version of Lisp that worked). Mozilla (back before there was Firefox) had limited support for VBScript because of this. I remember being terrified in 1998 that there was a good chance VBScript would take off. I think I even had to make a site in VBScript to satisfy a client as late as 2006 or 2007.
Later versions of IE were able to run .NET code, which was what SilverLight was all about.
So imagine that, it was basically just MS leading the charge, which is probably why we only have JavaScript now.
> that just can't be done in the browser without JS.
And that's fine. I don't think anyone should complain when you open a webapp and it requires you to have JS enabled. The problem is that when I go to a website with JS disabled, I often don't get any message, just blank or simply broken pages.
> If you can live without it (the "progressive" part of progressive enhancement implies that the site is still usable without the JS), then I don't think you should be using JS at all.
It's a disingenuous argument introducing an artificial dichotomy.
If I can make a working website in HTML, but it works better with some additional JS, why exactly shouldn't I improve user's experience?
> And on the other hand, content sites aren't the be-all, end-all of the Web anymore. The Web is increasingly a deployment platform for applications
Somehow I don't see a lot of these "applications". All I see is overly complex websites that don't work without JS because developers don't know any better or don't care.
My point is, you probably can't actually make it better in even the majority of cases. I haven't yet seen a single site that hijacks scrolling get it right. Scrolling is for the browser.
And I do see applications in the browser, because I write them. It is really hard to imagine how 3D graphics and virtual reality have anything to do with traditional HTML and browsing habits.
Firstly, and this may surprise you: we know you exist. Yes, you with the tinfoil hat; you under a repressive corporate regime. You keep trying to make your presence known through ranty blog posts and angry comments, thinking we don't know you exist. We know, but we found that not only are you a minority, our lives as designers and developers are a million times easier if we just ignore you. Yes, I know you want to surf with vim or IE6, but that's your choice. You made your bed, now lie in it, but don't drag the rest of us down with you!
The lives of designers and developers are a million times easier if you just ignore accessibility requirements, too. But you know what? If you can't make a good design that caters to a variety of consumers, then you're not a good designer. Design is about melding form and utility, not just making something pretty. The less utility you have, the more you are an 'artist', and the less you are a 'designer'. Besides, if you don't care about anything but the most common, optimum user, you may as well return to the "This page best viewed at 800x600 on Netscape Navigator" '90s.
> You make your bed now lie in it but dont drag the rest of us with you!
By taking an extreme position for the user, but not for the designer/developer, you're not making a fair comparison.
In the same way that designers don't force all visitors to calibrate their monitor colours before opening the site in Photoshop, to be as "authentic" to the "experience" as possible; likewise those of us browsing in Emacs or whatever don't expect your Unity-powered Minecraft clone to work for us.
What we do expect is that text-heavy, 'unavoidable' sites (eg. for governments, banks, etc.) aren't built inside Unity-powered Minecraft clones.
I didn't get the point this website is trying to make. All distributed apps are like that at the time the code is delivered. If that's so wrong with JavaScript, what's the proposed alternative (that has a chance of being available everywhere)?
I think the point of this is more against requiring some JS to load and work in order to do things which do not require JS in the first place, like displaying some static text or laying out the page.
It also highlights the importance of progressive enhancement.
First thing I did, as I usually do with sites that have annoyingly small text width, was inspect element to increase the contrast and width and remove the double spacing. Then I read it. ...irony.
The thin-client revolution is finally here, and what an awful pile of shit it is. Whatever happened to software that just works?
Answer: Microsoft. Their legal team started the browser-as-OS delusion with their 1990s freakout about Netscape; this was an anti-consumer, anti-competitive lock-in strategy which now masquerades as a fundamental principle of Good Clean Living for people who give speeches about their startup's Beautiful, Human Responsive Javascript Libraries or their genius-revealing reimplementations of Emacs that can only edit one language (namely Javascript.)
Hence today instead of having a computer, we have DMCA-hardened thin clients, and with each click we download what amounts to a freshly-coded never-debugged malware-infected EULA-wrapped software update; and when you enter a subway tunnel the whole universe stops working.
Yet on the linked essay he likens turning off Javascript to removing the steering wheel from your car, which I find to be both idiotic and revealing: Your steering wheel doesn't stop working when your car goes into a tunnel. But you can see why a Googler thinks like this: Google makes a car with no steering wheel, which really might stop working properly when it loses signal in a tunnel. The real benefit of Thin Client is to the employment prospects of Javascript programmers, to the Google whose Android and browser and Javascript VM you will need to do anything useful or performant on Googlephones, and to the war criminals in technology, finance, and intelligence who carry out the destruction of human culture via airdrop of free surveillance-gathering Javascript phones, where Responsive Javascript libraries conspire with distant servers to jack up the valuations of this or that group of ten HN-darling companies, and hire three or four guys to make sure the top comment always defends Apple's weekly anti-competitive "this is what's best for consumers" move, or reminds users that There Is No Way To Prove This Isn't Another Tech Bubble.
The whole concept of browser-as-OS has turned out to serve the permanent security state and other old-money beneficiaries of the Bronze Age kulturkampf. Thanks to the thin-client revolution and DMCA, secure software that actually WORKS is mostly illegal.
That's quite the dystopian picture you paint there. I especially love the exaggeration for effect you weave throughout the essay as well as the masterful way you ignore any trends that counter your point.
You have clearly laid out many of the problems with JS.
Is there a solution?
I do not see a better alternative; I didn't start programming in JS out of spite. I tried to learn the old neckbeard C infrastructure formally, and it seemed to be a mess. Even though I understand how most of it works, if you are trying to create products of value to people you really cannot use it.
Hopefully my kids will have a better experience with computers, but for now JS will do.
If an image fails to load, the browser draws a little box with some alternate text describing that image. If the CSS doesn't load, your text and content are displayed in a weird font without the grid layout you were using, but if you wrote your HTML semantically (using <h1> instead of <div class="title">, etc.), the browser can still show most of your content, and you can still move around on the page.
If the JavaScript fails to load and you were using it to significantly alter the content on your page, for example loading a news article asynchronously, the entire page fails to load.
I don't mean to pick on this app in particular (I actually think it's really cool and I plan on using it and learning from it), but take a look at what happens to http://hswolff.github.io/hn-ng2/ when you switch off JavaScript--it's completely unusable. Now try switching off JavaScript on Hacker News--all the links and comments are still there.
> If the JavaScript fails to load and you were using it to significantly alter the content on your page, for example loading a news article asynchronously, the entire page fails to load.
Worse. Half the time the content is sent synchronously in the initial HTML, but kept hidden until a JS script has its way with it. Looking at you, Markdown.js.
Lazy, that is what I call it... In the past the rule was to degrade gracefully if something keeps the JS from running. Nowadays it's like "screw you, we can't be bothered to make it mostly work with CSS, so you're out of luck if the JS didn't run".
For me, I run with noscript. Most of the places I frequent (here for example!) work great without any JS, other places less so, but I can selectively enable what needs to run, and many times when I find those sites that simply don't work without javascript, I just click back, and ignore them. I don't trust sites that can't at least load some basic content to convince me they aren't just a malware vector.
Ten edge cases still make for an edge case. It's good to think about what we're doing, but for all practical purposes, yes, "Everyone has JavaScript, that's right!"
I don't think the usability of a site or load times should require caching. Caching should enhance, but it should not be a fundamental requirement to make a site not suck. Caching is salt and pepper but shouldn't be the steak.
While the point may be true, it's like saying don't drive a car because people die in car accidents.
The post is misleading, as most of the cards present rare edge cases where JS doesn't work. Show percentages as well.
Yes, JS may not work 0.1% of the time. So let's fix it once all the higher-priority issues are fixed.
There are two interpretations of the situation. Either that 1.1% is a totally pressing issue and almost the entire industry is idiotic for not getting it; or the industry is actually pretty smart, market forces do their work, and it turns out not to be as pressing an issue relative to other things. I subscribe to the second idea.
Accessibility for 1% can be a pressing issue if you consider your site/service important and you do not want to keep disabled users out. Just food for thought. Letting the market force people away is not very social.
That's in the same ballpark as saying there are plenty of reasons to not use fossil fuels or there are plenty of reasons to not use English as a lingua franca or plenty of reasons to not eat meat.
There are lots of good reasons to not do any of those things but until you come up with a universally acceptable alternative and convince the majority of people to adopt it, you're shit out of luck.
If CDN's based in the US need to do maintenance, they will probably do it late at night in the US, which might be at peak time for you if your website is based outside of the US.
Did you even read the linked page? People turning off JS is one of 8 reasons given. In my experience (never turning off JavaScript), it's pretty reasonable. My work firewall sometimes blocks CDNs, so any website not serving its JS from its own domain may not load properly. I often browse on my phone on a train. Reception is patchy at best. You load websites during short windows of opportunity.
Maybe you don't care about people who turn off javascript, but perhaps you should care about people at work and people on phones.
Using IE5 and Netscape Navigator as straw men to ridicule progressive enhancement completely misses the point! Progressive enhancement is about ignoring user agents.
If you use PE, then you just-so-happen to get IE5 and NN support, but you also get support for Chrome, IE10, FF, Safari, mobiles, etc.
If you don't follow PE, you're playing a high-stakes game of whack-a-mole: even if you only care about a few browsers, the combinatorial explosion of version/OS/screen-size/connection-speed/etc. configurations to keep track of can get ridiculous. If you miss out one combination, it can be game over: lost customers due to a white screen of death, or something equally unusable.
You could keep pouring more effort into whacking moles, or you could reduce the stakes of the game. If your "game over" scenario consists of a product screen with images, description, reviews, a search box and an order form, but no hover-over zoom feature, it doesn't matter anywhere near as much. You no longer need to worry as much about missing a few configurations, and can focus on fixing bugs and adding features. The fact that it might also work on IE5 and NN is just a bonus; it's certainly not the point.
Most people create websites to make money, either by selling their services directly or by advertising themselves or someone else. Not building your website to support as many people as possible is like creating a TV ad that only works on TVs that support HD.
Actually, a number of ads have content outside the bounding box for SD streams (though some SD streams now cut things down even more by letterboxing because of this). Ads were the first, and are still the most aggressive, at putting content well outside the 4:3 frame.
Because if you can create an effect that works, it can be worth more than whatever small number of users are on SD, especially as those are likely out of the demographic anyway.