>A complete web framework has client-side components that facilitate two-way communication to/from the server, data binding, HTML5 history state-based page/view swapping, etc. In other words, the type of stuff being addressed by Angular, etc.

No. I can't possibly disagree with this any more strongly.

I'm probably on the wrong side of history here, but Javascript absolutely should _not_ be a requirement for using a website. If your website doesn't work with NoScript turned on, I won't utilize it. Full stop.

In my mind, Javascript should be used for progressive enhancement, for those users who opt to enable it (or I suppose more correctly: choose not to disable it). Building a thick-client web app is the fastest route to ensuring I won't use it.


I used to think like you, but I've since understood that this is the wrong battle. You have two things that are increasingly diverging yet use the same platform for distribution:

- websites as you know them, whose goal is to provide _information_

- webapps, whose goal is to provide _services_

Your point of view is completely valid for the first kind: when you want the information, you don't want the frills that go around it. You want clear pointers to it in the form of clean URLs. You want it to be accessible on your smartphone that doesn't run javascript because it's heavy and has wildly different inputs.

But the second kind is completely different. When you (as in the general you) are using GMail, you actually want to use an application to manage your mail, possibly sending and receiving messages. Whether it runs in your Chrome browser or natively in your OS matters only insofar as how easy it is to install, and on that point the web has won. But that's just a happy accident of how things have evolved.

You are not against javascript per se, you are against web apps in general. Which is totally understandable, because there are better ways to provide services on a computer than using a shiny HTML+CSS+Javascript interpreter that fetches programs on-the-fly and can't even do half of what a real OS can do.


Similarly, I find myself wandering lost and confused in the valley between "app" and "document". I've been doing front-end "app" stuff for a few years now, mostly because I wanted to write JS and thought that servers were scary and complex. But now that I've mostly gotten over that, I look at requirements and user stories and ask: could this just be done with HTML forms? (Sure, maybe we want some animation and fine-grained interactivity, but most of the time those could just be frosting on the HTTP cake.)

I guess some things are really "documents" (Wikipedia articles) and some things are really "apps" (HTML5 games), but it seems like there's a huge swath of uncritical groupthink that's saying your web thing needs to be an "app" because iPhone.


>If your website doesn't work with NoScript turned on, I won't utilize it.

You're definitely in the minority then; most users don't even know what javascript is, let alone NoScript. I find it ridiculous that people are arguing against javascript: we're using a computer and not allowing it to run code.


I was, and am, mostly on the side that argues that client-side javascript support can be assumed to such a degree that it can safely be considered a requirement in a lot of cases.

However, I do have some reservations of a more 'ideological' nature when it comes to requiring javascript when it isn't strictly necessary.

I still believe one of the most powerful things about the web is the (relative) simplicity of the request/response server-side-rendering approach. There are so many ways in which a web page can be consumed (scrapers, read-later tools, etc.), within 'fair' boundaries of course. And this often falls apart because of client-side javascript.

This is fine when it concerns a web-app that is specifically made for browsers (where ideally there's also a public-facing API). But for most other things it should not be necessary, and I believe it harms the possibilities of the open web. And quite often I find that it decreases the usability in general (back-button breakage, etc.).

Of course, my ideal solution is both server- and client-side support, used appropriately, through something like React or PhantomJS.


I'd be curious what you think of this: http://platform.qbix.com/guide/pages


> we're using a computer and not allowing it to run code.

...from sources that some people are uncomfortable blindly trusting. From a security perspective, it's a huge attack vector - the majority of browser exploits require JS to work. Even if it's sandboxed, there's also the annoyance aspect; scripts can consume CPU and do all sorts of irritating things like breaking the back button and playing seizure-inducing animations.


> seizure-inducing animations

You don't need JavaScript for that.

    div {
      animation: seizure 16.6ms infinite alternate;
    }
    
    @keyframes seizure {
      0%   { background: black; }
      100% { background: white; }
    }


You are welcome to hold that opinion. Just keep in mind that the majority of users don't even know what javascript is or how to disable it. Depending on the audience, the fact that you don't want to enable javascript may or may not be a concern to the developers.


I respect your opinion, but I will happily disregard it when it comes to building web apps. If that means I someday lose you as a customer, so be it.


Hear, hear!


I couldn't care less about losing you as a visitor to my site. It's 2014. JavaScript is required for using the web.


JavaScript is required to read any article? Why?

The apps that I work on require JavaScript, but the landing pages for these apps do not. The content sites that I've worked on also don't require JavaScript.

Not once in my life have I been reading an article on the web when I said to myself, "Man, this article sure could use some JavaScript."


Not once in my life have I ever considered switching off JavaScript. How completely silly.


It's not "completely" silly at all. For "normal" users, it's true that they may never need to do this. It's also a largely "safe" assumption that your site's visitors will have JavaScript enabled.

However there are security concerns when dealing with JavaScript. For example, if you've ever surfed the darknet, disabling JavaScript is highly recommended to avoid present and future exploits that could compromise your anonymity, funds, etc.


You are fighting a losing battle.


I suspect that many sites simply do not have staff and/or resources capable of producing NoScript-friendly (degradable) work.


Producing NoScript-friendly work is, oddly, ridiculously easy. You write your "view" code for React.JS, and render this via node.js on the server, then have the client do exactly the same. You can pick up libraries - or write your own in half an hour - which implement the glue to talk to models from both the client and the server.

You shouldn't even need to write your business logic in node.js if it doesn't fit the language well; just write an RPC server in whatever language you prefer.
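
Concretely, a minimal sketch of that setup, assuming Express and React (CommentList, the route, and bundle.js are made-up names for illustration):

    // server.js - render the same React "view" component on the server,
    // so the page is complete HTML even before (or without) client-side JS.
    var express = require('express');
    var React = require('react');
    var ReactDOMServer = require('react-dom/server');
    var CommentList = require('./components/CommentList'); // hypothetical shared component

    var app = express();

    app.get('/comments', function (req, res) {
      // Data can come from anywhere - a DB, or an RPC call to a non-node backend.
      var comments = [{ author: 'pg', text: 'hello' }];

      // Render the exact same component the client will mount.
      var html = ReactDOMServer.renderToString(
        React.createElement(CommentList, { comments: comments })
      );

      res.send(
        '<!doctype html><html><body>' +
        '<div id="root">' + html + '</div>' +
        '<script src="/bundle.js"></script>' + // client re-mounts CommentList here
        '</body></html>'
      );
    });

    app.listen(3000);

The client bundle mounts the same CommentList over the server-rendered markup, so with JS off the visitor still gets the full HTML, and with JS on it picks up where the server left off.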


I've never seen a coherent argument against progressive enhancement on served HTML.


Look, I'm all for progressive-enhancement driven development where it makes sense. But there are (many, many) websites where it doesn't make sense nowadays.

> If your website doesn't work with NoScript turned on, I won't utilize it. Full stop.

I take it you don't use YouTube? Even for the parts of the web that do work without javascript, enabling js usually offers an enormous improvement in usability. I don't want to reload the page every time I upvote someone on Hacker News, for example.
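
To be fair, the upvote case is exactly where progressive enhancement shines: a plain HTML form that posts and redirects with JS off, plus a few lines of script to skip the reload when it's on. A rough sketch (the .vote class and /vote endpoint are made up):

    // Assumes markup like:
    //   <form class="vote" action="/vote" method="post">
    //     <input type="hidden" name="id" value="123">
    //     <button type="submit">upvote</button>
    //   </form>
    // With JS disabled, the form posts normally and the server redirects back.
    document.addEventListener('submit', function (e) {
      var form = e.target;
      if (!form.classList || !form.classList.contains('vote')) return;

      e.preventDefault(); // JS is on: vote in the background instead of reloading
      var xhr = new XMLHttpRequest();
      xhr.open('POST', form.action);
      xhr.onload = function () {
        if (xhr.status === 200) form.querySelector('button').disabled = true;
      };
      xhr.send(new FormData(form));
    });

Same endpoint either way; the server needs no changes beyond tolerating both kinds of request.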

While I don't believe there's a canonical definition of what constitutes a complete web framework, I do agree with your main point: things like two-way communication and data binding shouldn't be a prerequisite for qualification as such. That would imply that 'complete web frameworks' didn't exist before 2011 or so.


I think we're already approaching the turning point where thin-client, server-heavy web services will be considered a "smell", or at least the non-canonical way of developing web apps.

It pays off to do a lot in the client (provided you have good libraries). You get a high frequency of development iterations, immediate feedback during development, and you work close to the "medium" of presentation.

I had a project that started out as a Flask web app, and all of a sudden I realized that moving things to the JavaScript world in the client (I was using ClojureScript there) was actually safer, faster and better than keeping them on the server. In the end the server did validation and database access in the sense of a Web API. It really was a clean solution.


What is your aversion to javascript? As a client-side scripting language, it enables us to deliver a much, much richer UX to the end user. It should not be put aside as just for 'progressive enhancement'. That's ridiculous. This comment wins my nomination for most backward comment of the year on HN.


Because for all the tech progression in the last 20 years, a JavaScript-heavy page is still really slow to load and use if you have more than a few tabs open.


Some might say you were on the wrong side of history... if they even knew where the line was. I doubt anybody has seen it in the past 5 years, so who is to say?

