
Some sites require JS because they do weird things like rendering stuff clientside.

This site "requires" JS because... The opacity of the article is set to 0? Everything's there, everything's rendered, but they have a CSS rule (".use-motion .post { opacity: 0; }"). Only thing I can think of is that they do a JS fade-in or something and missed the unintended consequences.

(Speaking of which... Doesn't Google penalize sites that have an excessive number of hidden keywords? I wonder if the entire article being hidden qualifies...)




Today's web is ridiculous. It looks as if developers and designers are increasingly becoming a bunch of kids with delusions of grandeur and IQs around 50, living in total denial about the reasonableness of their work.

Just imagine you're going to the newsstand to buy a newspaper, and instead you get a package containing blank sheets of paper, a microfilm scanner and a roll of microfilm, with instructions telling you to go home, scan the film and print the content yourself. You'd be furious. You'd write to the publisher, asking them what the fuck they're doing. Even ignoring the additional time and money you'd have to spend printing the newspaper yourself, it surely must be cheaper to print it all with industrial machines than to make equal amounts of goddamn microfilm and scanners for it! Economies of scale and stuff.

And yet this is normal on the web. Prepping content for client-side JS generation takes a similar amount of time (if not more) as just rendering the page on the server. And then they have to send a) the content, and b) the JS to render it, which together take more space than a rendered page, and then every client has to render it on its own. They're wasting electricity at every goddamn step of the pipeline, from the servers through the network to the clients. It's like electricity is free, so they can all show everyone a giant fuck you - because hey, if it's free, then let's waste it all.

Ok, I got carried away. But my point still stands. Dear JS-first website maker - I'm not willing to view your ads until you start covering my electricity bills for all the coal that got wasted on your idiotic client-side-rendering contraptions.


YES! There's no reason for websites that are just documents not to work with javascript disabled. For that matter, there's no reason they shouldn't work in text-based browsers (this one still might, since the opacity isn't going to matter in lynx). They work by default, and the only reason they stop working is that people start adding too many gimmicks.

And with modern tools, there's no reason that anything — even a highly interactive app — couldn't be prerendered on the server.


I feel like I'm about to wander into a debate that's over my head, but I don't see why one computer (or one small group of computers) should be responsible for doing work that could easily be performed in a more distributed way by people's browsers, if only because it's cheaper to offload that work onto the client and it leaves fewer ways to crash the server-side application.

I'm with you 100 percent when it comes to "I can't read this text document because it needs 45 JS libraries to render," that's stupid. But it's probably stupid because it's over-engineered, not because it uses the browser.

I feel like this is going to become a bigger and bigger debate in the coming years, and I'm eager to be proven wrong, or at least to understand the other side. But if I'm building the next great web-based spreadsheet application (I'm not), my immediate and overwhelming success is going to be WAY easier to manage if the majority of my code is being executed on the millions of computers calling in to use it, not on the three servers running in my auto-scaling group. Why WOULDN'T I pick that option?


> I don't see why one computer (or one small group of computers) should be responsible for doing work that could easily be performed in a more distributed way by people's browsers

For one reason, because that one computer can usually do it (or at least most of it) once, and instead you're forcing millions of people to redo the same computation themselves. That's just... wrong.

> But if I'm building the next great web-based spreadsheet application (...) Why WOULDN'T I pick that option?

Well, in this case it is the right way, because you need to interact dynamically with data entered by the user (in this case, people usually make another error - they send data to the server that there's no need to send; but that's another topic. [0]). You're writing a web application. But a web site - landing pages, blogs with articles, etc. - has exactly zero reasonable need for rendering everything client-side. It's just doing the same computation millions of times because someone was too lazy to do it once.

[0] - actually, it's not. One could notice that most of the problems with current web come from two things: sending the code that should stay on the server to the client, and sending data that should stay with the client to a server.


I think we might agree with each other, but you are being clearer than I was. If we are looking at the web as the miraculous document-exchanging network that it was, then yes, Javascript may be literally ruining everything. If you can render something once, absolutely do it once; anything else is a huge waste.

I was more disagreeing with the concluding assertion that "with modern tools, there's no reason that anything — even a highly interactive app — couldn't be prerendered on the server." There's no reason it couldn't be, sure, but it's way harder when the clients are just as capable -- and, once things start getting busy, probably even more capable.


Let me tell you a (probably-apocryphal) story:

There was a giant multinational hamburger chain where some bright MBA figured out that eliminating just three sesame seeds from a sesame-seed bun would be completely unnoticeable by anyone yet would save the company $126,000 per year. So they do it, and time passes, and another bushy-tailed MBA comes along, and does another study, and concludes that removing another five sesame seeds wouldn't hurt either, and would save even more money, and so on and so forth, every year or two, the new management trainee looking for ways to save money proposes removing a sesame seed or two, until eventually, they're shipping hamburger buns with exactly three sesame seeds artfully arranged in a triangle, and nobody buys their hamburgers any more.

(http://www.joelonsoftware.com/items/2007/09/11.html)


I think I see the connection, but I think it jumped over one piece of my argument (if my statement could even be called that): In the burger example, removing sesame seeds is an act that makes the product objectively worse, just in a way that will hopefully not make it worse enough to affect demand.

I don't think it's analogous to a developer taking advantage of modern consumer hardware to do the work computers were designed to do, because it's not necessarily a worse product you're delivering. If your internet connection is spotty, it's likely a better experience in some ways. It just seems like people have these brilliant machines capable of executing all of this code (more or less) out of the box, but we're treating them as thin clients because... why? Because some people won't upgrade their OS/browser? Because the method of delivery is the same one that's used to inject banner ads and flashy video ad thingies? If that's the case, then it seems like the whole "server-side rendering" option is getting to sound pretty good for advertising, too. Then where are we?


Let me make the connection a little more explicit:

In {the burger case}, {removing sesame seeds} is an act that makes {the product} objectively worse, just in a way that will hopefully not make it worse enough to affect demand.

And in {this case}, {punting things from server-side to client-side} is an act that makes {power usage, wear and tear, security, battery life, bandwidth use} objectively worse, just in a way that will hopefully not make it worse enough to affect demand.


Ah! Now I see, thank you. I think some of this is debatable, though:

* Security: Yes, there's not too much argument that can be made for unquestioningly allowing arbitrary code execution on your computer. I think part of this will improve with browser enhancements, so long as they go more in the direction of sandboxing than in the direction of "Hey let's give Chrome access to everything in your computer right?!" Doing important stuff server-side is probably safer for users. That said, I'm bad at security. I just don't know enough about it, so I can't put up much of a fight.

* Bandwidth use: I'd actually be really interested to see some research on this. Needing to download a different version of jQuery/AngularJS/Backbone for every website you visit is certainly not particularly efficient, but I wonder how long you need to be on a website before they've sent you more HTML data than you would have had to deal with pulling down the Javascript and just pushing JSON data into it. For the mobile web at least, you win.

* Regarding power usage and wear and tear, I don't consider it a crime to make a computer do computing, which may be our main point of disagreement. If I've got a 2+ GHz processor and a few gigs of RAM, what else am I going to be using it for? Almost all of my apps are web-based, and I'd rather hear my fan rev up a little bit while Firefox renders a big graph instead of waiting for a server to generate a picture of one and just send it to me.


> I wonder how long you need to be on a website before they've sent you more HTML data than you would have had to deal with pulling down the Javascript and just pushing JSON data into it. For the mobile web at least, you win.

I was talking to a young person who sometimes buys her mobile data plan in 20MB prepaid chunks, complaining about how fast Facebook eats through that data.

Now, I don't have Facebook (or a data plan; my ISP has "hotspots" all over town), but I was curious and asked her: how much Facebook does 20MB buy you, anyway?

She said that sometimes a full reload of the page costs 3-4MB. I told her: did you know that's enough data to fit the entire LotR trilogy, or the bible? (Either of them is about 3-5MB as zipped plaintext.)

I agree that the things you say could potentially save a lot of bandwidth, but the reality is, in practice they really really do not, not by far :-)


> I'd rather hear my fan rev up a little bit while Firefox renders a big graph instead of waiting for a server to generate a picture of one and just send it to me.

You are not me. (Not to mention I am not against doing most things client-side. I am specifically against Turing-complete languages being required on the client side. Things like charts are fine.)

I do not see why things should be done multiple times when they can be done once. You're offloading visible costs to places where they are hidden and then calling it gone.


Thanks for humoring me, this has been really helpful.


I hate this story, because it's clearly actually about people failing at designing studies, and everyone glosses over that.


Well yeah, you're right that you don't necessarily want every interactive app to use server rendering. TeMPOraL's reply gets the point across pretty well. What I was trying to convey is that with tools like React, which let you share your view code between the client and server and do the same work in both places without doubling the required development effort, you can reasonably do it.
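
For what it's worth, here's a rough sketch of that shared rendering, circa-2015 style; the Express route, the Article component and the bundle path are all invented for illustration:

    // server.js - renders the same Article component the browser bundle uses
    var express = require('express');
    var React = require('react');
    var ReactDOMServer = require('react-dom/server');
    var Article = require('./article');  // hypothetical shared component

    var app = express();
    app.get('/posts/:id', function (req, res) {
      // Render to plain HTML on the server so the page is readable with JS
      // disabled; the client-side bundle can then take over for interactivity.
      var html = ReactDOMServer.renderToString(
        React.createElement(Article, { id: req.params.id })
      );
      res.send('<!doctype html><html><body><div id="root">' + html +
               '</div><script src="/bundle.js"></script></body></html>');
    });
    app.listen(3000);

The browser bundle mounts the same Article component onto #root, so the view code is written once and runs in both places.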

Whether or not to take this approach is to be determined on a case-by-case basis. Some tools are too interactive and have too tight a feedback loop to make any sense without javascript. A spreadsheet is an example where I'd probably choose to break support for no-javascript, but it's less clear-cut than, say, a CAD application. An example where I'd opt to support clients without javascript is a todo-list application, or a twitter-like.


> I don't see why one computer (or one small group of computers) should be responsible for doing work that could easily be performed in a more distributed way by people's browsers, if only because it's cheaper to offload that work onto the client and it leaves fewer ways to crash the server-side application.

Of course, this has always been the case! That's why we don't stream our webpages as images (or video, if you need to scroll), but instead offload the work of rendering a graphical representation of the symbolic HTML to the client-side browser.

It's a pretty solid idea (apart from the Browser Wars), and one of the ideas that made the WWW viable.

The ridiculous part, however, is where people somehow decide that we should wrap layer upon layer of additional abstraction onto it.

It's not that web servers have grown less powerful and therefore need to offload more work onto their clients. The demands on web servers are higher today, but they've also grown more powerful, and we came up with some pretty clever scaling technologies to deal with those higher demands. But offloading code execution in the form of client-side JS is not really one of those, in practice.

Yes, the idea could technically be used to accomplish that and create less work on the server side, but really, take a look at this page, or any bloated page like the ones we're describing: that's not what's going on here. It's about the same amount of work on the server side; nothing's being offloaded, there's just more work on the client side.

At least the Flash-only-interface websites that plagued the web 10 years ago offered us UI elements that weren't possible in HTML back then (for better or worse), but today's JS bloat truly doesn't add anything that can't be done in a much leaner, less client resource-intensive way.

Even your spreadsheet example. Sure, run the calculations client-side; that only makes sense, no need to throw out JS with the bathwater. But what sort of calculations are we talking about, realistically? For a medium spreadsheet, a few hundred summations and multiplications. Nothing of the kind that should make a 5-year-old computer even blink. But whatever these types of apps are actually doing, they already manage to scroll with jank on an empty document, let alone when you try to use them.


It sounds like you're in favor of sending pdf's around. Or probably more accurately, large bitmaps.


PDFs are about the worst thing out there if you're frustrated about documents with embedded Turing-complete languages.

And large bitmaps are rather bandwidth inefficient, and inaccessible to boot.

I don't mind markup. I mind tech that doesn't include fallbacks, and I mind people using Turing-complete languages for things that can easily be done with a less powerful language. (Turing-complete languages tend to get abused to the point that people want them executed quickly and with little memory use, which means the language implementation gets complex and hence buggy {simple laws of probability}; almost any vulnerability can generally be exploited by a suitable script in a TC language; and TC languages can be used to do unexpected things.)

HTML is decent. HTML+CSS is worse than HTML (for the same reason that COMEFROM is worse than GOTO). Markdown is better than HTML in many ways; or rather, most formal specifications of Markdown-like languages are better than HTML. (Unfortunately, most variants of Markdown aren't specified beyond "do what I do".)

I've been tempted for a while to write a Markdown browser. (Mop? Markdown-Over-tcP?)


So, I'm replying mainly to his assertion that any interactive app can be prerendered on the server given current technology. Given that webapps span the range from dynamic forms to browser video games, I can think of no way of implementing the full breadth of modern web applications other than essentially video-streaming the UI and polling input on the client, which, yes, is bandwidth-intensive.

That's the entire reason why JS and HTML and friends have become so popular. You essentially have full applications developed by people who recognize that the bottleneck for a networked application's responsiveness in 2015 is not CPU cycles and memory but internet latency. It's enough to send a small text file, have the client fill in the gaps, and have the client AJAX for updates, rather than have the server do the heavy work. As you argued elsewhere, I think, this is a waste of processor time - redundantly rendering content - and thus a needless waste of energy (as in real energy, turned from electricity into waste heat). As a tree hugger, I have to admit I find that argument somewhat compelling. I think a balance needs to be struck between user experience and use of resources. It might be that your balance line lies further towards saving resources than mine does.


Again, there is a (massive) distinction between Turing-complete languages and non Turing-complete ones in terms of how often they are exploited (and in terms of how severe said exploits tend to be). I don't mind non Turing-complete languages being run clientside, but do mind Turing-complete languages being run clientside, for mainly that reason. (Also, because Turing-complete languages tend to end up abused in terms of resource use).

And you can do most of a thin client, if not all, while keeping sane bandwidth use. (Note that in almost all cases where bandwidth use would be excessive for a thin client in a web setting, it would also be excessive for a fat client.)

It's mainly that current tech has settled on the brute-force approach of "let's stream every pixel and then try to compress it" as opposed to saner approaches (vector graphics, remote compositing, that sort of thing). But note that there are, for example, remote desktop protocols that work well (for most things) over a dial-up connection!


Umm... no. That sounds absolutely awful in several ways. I just believe the web is about choice, and users shouldn't be expected to use the exact setup that the service providers demand. You can avoid a huge class of vulnerabilities by disabling javascript, not to mention all the privacy benefits of doing so. With javascript disabled, all trackers can collect is a timestamped log of when you made what request — which is sensitive information, but not as bad as also having your cursor movements and scrolling and everything else logged. Disabling javascript can also massively improve performance.

That doesn't mean that websites shouldn't use javascript — I'd not want to use a todo list application that forces me to reload the page on every click — but it's still useful to support that functionality, because some day I might need to access my todos from an environment where I can't use javascript for whatever reason.

People claim that it's too much work to support something so niche, but if you architect your application well, using progressive enhancement, you get support for no-javascript environments, plus a ton of other stuff like server-side prerendering for latency reduction, accessibility, SEO, text-based browser functionality — all virtually for free.
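
As a minimal sketch of that progressive-enhancement idea for the todo-list case mentioned above (the endpoint, markup and IDs are all invented here):

    <!-- Baseline: a plain form POST that works with JS disabled -->
    <form id="new-todo" action="/todos" method="post">
      <input name="title">
      <button type="submit">Add</button>
    </form>

    <script>
      // Enhancement: if JS is available, submit in the background instead of
      // reloading the page. If this script never runs, nothing breaks.
      document.getElementById('new-todo').addEventListener('submit', function (e) {
        e.preventDefault();
        var xhr = new XMLHttpRequest();
        xhr.open('POST', '/todos');
        xhr.send(new FormData(e.target));
      });
    </script>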

Not that this should be applied everywhere dogmatically, it's not worth supporting no-javascript environments or scrapers in your browser-based photo editor.

It would be even better if servers returned raw data in XML format, with a linked XSLT transform to convert it into a structured document, CSS to manage the presentation, and JavaScript to control the interactivity. That idea died long ago, but some modern applications are finally replicating most of the good parts of that system.
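
Roughly, that would have looked like this (document and stylesheet names are invented): the server ships only data, and the browser applies the linked transform itself.

    <?xml version="1.0"?>
    <?xml-stylesheet type="text/xsl" href="/post.xsl"?>
    <post>
      <title>Some article</title>
      <body>Raw data only; /post.xsl turns this into styled HTML on the client.</body>
    </post>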


I think you have something here. I'm only a noob web hacker, but from what I gather and read, I think that separating content from the application has been one of the things web developers strive for. It isn't always easy, however (which, no, is not always a good excuse, but is probably often used). This separation is essentially what HTML, and even PDF, is supposed to be for.

I was more wondering what you meant by prerendering things, because even when one opens an HTML page from the local drive in the browser, the client is doing rendering (typesetting and such). That's an extreme case... so it sounds like you were asking for a return to the pre-AJAX internet, which you say here doesn't fit everything.

So I think we don't really disagree here - or could I have misunderstood you?


On a related but not quite identical note, I've got a personal vendetta against sales pages that think buzzing and zipping jQuery effects are anywhere near being a sensible idea.

<rantyrant> So from a personal standpoint, if I'm on your sales page, I want to know what you're offering, and why I should be interested in what you are offering. I'd be fine with that being a plain HTML site with nothing but a few paragraphs and some bullet-point lists. Maybe a table or two as well, and a link to somewhere I can give you money.

Now, the last thing I need is for that part of the page to zoom in, wiggle around, or do any other bogheaded things a designer with just enough jQuery to be dangerous could think of whilst being "creative" with it.

Now, from a marketing perspective, this is incredibly stupid. A sales page is a machine for turning prospects into customers; if anything has a single design purpose, it's that. If someone came over and tried to paint flowers on the tip of my soldering iron, I would cut them. If someone decided to replace my PSU with a music box because the music box looks a lot better, I would not be pleased.

So why did it ever become a thing to do that to one of the most essential parts of your business? The part you point your advertising money at. The part that can very well be the difference between going the way of the iPod or the way of the Zune. Go nuts on your About page if you really have to keep your designer busy - have your faces zoom in and spin, put the text in marquee tags and whatnot. But please, for the love of god, don't ever do that to your sales page, because you're wasting my time and your money on it.

I blame marketing professionals with an IQ of 50 and designers with delusions of grandeur, CSS3 in the holster, jQuery in their hands, trigger-happy on anything customer-facing, spreading more $ signs through the site than the rap business.

</rantyrant>


When the percentage of users on IE6 dropped below a few percent, we started to ignore those users and came to expect that the web would be somewhat broken for them. Likewise, the web is becoming somewhat broken for users who have JavaScript turned off.

The separation of back-end and front-end, via the exposure of APIs on the backend, has led us to a world where the front-end requires a bit more thought, on how to represent and transform the data to render the content and theme correctly.

Why? Modularisation and de-coupling mean greater maintainability, and the single-responsibility rule gets applied better. This will become more and more the case with the widespread adoption of microservices.

In this specific case, we could argue against the few lines of JS that provide a nice effect upon entering the page, but you can't argue against de-coupling and breaking down a mammoth codebase.


I'm not arguing about decoupling, exposure of APIs, etc. But decoupling doesn't mean you have to push all work onto the client. Have a backend server and a frontend server, the latter consuming APIs of the former and rendering content. You have to write that rendering code anyway, and opting to put it all in the browser is saving yourself a small electricity bill by passing it onto consumers, multiplying it millionfold on the way.
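
A sketch of that split, with invented hostnames and fields: the frontend server consumes the backend API and renders the HTML once per request, instead of shipping the rendering code to every browser.

    // frontend server: talks to the backend API, renders the page itself
    var express = require('express');
    var http = require('http');

    var app = express();
    app.get('/posts/:id', function (req, res) {
      http.get('http://api.internal/posts/' + req.params.id, function (apiRes) {
        var body = '';
        apiRes.on('data', function (chunk) { body += chunk; });
        apiRes.on('end', function () {
          var post = JSON.parse(body);
          // Rendering happens here, once, not in a million browsers.
          res.send('<h1>' + post.title + '</h1><article>' + post.body + '</article>');
        });
      });
    });
    app.listen(8080);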

(Once again, I'm writing about web pages, not web applications (and no, a blog is not a web application).)

Modularization, decoupling, "rule of single-responsibility", etc. are meant to produce simpler, more maintainable code and better, slicker software. But instead, they're taken out of context and abused. Modern web codebases are getting more and more complicated, not less. What people are doing today is not good engineering. It's the architecture astronautics' equivalent of Kerbal Space Program.


I agree entirely with your frustrations about today's web, but don't forget that yesterday's web was filled with ridiculous Flash-only navigation sites.

There seems to be a force that always pushes technology to crud up with bogus features until it reaches the brink of unusability.

Probably caused by the fact that developers tend to develop on above-average machines, and therefore will always target slightly above-average requirements, dragging everything upwards.


Author of the original article here.

I chose this template for my blog because I liked the clean, simple layout. I've been meaning to remove some of the JS functionality and fix some styling stuff for a while now. I didn't realize the fade-in thing rendered the site completely unviewable without JS. I'll be fixing it soon.


Thank you kindly!

And again, the fade-in (although I don't like said effects personally) isn't the problem. The problem is that said fade-in is done via JS with no fallback. (Note that you can generally do fade-ins via CSS, IIRC, or just add a noscript tag that overrides it to full opacity)
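
To make that concrete, here's a minimal sketch of such a fallback, reusing the ".use-motion .post" selector quoted upthread (the animation timing is just a guess, not the theme's actual value):

    /* CSS-only fade-in: the post becomes visible even if no script ever runs */
    .use-motion .post {
      opacity: 0;
      animation: post-fade-in 0.6s ease-in forwards;
    }
    @keyframes post-fade-in {
      to { opacity: 1; }
    }

    <!-- Or, if the JS-driven fade-in must stay, undo the hiding for no-JS visitors -->
    <noscript>
      <style>.use-motion .post { opacity: 1 !important; }</style>
    </noscript>

Either variant keeps the article readable for clients that never execute the script.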

I suspect that there is other functionality that I'm missing without JS (No sidebar? Buttons to the side that do nothing?), but it's kind of hard to tell.


The template you have picked now is harder to read than the previous one. With all the JS complaints, etc., I never had a problem with - and pretty much liked - the previous (NexT) theme.


Harder to read in what respect? The new theme is actually the one I had planned to use all along, I was just too lazy to make the necessary modifications for code highlighting which is why I went with the other one.


I find that a good way to unfuck those sites is to use something like uMatrix or Web Developer to disable CSS (in uMatrix's case it can't disable inline CSS and needs a page reload, but of course it does a ton of other things and this probably isn't as important a use-case). It also works well when text doesn't properly wrap in a narrow browser window, or when broken menus and bars obscure the content, etc.


I find it rather... ironic, shall I say, when CSS actively hurts the readability of a site.

This being one of those cases.


In what way? I find this website perfectly readable, and the font size and line height seem appropriate enough.


I was speaking in the context of with JS disabled.

There's a JS-based fade-in on page load (or at least there was; the author mentioned that it'd be changed at some point) that doesn't bother to have a fallback. And as such there's a CSS rule that sets the post opacity to zero (making it unreadable).

There are many many many websites that don't work without JS that work if you disable both JS and CSS. This being one of them.

That being said, even apart from that there are a number of things I don't like about the website. All personal preference, however. (Namely: 85-85-85 font on 255-255-255 background isn't the best, contrast-wise (I have a laptop and am often outside - readability trumps fashion for me); I don't like the trend of narrower and narrower line widths (It looks bad enough on my laptop, on a 2k screen 700px width would just look ridiculous); Ditto, I don't like the trend of higher and higher line heights - 1.2 is plenty, 1.5 is overkill, this site is 1.7; Continuing on the trend of "why do you put so little on the screen", I find the paragraph separation also overkill (1 line between paragraphs or equivalent is fine, and maybe two between sections).)


> I don't like the trend of narrower and narrower line widths (It looks bad enough on my laptop, on a 2k screen 700px width would just look ridiculous);

The line length (measure) on that page is about 60 ems. This is much longer than the optimal value for readability, which is 30 - 40 ems.

> Ditto, I don't like the trend of higher and higher line heights - 1.2 is plenty, 1.5 is overkill, this site is 1.7;

As the measure increases, leading (line height) must also increase for legibility, because it's harder to track longer lines with the eye when 'returning' to the start of the next line.

A leading of 1.7 ems doesn't seem unreasonable to me for such a long measure.
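
In CSS terms, the trade-off being described is just two values that move together (the numbers here are illustrative, not taken from the site):

    /* narrow measure, modest leading */
    .post { max-width: 35em; line-height: 1.4; }

    /* wide measure needs more leading to stay trackable */
    .post--wide { max-width: 60em; line-height: 1.7; }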

For me, type on the web is a lot better than it used to be: tiny type on ridiculously long lines. There are some holdouts though; Hacker News for example.


Again: I prefer longer lines and shorter line height. I am aware that I am in the minority on this front.

I just wish that HTML had a sane way to allow client-side preferences (Like, say, line height and width) to be expressed.


1.2 is too little, 1.5 is just right.


+1

When I click a link and see no content, most often I close the tab and forget about it. Is it really that hard to display simple content without JS nowadays?


I saw this comment before reading the article completely, and thought it was about being expelled for using JS. I wouldn't be surprised if, to someone who knew only HTML, JS looked like "hacking".


oh come on.

It's 2015, not 1999. If you don't have javascript enabled, do you honestly expect to be able to read half of the internet?


For me, the utility of reading some random blog post that requires JS is far less than the potential consequences of enabling JS for every random blog post, given that most browser exploits require JS.

As such, when the only reason that a site requires JS to be readable is sloppy coding, yes, I call them out on it.

I enable JS only for things that require it for a good reason, and where I am reasonably sure that they won't be exploited. Doing a fancy fade-in on load (which, by the way, can easily be done accessibly) is not a good reason, and I am not reasonably sure that some random blog won't be exploited.


Even in 2015 I expect to be able to read text-centric articles without javascript, on slow internet connections and/or with low-spec hardware. I don't have JS disabled on my laptop, but I find it extremely annoying that a large part of the "first world" internet is practically unusable, purely because of unnecessary javascript shenanigans, unless you own a current-generation high-end smartphone. I'm not referring to this site in particular; it works well after loading.


For privacy and security I do the same. Requiring javascript to just view a blog article is insane.


[flagged]


Please don't be personally abrasive in HN comments.

https://news.ycombinator.com/newsguidelines.html


"modern website"

The text is in plain format in the source file. What does the JS add besides actually showing the text? Nothing.

In 1999 the Web had aligned text, fonts, and hyperlinks. In 2015 we need JS to do this!?


Agreed.

As I said: it's a trivial fix.


> There's something to be said for supporting accessibility, and I completely endorse that, but if you turn javascript off because you want to, don't complain when you can't read or use a modern website.

> The web uses javascript. Get used to it.

There's absolutely no reason to require the execution of potentially malicious code in order to read an article. Yes, web apps require JavaScript (and I don't use very many of them, for precisely that reason), but a web page, that is a resource fetched via HTTP (the hypertext transfer protocol) and encoded in HTML (hypertext markup language), simply doesn't and shouldn't.

The Internet is about communication; it's about the exchange of information.


Don't be myopic: accessibility standards do not require JS.


It's 2015, not 1999. If you don't have javascript enabled, do you honestly expect to be able to read half of the internet?

Yes.


too bad I guess. /shrug


Too bad? The overwhelming majority of pages work without javascript.


> It's 2015, not 1999. If you don't have javascript enabled, do you honestly expect to be able to read half of the internet?

Yes, I expect to be able to read data without having to execute code.

And, FWIW, the Internet was pretty great in 1999. In many ways (but not all, of course) it was better than what we're stuck with now.


>It's 2015, not 1999

That gives you 16 years to learn how to make a simple, efficient, portable web page.


Disabling javascript makes Chromium eat less memory and makes pages load faster. Also, all sorts of annoying popups, effects, ads and social tracking buttons don't work without it, while the page content is usually still readable. That's why I have JS enabled only for trusted websites like YouTube and disabled everywhere else. It just makes browsing the web more comfortable.

It's sad that Chromium doesn't have a way to disable CSS on a per-domain basis (and browser extensions are potential backdoors so I'm not going to install them).


> trusted websites like YouTube

...


?


Completely sidestepping the question of whether the grandparent comment is correct, they are trying to imply that YouTube shouldn't be trusted.


I use NoScript, and allow only the scripts that I need to see what I want on sites. All trackers are blacklisted.


When I alt tabbed to the site there was a slow fade-in on the menu.


Sidestepping the question of whether that actually adds anything to the article - they couldn't be bothered to do the fade-in correctly?


Isn't the effect identically achievable using the CSS `transition` property (animating the opacity)?

Why would anyone use JS in this case?


I've become addicted to the reader mode button in Firefox. It's allowed me to totally ignore all this and keep JS disabled. There are quite a few sites I don't even try to view in Chrome anymore; I just drop them into Firefox and use the reader button.


Whilst I totally agree with you in general about the javascript abuse issue and about websites vs webapps... Check out Firefox reader mode.



