
In Defense of the Modern Web - jdkoeck
https://dev.to/richharris/in-defense-of-the-modern-web-2nia
======
filleduchaos
I was disappointed to find that this post falls into the same pit a number of
these defences of the modern web fall into - creating a false dichotomy
between "literally no JavaScript whatsoever" and "a framework-authored site
with client-side routing". None of the features mentioned in that particular
paragraph - user interaction analytics, preloading data and content [on link
hover], transitions, avoiding full page loads - _need_ a site to be an SPA,
nor are they "fiendishly difficult" to do for the [near-]static sites
MacWright discusses in his blog post.

I think it's a shame that frontend web developers have thrown themselves
behind the SPA paradigm so hard that the notion of a JavaScript ecosystem that
doesn't try to pull _everything_ into its own domain is unimaginable. I'd
love to see more mindshare go towards building libraries that are drop-
in/"just add water" progressive enhancements for static sites.

[edit: minor grammar correction]

~~~
snazz
As discussed on a recent thread
([https://news.ycombinator.com/item?id=23137324](https://news.ycombinator.com/item?id=23137324)),
there are a variety of nice frontend tricks to make a server-rendered site
feel nicer (LiveView, TurboLinks, Livewire, instant.page).

I'm a fan of these kinds of solutions. For my 100% static website,
[https://instant.page](https://instant.page) does make it feel instant.
TurboLinks feels a little "heavier" and has a more complex interaction with
other frontend code, but it works very well.

~~~
basch
Another example: DerbyJS, Racer, and ShareDB have been fighting for this
future for what feels like forever, but meteorjs just torched them from a
popularity perspective in the event-driven server-side rendering realm.
Despite giving a much worse end-user experience, it was more popular with
developers.
Is Lever really the only company using Derby?

[https://github.com/derbyjs/derby](https://github.com/derbyjs/derby)

[https://github.com/derbyjs/racer](https://github.com/derbyjs/racer)

[https://github.com/share/sharedb](https://github.com/share/sharedb)

[https://derbyjs.com/](https://derbyjs.com/)

~~~
enahs-sf
makes me hark back to the good old days of backbone.js. everything you need,
nothing that you don't.

~~~
basch
I think it's kinda great that you can just use ShareDB with whatever else you
use, or also use Racer, or go full-blown framework. The way it's abstracted
gives you the power to choose more or less.

------
MrRadar
> When I tap on a link on Tom's JS-free website, the browser first waits to
> confirm that it was a tap and not a brush/swipe, then makes a request, and
> then we have to wait for the response. With a framework-authored site with
> client-side routing, we can start to do more interesting things. We can make
> informed guesses based on analytics about which things the user is likely to
> interact with and preload the logic and data for them. We can kick off
> requests as soon as the user first touches (or hovers) the link instead of
> waiting for confirmation of a tap — worst case scenario, we've loaded some
> stuff that will be useful later if they do tap on it.

How many sites _actually_ do this? And how much additional bandwidth (and
battery life for mobile devices) does doing this consume?

> We can provide better visual feedback that loading is taking place and a
> transition is about to occur. And we don't need to load the entire contents
> of the page — often, we can make do with a small bit of JSON because we
> already have the JavaScript for the page.

In my experience sites that do this are almost universally slower and less
responsive than sites that just use normal links. Maybe that's just
confirmation bias and I don't notice it as much on sites that do it well, but
a particularly egregious example is Reddit's current mobile site vs. their old
i.reddit.com mobile site, which I use exclusively because it's way, _way_
faster, despite it missing a number of newer features and having a lot of
minor bugs the newer site doesn't.

~~~
Isomorpheus
> Maybe that's just confirmation bias and I just don't notice it as much on
> sites that do it well

This seems to me to be the key phrase.

I sympathize with all the "JS has gone too far" people in this thread. I also
_hate_ bloated JS apps. But, "almost all SPAs are bloated trash" is a distinct
claim from "SPAs are intrinsically trash".

In my experience it is possible to have lean, high-performance, accessible
SPAs without too much developer difficulty. Svelte (created by the author of
this article) is one framework in this space, although I think there are
better approaches.

~~~
rephrase
Do you know of good SPAs? Sites that feel snappy even though they're SPAs.
Most sites I visit that are SPAs are consistently slow so I've learned to make
the same association: SPA = bad.

~~~
evilduck
We've been experiencing sluggish pages since long before JS really became a
thing, too. PHP and a slow inline query to render results have been a possible
problem for literally decades.

Do you investigate all the fast snappy sites you visit to confirm they aren't
single page apps? What about the slow pages that aren't SPAs (most every news
site)?

It seems like anyone who wants to dislike JS and/or SPAs can easily
confirmation-bias their way into believing it.

~~~
rephrase
That's why I asked for examples. I want to know what the good examples are.
Someone mentioned the main React site. Seems like a good place to start.

------
grey-area
This author seems to think the only competitor to react is other js
frameworks, when in fact most of the web's largest sites work just fine with
small amounts of what he calls 'artisanal js'.

You probably don't need your js framework, your 10k npm dependencies, or your
complex mix of server and client side rendering. Not every website needs an
api and clients to consume it, and not everything needs to be an app.

The web is a success because it is simple, accessible, fast, and nobody cares
how the server makes the html they are reading. Let's keep it that way.

~~~
rich_harris
> your 10k npm dependencies

Believe me, you're preaching to the choir. Nor am I advocating for every
website to have an API. But unfortunately we part ways here:

> The web is a success

The web is _not_ a success. It's dying. Consumers vastly prefer native apps —
one recent study ([https://www.mobiloud.com/blog/mobile-apps-vs-the-mobile-
web/](https://www.mobiloud.com/blog/mobile-apps-vs-the-mobile-web/)) tells us
that 90% of mobile time is spent in apps vs 10% in browsers.

~~~
amw-zero
Well, I think this is really hyperbolic. I think it depends on the
application, and the consumer. And the web is a tremendous success. You should
take a step back and look at how your entire life is affected by the web. It
seems really, really off the mark to say that the web is not a success.

You can criticize it, sure. But - you are using a computer 100% because of the
web, not because of spreadsheet software.

Now, back to web vs. native. You just have people who prefer either. I
exclusively use apps. I think the experience is better in every way, and I've
never seen a web application deliver comparable experience to a native app.
But, almost every native app is backed by a web server. And, some things I do
want to remain web apps, like Wikipedia.

I think the web community overall looks at native apps like they are some
weird, alien technology because they're completely blinded by what they build
and what tools they use. Like always, if we understood why users enjoy native
apps, we could learn a lot of things. I'd argue that we are learning that,
which is why there are lots of SPAs. That doesn't mean that's the only way to
build software, it just means that people do actually enjoy that experience.

~~~
Guidii
"I exclusively use apps."

Curious. Did you post this using a native app?

~~~
ric2b
Not the person you're responding to but there are plenty of HN native app
clients, if that's what you're getting at.

On my mobile I never access HN via the website, I use one of those apps.

------
timw4mail
I think one nuance is lost on both articles: web development is generally
stuck in a system where all the incentives are wrong.

\- The developers want to prove themselves with a new technology

\- The site owner wants to make money, so ads are required

\- The website needs to "verify identity" to cut down on fraud, so invasive
trackers are added

\- The finance team wants to cut down on infrastructure costs, so they want
less done on the servers

\- The user wants to get done what they came for, but may be frustrated by
how the site slows down the browser

This tug-of-war is a lot of what leads to slow, resource-intensive, privacy-
invading websites.

~~~
duxup
I think that first item is true about ... a lot of careers. Resume driven
development isn't an exclusively web thing.

~~~
karatestomp
I think it's more prominent and damaging there on account of all the churn,
combined with large amounts of stupid money.

Mobile native developer? Might be something new you want to get familiar with
every couple years, but really you'll likely be OK, career-wise, if you sleep
on it for another year or two unless you actually need it. C++ developers
aren't chasing brand-new frameworks to stay hirable, generally. Java and C#,
most of the ads seem to be "have experience with [eight-year-old 'new' thing
that's basically an ecosystem-wide standard now anyway], plus a bunch of much
older things", and those two alone are a _really_ high percentage of all
software jobs.

------
karatestomp
I just know the web is very hard for my kids to use, and the more app-like the
page the harder it is. Watch them use a page for a minute and they’ll remind
you of two or three janky, shitty, or counter-intuitive things about web
interfaces that you’ve become blind to, through familiarity.

Native’s much more likely to be ok.

Chief problems are: web interface design tends to be much worse than native
(even if they look nice on a screenshot), UI latency is bad and gets worse the
more JS you throw at it, there’s more likely to be unhelpful functionality the
developer’s accidentally created (selecting the text on a “button” rather than
clicking it), and they’re more likely to have weird state-related fragility.

------
gambler
_> With a framework-authored site with client-side routing, we can start to do
more interesting things. We can make informed guesses based on analytics about
which things the user is likely to interact with and preload the logic and
data for them. We can kick off requests as soon as the user first touches (or
hovers) the link instead of waiting for confirmation of a tap — worst case
scenario, we've loaded some stuff that will be useful later if they do tap on
it. We can provide better visual feedback that loading is taking place and a
transition is about to occur. And we don't need to load the entire contents of
the page — often, we can make do with a small bit of JSON because we already
have the JavaScript for the page. This stuff gets fiendishly difficult to do
by hand._

No, it's not. It's easy, and you can do all that without an SPA framework. If
you can't see this in about 5 seconds, you shouldn't be lecturing others about
how to design websites.

Preloading hovered links can be done with a standalone library. AJAX can be
done with another standalone library. Removing extra crap from your page to
facilitate faster AJAX can be done with a custom request header and a couple
if-else statements on the server side. Best of all, this stuff can be added
_after_ you have a working website.
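
The "custom request header and a couple of if-else statements" part might look
something like this on the server side (a minimal sketch; the `X-Partial`
header name and the render functions are purely illustrative, not from any
real framework):

```javascript
// Full-page shell around a content fragment.
function renderFull(content) {
  return `<html><body><header>site chrome</header><main>${content}</main></body></html>`;
}

// If the client's AJAX code set X-Partial, skip the shell and send
// just the fragment; normal navigations get the whole page.
function render(headers, content) {
  if (headers['x-partial'] === '1') {
    return content;
  }
  return renderFull(content);
}
```

The client-side counterpart sets the header on its fetch requests and swaps
the fragment into the existing page, which is the whole trick.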

~~~
Izkata
> Preloading hovered links can be done with a standalone library.

Couldn't it even be done in two lines with vanilla javascript? On hover
fetch() the URL, then rely on the browser cache for when the user actually
clicks?

~~~
Spivak
And isn't this what browsers do already with prefetch?

------
quantummkv
> When I tap on a link on Tom's JS-free website, the browser first waits to
> confirm that it was a tap and not a brush/swipe, then makes a request, and
> then we have to wait for the response. With a framework-authored site with
> client-side routing, we can start to do more interesting things. We can make
> informed guesses based on analytics about which things the user is likely to
> interact with and preload the logic and data for them. We can kick off
> requests as soon as the user first touches (or hovers) the link instead of
> waiting for confirmation of a tap — worst case scenario, we've loaded some
> stuff that will be useful later if they do tap on it.

What? The solution to making a website fast is to crunch analytics
(presumably on the client and in JS, all of which has to be loaded beforehand
for this to work), then try to predict what the user is going to do on the
website, anticipate it, and start loading, even when it might go to waste?

Wastage of network bandwidth aside, speculative execution destroyed chip
performance in the last two years and opened up a whole lot of scary security
holes, some of which may never be fully patched without performance hits. And
the bright idea is to do that in the browser?

~~~
the_gastropod
Yes, this. I about fell out of my chair reading this paragraph.

In these discussions, I find defenders of these SPA designs almost always
point to vague claims about "rich UI" or "highly interactive", or make up
complete hypotheticals like this example of pre-loading _likely-to-be-clicked_
links. Show me the code. Show me the crazy-fast link-click-predicting website
that does this and loads faster than, say, HackerNews.

It's good to see the pendulum swinging back to some semblance of sanity on
this topic. The past ~5-10 years of the SPA being the default choice has been
a wild time. I couldn't be happier that it's starting to receive some
widespread criticism.

~~~
rich_harris
> Show me the crazy-fast link-click-predicting website that does this and
> loads faster than, say, HackerNews.

[https://hn.svelte.dev/](https://hn.svelte.dev/) is a Sapper implementation of
Hacker News. It uses the preload-on-hover/touch technique described in the
article. In my experience it does indeed feel faster than Hacker News, despite
basically being something I threw together one weekend.

~~~
the_gastropod
Credit where credit is due: this is excellent work! Very impressed.

------
echelon
> The fact that we can do server-side rendering and communicate with databases
> and what-have-you using a language native to the web is a wonderful
> development.

The author isn't just a React apologist, but also seems to want JavaScript
everywhere.

So much of this debate - server-side rendering, etc. - has to do with the
inefficiency of the DOM. Mobile and desktop UI toolkits aren't DOM for a
reason. They do application UI and have application concerns; meanwhile JS is
sitting in this weird Frankenstein duct-taped documents-and-apps markup world.
An ecosystem that has to support a million (often competing) things, but all
slowly, and with decades of legacy we can't break.

It's going to be an interesting change of landscape when we have isomorphic
Rust/Wasm delivering truly native web apps that don't render on top of 30
years of DOM kludge.

The Rust community is just getting started and hasn't had a long time at this,
but there's already some really impressive UI work being done in the browser
with Rust/Wasm [1]. Once Rust frontend frameworks begin to arrive, it'll be a
whole new ballgame. It's so damned fast.

[1] [http://makepad.nl/](http://makepad.nl/)

~~~
timw4mail
I don't think the DOM is the bottleneck as much as JavaScript is.

~~~
realharo
Based on what?

~~~
timw4mail
JavaScript downloading and parsing is much more computationally expensive
than HTML parsing.

The DOM seems slow, I think, mostly because it requires translating between
rendering systems: the representation of the HTML has to be updated, the
styling has to be applied, and the change has to be painted on screen. This
explains why updating a documentFragment as a batch operation is much faster
than changing the DOM directly in a loop.
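
The documentFragment batching described above can be sketched like so (the
`doc` parameter is only there so the helper can be exercised outside a
browser; in real code you'd just pass the global `document`):

```javascript
// Build nodes into a DocumentFragment off-document, then insert them
// with one appendChild, so the browser recalculates style and layout
// once rather than once per item.
function appendItems(doc, list, texts) {
  const frag = doc.createDocumentFragment();
  for (const text of texts) {
    const li = doc.createElement('li');
    li.textContent = text;
    frag.appendChild(li); // cheap: the fragment isn't in the live DOM
  }
  list.appendChild(frag); // one insertion, one reflow
}
```

In the naive version, each `list.appendChild(li)` inside the loop touches the
live DOM and can trigger its own style/layout pass.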

------
fleddr
Another angle I'd like to add to this discussion is something that gets near
zero attention: longevity.

Many SPA developers rapidly move from project to project. Every once in a
while, they learn yet another JS framework, and then hop between jobs to use
that.

They never look back. They never truly need to take responsibility for a tech
choice.

Try to hire an Angular 1 developer to work on 300K lines of code today. You
can't. No skilled front-end developer would take on such a job with options to
work on hotter tech.

Any SPA choice you make today will be completely obsolete in 3-5 years. 3-5
years, or even 10 years or more, is a normal life cycle for a serious web
property. And not just obsolete, you'd have the hardest time getting the tool
chain to still work at all.

The original developers don't care; they're using tech N+5 now, somewhere
else. They'll be very vocal about how N+5 is vastly better than anything else,
and really the only option for "the modern web".

~~~
eximius
This is just the COBOL problem in newer clothes. Raise the pay and someone
will decide it's worth their time.

~~~
fleddr
Sure, but timelines are different.

Angular becomes a front-end COBOL in just a few years, not 3 or 4 decades.

There's more. A typical front-end architecture depends on a complex interwoven
tool chain with a lifespan of at best years, sometimes even months.

Same for modules. It's normalized to have an absurd dependency tree of
hundreds to thousands of inter-depending modules. The vast majority of which
will be abandoned in a few years.

------
simonw
Genuine question: are smooth page transitions really worth all of this?

It seems that page transitions really are the main reason given for using
JavaScript-everything for sites and simple applications that otherwise could
have been built with some fast loading HTML and a couple of POST forms.

~~~
fleddr
No they are not. Well, first of all, most SPAs don't actually deliver on this
promise anyway.

But let's look into the point made by the author regarding the inspiring
example of Apple's app store UX, and trying to aim for that.

We, meaning as an industry average, will not achieve that level of UX no
matter the stack. Because nobody can afford it. It requires a team of top
designers, interaction engineers, dev, QA, etc.

That's the problem with the Silicon Valley mindset. They assume every web
property has this huge team of top engineers behind it, that this is the
status quo, and that it applies to the whole web.

They have no idea what bubble they are in.

------
Wowfunhappy
> Web developers are currently trapped in a mindset of discrete pages with
> jarring transitions — click a link, see the entire page get replaced whether
> through client-side routing or a page reload — while native app developers
> are thinking on another level.

In native apps, I'm a sucker for pretty transition animations, and I'd love to
see those on the web.

But, every website I've ever seen with those types of fancy transitions for
large pieces of content (as opposed to little on-page microinteractions) also
feels slow, clunky, and heavy. As such, I've come to associate such
transitions with slow, clunky and heavy web pages.

I'm not really a web developer, so feel free to tell me if everything that
follows is stupid: It seems unlikely that our "current trajectory" can ever
resolve this problem. Instead, we'd need an underlying technological shift in
_how browsers work_.

Most large transitions are basically browser hacks—albeit ones we've used so
frequently, and for such a long time, that they've come to be officially
supported and accepted. To make them more usable, they'd probably need to work
_alongside_ the browser in some manner.

~~~
ric2b
That's probably because those animations are done with JS on the main thread,
so the whole thing becomes unresponsive while they run.

------
factorialboy
The criticism isn't directed towards technology X or platform Y. The target is
over-engineering.

And JavaScript has laid claim to that throne which was once primarily
contested by enterprise Java and .Net.

~~~
Frost1x
I think it's a combination of both over-engineering and poor engineering
leading to unnecessary and unmanageable complexity.

Some project starts and they decide to use an unnecessarily complex framework
(which may be well engineered for the cases it was developed for, at the
Facebook/Google scales of the world), then grab 3 more and glue them together
using poor practices, then proceed to build all sorts of complexity on top of
that stack using poor engineering and typically non-standard practices.

New people join and voila, they immediately need to understand these 3-4
underlying complex yet well-engineered systems and how they are poorly glued
together before they can even begin to unravel the poor engineering practices
that sit on top of them so they can fix and extend them.

Parts of the application look well structured, other parts are incredibly
confusing. Suddenly you need to go back and better understand some of the
underlying frameworks leveraged so you can actually discern what's supposed to
be going on which takes forever because that underlying framework is quite
complex in and of itself.

In the end you're running around documentation page to documentation page from
frameworks and looking at undocumented code someone wrote in between,
consuming stupid amounts of time.

Someone built a big pile of technical debt and you just inherited it. Good
luck, they've switched to another team, project, or employer and moved on
before the mess unfolded.

Meanwhile, you're wondering why 90% of this exists for a product feedback
submission form.

------
zeveb
All I really want from the web is to get text, images and data and submit form
data. I don't really _want_ interactive applications which are almost but not
quite entirely unlike proper native apps. I just want to read interesting
people and submit my own thoughts in return. I want to see pictures & text
about things I want to buy, and submit my address & payment information to buy
them.

The modern web delivers some of that, but only accidentally. The modern web is
all about delivering potential malware to me and expecting me to run megabytes
and gigabytes of this code, all in order to … display pictures and text,
something the browser has been capable of for over twenty years.

~~~
hombre_fatal
You have control over your browser. Just disable Javascript. Some websites
don't work? So what?

Now compare that to the rest of software, where you don't have that control,
and you'll see software has always been in the state your hyperbole describes:
pushing malware to your machine that can always be making HTTP requests you
can't see.

It's a bit sad that even technical people are only waking up to this when it
comes to the web because they can finally see it with the browser's network
tab, so they think it's just a web problem and begin advocating for throwing
the baby out.

------
fenwick67
Every time a web designer complains that a full-page refresh is "jarring", a
bit of me dies.

This is like saying a lightbulb turning on quickly is jarring. It's the
expected, immediate behavior. If every lightbulb I used turned on with
different durations and easing functions or flashed colors at me, or loaded a
flashing placeholder while the lightbulb loaded, that would be jarring.

~~~
amw-zero
Why do you assume that it's just web designers? Have you ever shown a web app
vs. a native app to someone who doesn't write code? Or just someone who has
design sensibility?

The lightbulb example is also not a good one. A lightbulb transitions between
two states. An interactive application transitions between hundreds of states.
Full page refreshes make each transition clunky and noticeable. Only
programmers look at it and say "yea, that's fine," because we know how
difficult a web browser is to implement. But end users don't care about
difficulty. They care about the end result.

~~~
fenwick67
I don't really have a problem with web apps doing incremental loading when it
makes sense (Facebook, for example, or letting the user add a comment in
situ), but many many sites would be better off just loading documents instead
of moving to an application framework. Take for instance TechCrunch.com vs
Wired.com, Wired is much easier to use because it acts like a website. Yes, it
does full page refreshes instead of showing a full-page loading spinner, but
it behaves more consistently and doesn't require the user to learn new UI
paradigms.

~~~
amw-zero
What about Google Sheets? It would be pretty strange to input a number into a
cell and have the page refresh to perform a calculation no? Frankly I'm not
sure how any application falls into the document paradigm. It was created for
linking documents of text, not interactive applications.

------
anderspitman
I don't have a problem with the "modern web". I love all the new technologies.
But I think we should re-split the document web from the application web. A
huge percentage of web content can be implemented with a subset of HTML and
CSS.

A browser that implemented a few semantic tags (including <img> and <video>),
fonts, colors, and flexbox would be useful, secure, fast (even on slow
devices/connections), and far easier to implement than the behemoths we have
now.

Let the application web keep evolving, but specify a well-defined subset for
sites that don't need all the features.

------
buboard
I needed to read this after fighting with Upwork (an apparently Angular
website, which i found out via necessity, trying to figure out if my payment
was being processed, double-billed, or not processing at all). Here is
another prime example of a victim of the modern web: a website that needs none of
it, yet had to be converted to one of the 'constantly spinning wheels on 20MB
pages' framework. Thanks, modern web for slowly destroying every service that
i use. At this rate, by next year the web will grind to a complete halt.

------
fleddr
Like most deep in the SPA bubble, this author cannot imagine any other outcome
than SPA-everything. Meanwhile, anyone with an alternative view is dismissed
as an anti-JS crusader (which misses the point entirely), as simply being old,
or even barbaric, whilst he continues to high-five his bubble inhabitants on
Twitter.

SPA as the status quo is a myth. It's a Fermi paradox. Where are they? Almost
nothing of any importance is an SPA. You'd think that if they have such
superior experience, they lead to superior business metrics, thus they are
everywhere. Aside from true app-like experiences like Gmail, Spotify, and the
like, they're nowhere to be found. Not in places of significance. It's 2020
and Amazon continues to be old school page refreshes. Last time I checked
they're doing quite OK. Even big parts of Facebook, the poster child of
complex interactivity, continues to do full page refreshes in big parts of the
experience.

Instead, SPAs are a startup default, not a web default. Big difference. And
the way most are implemented, they don't live up to their potential benefits,
instead deliver UX that is even worse than static:

\- breaking back buttons

\- breaking scrollbars

\- rendering things that aren't interactive

These things don't have to occur, they can be solved in an SPA, but they do
occur a lot in the wild. This matters. We live in the real world.

As somebody else here already mentioned, the article takes a completely wrong
approach. It assumes some binary choice or outcome. The original point instead
was that the web is not a single thing. It's a huge thing where different
stacks are to be used for different use cases. This point is largely ignored;
instead it is implied that SPAs should simply solve such use cases better,
whilst continuing to be a default.

On the particular point of SPA links typically being faster/smarter: there
are about 500 drop-in solutions to speed up static links. They're usually not
needed, because the claim that such links are bad is overstated and lacks any
user evidence. See my earlier remark on virtually all major websites using
plain old links. They wouldn't if it was so bad.

Finally, a huge part lacking in this discussion is the accessibility of web
development itself. The SPA web is an engineer's web: it requires advanced
programming skills, infrastructure, increasingly complex tool chains, and so
on. If this was to be a true default (luckily it isn't), it would lock out the
vast majority of people wanting to contribute to the web.

It's sad to see this subculture of hardcore engineers with Macbook Pros
thinking they own or represent the web or web development.

~~~
robertoandred
The idea that some bad implementations of an idea serve as an indictment of
the idea as a whole is utterly ridiculous.

Some images on the web are poorly optimized, but that doesn't mean we should
not have images.

~~~
fleddr
No, it's not ridiculous, it's sane.

The whole point of a SPA is to give a user a _better_ experience. If on
average this does not materialize in the real world, and often achieves the
exact opposite, the entire concept fails.

Bad SPA implementations _are_ the standard. Saying that it's possible to make
a good SPA implementation doesn't change the real world outcome that most
often, they don't deliver.

------
cptskippy
I'm having a hard time comprehending this mindset.

According to this article we should accept:

\- Heavy sites that take a long time to load initially.

\- Are sluggish and unresponsive when not on this year's flagship phone.

\- Will probably break if the user leaves a tab open for too long.

Because:

\- There's an opportunity to shave a fraction of a second off your page load
times, if you want to utilize predictive analytics or extensive preloading to
ensure the content is already in the cache.

The modern web is just layers upon layers of hacks and it's ugly and slow.

I would recommend to anyone who disagrees to install the NoScript Add-on and
run with it for 6 months. I did so and here's some things I've observed...

When I enabled NoScript:

\- Most sites do not work without some JavaScript that's been used to address
page reflow. They have a global display:none rule that's removed after the
entire page has loaded. All this discussion about page load times or
responsiveness is BS when sites hide their content from you until it
completely loads, just to avoid reflow.

\- Most sites rely heavily on Javascript to do basic formatting that's trivial
in CSS.

\- Most sites use Webfonts loaded via Javascript

When I disabled NoScript:

\- My Phone went from 60% battery at the end of the day to <20%.

\- The Web is slow again.

\- User hostile behavior like pop overs, auto-play video, and other garbage.
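
The anti-flicker pattern mentioned in the first bullet (a global hide rule
removed after load) boils down to something like this hypothetical sketch,
which is why such sites show nothing at all with JS disabled:

```javascript
// The stylesheet ships something like `html.cloak { visibility: hidden }`
// and the document starts with <html class="cloak">; this script reveals
// the page only once everything has finished loading.
function reveal(doc) {
  doc.documentElement.classList.remove('cloak');
}

if (typeof window !== 'undefined') {
  window.addEventListener('load', () => reveal(document));
}
```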

~~~
rich_harris
> According to this article we should accept

Well, I wrote the article, so I can say with some authority that you've badly
misinterpreted it! Somehow I manage to build sites with exceptional
performance that are snappier than the equivalent JS-free sites, using the
techniques I talk about (and [https://svelte.dev](https://svelte.dev)).

Yes, there are a lot of bad sites out there that abuse JavaScript. No, that
doesn't invalidate the article's thesis; it supports it.

~~~
cptskippy
> Well, I wrote the article, so I can say with some authority that you've
> badly misinterpreted it!

Perhaps, but the point of Tom MacWright's article is that we're
overcomplicating simple things (e.g. blogs) with complicated and heavy
libraries and frameworks. Then when we encounter performance issues we attempt
to mitigate them by layering on additional crap or spending significant time
optimizing the crap out of needless complexity.

Your suggestion that full page loads are unnecessarily slow and cumbersome and
that we can use predictive analytics, preloading and a slew of other
techniques to get a faster response to a page load is the equivalent of
building a Rube Goldberg machine to turn the page of a book "that much
faster". Sure, your techniques address one metric (i.e. response time), but
they introduce all manner of other problems that then need to be solved. For
what?
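For reference, the hover-preloading being debated boils down to something like the following sketch (not any particular library's actual code; the selector and the dedupe guard are illustrative):

```javascript
// Track which URLs have already been queued so hovering twice is a no-op.
function shouldPrefetch(url, seen) {
  if (!url || seen.has(url)) return false;
  seen.add(url);
  return true;
}

// Browser wiring (guarded so the sketch also runs under Node): on hovering
// a link, inject <link rel="prefetch"> so a later click is served from cache.
if (typeof document !== 'undefined') {
  const seen = new Set();
  document.addEventListener('mouseover', (event) => {
    const link = event.target.closest && event.target.closest('a[href]');
    if (link && shouldPrefetch(link.href, seen)) {
      const hint = document.createElement('link');
      hint.rel = 'prefetch';
      hint.href = link.href;
      document.head.appendChild(hint);
    }
  });
}
```

Whether those ~20 lines are a sensible enhancement or the first gear of a Rube Goldberg machine is exactly the disagreement in this thread.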

> Somehow I manage to build sites with exceptional performance that are
> snappier than the equivalent JS-free sites.

Somehow you manage to demonstrate that you completely misunderstood what Tom
MacWright was saying. Back to my page turning analogy, I have no doubt someone
could build a machine to turn the pages of a book "better" than I can. Does
that mean everyone should have this miraculous page turning machine?

At the end of the day my local pizzeria doesn't need a multi-megabyte SPA
application utilizing client side ML and predictive analytics for me to get
the phone number so I can call in an order, but that's the modern web.

------
thejynxed
From visiting tons of different sites that have decided to "modernize", I can
assure you there is no defense for how god-awful the "modern web" actually is.

------
combatentropy
I wonder if a lot of the disagreement is just people talking past each other.

I used to think it was an oversimplification to divide the web into Documents
and Applications. I thought it was more accurate to say it was a continuum,
and using two different techniques was a sign that we still had not figured it
out. After all, a blog post about programming might have an interactive
example. An article about retirement might embed a calculator. If you try to
categorize a site or even just a page as Document or Application, it gets
fuzzy. But if you look at a page and just try to categorize it section by
section, as Document or Application, it's easier. (Hence, Web Components make
more sense to me.)

So anyway, most people spend some time surfing the web --- by that I mean
visiting Documents. Clicking links from search results, to articles, to more
articles. If you aren't using an adblocker, I can't imagine how you don't end
up throwing your laptop against the wall. I ended up just turning off
JavaScript for most sites, and the web looks more like how I think it should
look. Every once in a while these people will post a rant, and it will
resonate with most people.

But then an Application Developer will read it after a long day and say, hey,
wait are you saying I should write my Application (like GMail) in jQuery? To
be honest, Applications don't bother me the way overscripted Documents do. I
almost expect them to be kind of slow. After all, even some Desktop
Applications still have loading screens. So I don't mind so much if you're
using React or whatever for your Application or even for that calculator
embedded in the article about savings.

But for Documents --- like articles, blog posts, company home pages --- well,
one thing I agree with SPA developers on is that there could be less
server-side rendering. These kinds of sites could often be compiled ahead of
time into
static HTML. Server-side rendering is better than SPA for these kinds of
sites, but static sites are better than server side. I mean, who enjoys
getting "Error connecting to MySQL database"? (Okay, that's rare.) I'm
intrigued by the old-new JAM Stack :)
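Compiling a Document ahead of time can be as simple as this build-time sketch (the post shape and function name are made up for illustration):

```javascript
// Build-time rendering: turn a post object into a complete static HTML page.
// No server process and no client-side JS -- a generator runs this once per
// post and writes the result to disk, so there's no database to fail at
// request time.
function renderPost(post) {
  return [
    '<!doctype html>',
    `<html><head><title>${post.title}</title></head>`,
    `<body><article><h1>${post.title}</h1>`,
    post.paragraphs.map((p) => `<p>${p}</p>`).join(''),
    '</article></body></html>',
  ].join('');
}
```

A real generator would escape HTML and walk a content directory, but the point stands: the "server-side rendering" happens once, at build time.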

------
csande17
> I'm not aware of any other platform where you're expected to write the logic
> for your initial render using a different set of technologies than the logic
> for subsequent interactions. The very idea sounds daft.

It's not a perfect parallel, but this reminds me of systems like Qt Designer,
or Apple's Interface Builder/Storyboard, or Android's XML layout system. Those
systems all essentially give you an "initial render" made with one technology
(XML files created in some sort of visual GUI-builder program) and "subsequent
interactions" made with another (C++/Swift/Java code).

------
ecmascript
Interesting take and I kind of agree.

My take is, however, that the web as we know it is going to be split in apps
and websites.

Sites are stuff that works fine with the 'old classy' request->response model:
blogs, news sites, wikis, information sites, simple booking sites, etc. You
can make a great living just doing that. Probably most content available on
the web will fit in this category. I think Wordpress and similar
frameworks/libraries will continue to dominate this field.

Apps are stuff where you practically need a SPA, where there is no choice.
Things like music players, photo editors, file managers, chat applications
etc.

Old mom n pop shops will still be around and never need a SPA, but a lot of
applications being ported to the web do need it. If you do old style websites,
there is nothing wrong with that and it still works fine for a lot of stuff.
But if offline capability is a requirement (as an example), it is very hard to
make a compelling case without making a SPA.

I like both. I find old-style apps a lot quicker to develop, but the SPA/PWA
style gives the user a much more native feel, which is important for a lot of
apps.

~~~
tenaciousDaniel
I'd really love to see an entirely new rendering context for the web. Right
now you have to go WASM > JS > DOM. What if there were something else, like an
AOM (application object model) that WASM could access directly? It would allow
the DOM to go back to what it was originally intended for - interactive
documents.

~~~
giantrobot
`<canvas>`

~~~
tenaciousDaniel
eh, not really. It's true that Canvas is its own rendering context, but it's
just a drawing API. There is a whole suite of technology needed for
applications that is currently only available to html/js. Things like
accessibility, text input, etc. What I'm imagining is a rendering context
that, like the html document, provides those things out of the box.

------
luord
Interesting that he agrees with the point of the original article I disagree
with the most, the one on APIs. Maybe I'm wrong...

Speaking of which...

> I'm not aware of any other platform where you're expected to write the logic
> for your initial render using a different set of technologies than the logic
> for subsequent interactions.

I'm not aware of the backend, i.e. the server side, of many mobile
applications being written in Kotlin (or Java) or Swift.

In fact, it's common for the server and the client to be written in different
technologies.

Yes, I know that's not exactly what he means and in a native application, the
application is in charge of the initial render. But he's criticizing APIs and
I don't see how this statement can be possible without APIs unless he expects
the backend to be in the same technology. Which, again, it usually isn't.

------
cocktailpeanuts
What we have today is a POST modernism web. Nothing about it is modern.

------
revskill
I think the SPA is just one result of libraries/frameworks like React, Vue,
etc. Those things were created not to produce SPAs (we could do that with
vanilla JS) but for modularization and componentization.

Or we could say they are different implementations of the same thing; the
difference is performance.

------
theandrewbailey
I hope that someday, more devs/designers will realize that adding more
Javascript to change the web's document centric nature doesn't work, and just
makes things worse. If you want an app, make an app, not a website.

------
coding123
SPAs are a good thing for Apps. Back-office, etc... all great things here.

SPA tech is a good thing for web sites that have little appy parts (like
checkout processes, or maps). I'm not saying for the entire website, but just
little focused parts. Can you imagine going back to (the old) Yahoo Maps now
that Google Maps exists?

Are SPAs good for entire websites (like the new reddit?) Personally I like the
new reddit over old reddit. But this is where your users are highly relevant.
If you don't listen and choose a SPA for millions of public users, you will
probably get feedback. You might be lucky and only 10% will hate it, but it
might explode in your face if that's a key 10% that encompasses influencers.

------
adamsea
It’s easier to hire javascript developers, no? Even if it’s not the most
efficient.

~~~
adamsea
For more context, what I was thinking was:

Being assured that you can find/hire folks to work on your tech stack is a
pretty important concern for a business. Thus, it can make sense to go with a
popular framework/language because it helps in that regard.

