
Stop Breaking the Web - bevacqua
http://ponyfoo.com/articles/stop-breaking-the-web
======
ne0phyte
Considering that the average page size (first load) is getting bigger and
bigger, there is certainly some truth to it. Pages are getting slower and more
complex, and I see more and more browser tabs taking 80 MB+ for (seemingly)
simple sites.

I also agree that the web site/web app distinction matters, but: there are
blogs and sites that load, show a blank page, then fade in some
animation/spinner while loading fonts, styles, and more scripts, and then
slowly fade in the content. However, if something goes wrong (a font doesn't
load, JS is blocked, etc.) you see a blank page with a spinner forever. HOW is
that progress? That's just awful.

I recently decided to start a little blog[1] using the static site generator
Hugo[2], and I don't feel like I'm missing out on anything. Well, you won't be
able to load the Disqus comments and there's no code highlighting without JS,
but that's it.

[1] [http://code-and.coffee](http://code-and.coffee)

[2] [http://www.gohugo.io](http://www.gohugo.io)

~~~
parris
Consider that when most people get sites that don't meet their needs, their
gut reaction is to open 20 tabs and mentally slice data from all over the
website. That doesn't seem very progressive either.

~~~
ne0phyte
I am not saying there is no need for more complex web sites. Personally I just
think it is sad that in 2015 there are lots of sites that won't load anything
at all if you disable JS or have a bad connection.

With a 50k connection at home, I simply can't accept waiting 3+ seconds while
your site loads its 500 KB of JS and four different web fonts before finally
showing something.

~~~
parris
Sure! Parts of that are indeed ridiculous.

Although I'm pretty sure the download size from a CDN is far less important
from a performance standpoint (within reason). More important things include:

- Time to first byte from a slow-ish server (0.5s to create the HTML and 0.5s
to deliver it)

- How many separate requests you are making (somewhat going away with HTTP/2)

Can we start ranting about the excessive amount of tracking pixels everywhere
now? :)

------
parris
This is nonsense. Every few minutes you hear someone complain about the lack
of progressive enhancement. The case for progressive enhancement is only valid
when you have "content" that is easily parseable by human eyes.

In a world where you have too much data and users want to see slices of it,
progressive enhancement fails to deliver a fast, pleasant, engaging user
experience. In the case where you have complex tools that help a user meet
some end goal more quickly, progressive enhancement falls short of providing
easy-to-use tools. The only case where this works in any way, shape, or form
is when a site is submittable via simple forms and all data can be retrieved
by visiting URLs. Any slightly more complicated use case fails to be delivered
at any reasonable pace with any sort of reasonable performance -- you know,
the kind of performance that wouldn't make your server fall over.

This whole progressive enhancement thing is mired in decade-old dogma. While
progressive enhancement can work sometimes, it is NOT the only tool. We
shouldn't wholesale prescribe solutions without knowing someone's problem.

~~~
JoshTriplett
Progressive enhancement does make sense in some cases, such as using a feature
that not all browsers support yet but that the site can function without.

And progressive enhancement makes _perfect_ sense for things like CSS; if your
site content makes no sense with CSS turned off, it probably makes no sense
with a screen reader. So, for instance, remember to put content in a sensible
order _in the HTML_ rather than arbitrarily rearranging it with CSS.

_However_, I no longer think progressive enhancement makes sense for things
like JavaScript, as long as you use features just about every browser
supports. Otherwise, you'd effectively have to write your site twice: once
with JavaScript, and again as entirely forms/links and server-generated
content.

The critical reason why I no longer think this makes sense: it's completely
sensible to follow an API-first approach to site design, where the
first thing you write is an API usable _both_ by third parties and by your own
first-party site. Then you can write your site on top of your own API. I don't
think we need to target human-readable first; on the contrary, I think we get
a better, more extensible, more programmable, more open web if we build APIs
first and foremost.
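A minimal sketch of what that API-first shape could look like, assuming a
content-negotiating handler (the resource and field names here are
hypothetical, not from the original comment):

```javascript
// Hypothetical API-first handler: the same article resource is served as
// JSON to third-party API clients and as HTML to the first-party site,
// depending on the Accept header. All names here are illustrative.
function renderArticle(article, acceptHeader) {
  if (acceptHeader && acceptHeader.includes("text/html")) {
    // First-party site: server-generated HTML on top of the same data.
    return `<article><h1>${article.title}</h1><p>${article.body}</p></article>`;
  }
  // Third parties (and the site's own scripts) get the raw data.
  return JSON.stringify(article);
}
```

The point is that the HTML view is just another client of the same underlying
resource, so third parties never get a second-class API.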

Now, all that said, there are other ways this article is _completely_ right
about not breaking the web. In particular, having an app is no excuse not to
have a website, or one not usable on mobile. And if you're going to display a
"you might want the app" banner, have a "go away and stop asking me" that
_does not break navigation to the specific page the user was trying to visit
in the first place_.

~~~
lnanek2
I think the article was pretty clear about API-first being bad, and even
linked to Twitter's write-up of having to tear out that failure:
[https://blog.twitter.com/2012/improving-performance-on-twittercom](https://blog.twitter.com/2012/improving-performance-on-twittercom)

~~~
JoshTriplett
I don't actually think an entirely client-side-rendered application makes
sense; on the contrary, I _do_ think it makes sense to do most HTML generation
on the server, and where appropriate, hand out snippets via the API, rather
than handing out JSON or similar and making JavaScript produce HTML. But
that's different than rendering _entire HTML pages_ entirely on the server.
And for primarily dynamic content, I see nothing wrong (for many sites, at
least) with assembling server-provided HTML snippets from JavaScript into a
server-provided HTML base page, without any fallback to an entirely server-
provided page.
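That snippet-assembly pattern can be sketched in a few lines; the endpoint URL
and the `data-fragment` attribute below are hypothetical:

```javascript
// Client-side glue for server-rendered fragments: fetch an HTML snippet
// (the server already did all the templating) and swap it into a
// placeholder element in the base page.
async function loadFragment(url, target) {
  const res = await fetch(url, { headers: { Accept: "text/html" } });
  target.innerHTML = await res.text();
}
```

In the browser you might wire it up with something like
`document.querySelectorAll("[data-fragment]").forEach(el =>
loadFragment(el.dataset.fragment, el))`, keeping all markup generation on the
server.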

------
nosideeffects
Getting a little tired of reading these rants. People are building
"applications" that just happen to run in the web browser, for convenience.
They aren't building repositories for static information.

~~~
vcarl
Yeah, these rants never account for the fact that there are a number of
different types of websites, and tools for each. Don't use Angular or React
for a blog, for the same reason you shouldn't use Jekyll or whatever for an
application. Different tools for different tasks.

~~~
idonthaveaname
The points apply equally to those things you call "applications" as they do to
those things you call "sites". They're essentially the same thing. They're all
functionality delivered over web technologies, and for that reason the OP's
points still hold. In the case of a "web site" the functionality is
"displaying content".

An "application" can be built using progressive enhancement. The OP even
describes ways to approach this. There were applications before there was AJAX
or Angular. Their experiences left a little to be desired, sure, but they were
certainly usable. A basis in simpler, server-centric interaction, with
enhanced client-side experience jazz on top, is still more in keeping with
"how the web was made", and with how it is consumed.

------
atg2
Think about what happens with the "old web". You make an HTTP request to a
server. The server evals short snippets of dynamic language inside a template
to generate a really big string. Then the server sends that string down the
socket.

If you want to view another object, you have to request another slightly
different big string.

Because of how resource-intensive it is to assemble these strings, programmers
deployed "caching software" to memoize every big string that came up. Server
hardware was loaded with RAM so that they could store a lot of big strings.

- - -

The new approach is to send the basic code of the website to the client _the
first time_ , then have the client make additional requests for data via AJAX,
assembling the UI as it goes. Sanity.
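The "memoize every big string" step described above can be sketched like this
(`renderPage` is a stand-in for a real template engine, and the cache key is
illustrative):

```javascript
// Toy version of a server-side page cache: evaluate the template into one
// big string the first time a page is requested, then serve the memoized
// string straight from RAM on every later request.
const pageCache = new Map();

function renderPage(slug) {
  // Stand-in for the resource-intensive template evaluation.
  return `<html><body><h1>${slug}</h1></body></html>`;
}

function servePage(slug) {
  if (!pageCache.has(slug)) {
    pageCache.set(slug, renderPage(slug));
  }
  return pageCache.get(slug);
}
```

Real deployments put this cache in a separate process (memcached, Varnish,
etc.), which is why servers of that era were loaded up with RAM.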

------
mark_l_watson
I very much agree with this. Get the content to the user as fast as possible
and then progressively enhance.

Most of the web sites that I get value from really just need HTML and CSS.

That said, I don't mind waiting for rich web email clients, the web version of
Office 365, etc., that I leave open for a while.

Complex web pages that are really just showing content also eat up a lot of
CPU and memory resources.

------
hliyan
I'm sorry, but I'm still trying to understand what the author is trying to
say. It seems like he had a bad experience with Angular and he's taking it out
on hash routing and client side rendering (so basically, SPAs?). Beyond that,
I'm not really sure.

------
carsongross
Absolutely. We have thrown the baby, and the sink, out with the bathwater.

I'll plug my little strike back against the insanity:

[http://intercoolerjs.org](http://intercoolerjs.org)

You can build a highly dynamic web site with normal URL schemes, sane HTTP
caching behavior and zero client-side templating.

I need to do some more work on making it meet progressive enhancement goals,
but I'm convinced this is a better approach than heavy client side logic for
most web applications.

------
thoman23
It's been a while since I've seen anyone take a stand for IE6 users.

As a web application developer whose bootstrapped startup needs every user it
can get, let me just say: I am perfectly willing to forgo all IE6 traffic.

~~~
tlarkworthy
Yeah, it's a negligible share. The article lacks a grounding in utility. You
shouldn't bend over backwards for 0.1% of leads. If your product looks 10%
cooler for your best leads at the cost of not working for the 0.1%, it is
actually a good deal.

~~~
compbio
Web design and development is also an art. Good artists look beyond business
value.

You actually should bend over backwards to cater to as many users as possible,
especially if you are getting paid to build websites for users.

Some countries require sites to be accessible to the disabled. If you design
sites for the US government, they should also be accessible to the 0.1% of
blind or no-script users.

Sure, IE6 as a baseline is very conservative, but it is certainly doable. Less
so if you start with an inaccessible website and catering to as many users as
possible is an annoying, time-consuming afterthought.

If you cannot muster an accessible, progressively enhanced website, then you
cannot muster a JS-only, ARIA-compliant website either. Your only hope is to
make something profitable. That's being a marketer or businessman with a
little HTML skill, not being a solid web dev.

------
bnolsen
These companies need to realize that end users, especially me, now pretty much
expect their apps and pages to:

    - annoy me for their benefit
    - spy on me
    - waste my battery
    - waste my bandwidth

Interesting that these companies don't seem to care much about gaining the
trust of their customers or actually serving them in a convenient fashion.

~~~
krapp
Companies don't care, because end users, for the most part, don't care. At
least not enough to significantly change their behavior.

~~~
dredmorbius
The point is to make them care. There _are_ ways of doing this.

Google is steering progress on HTTPS and app-ad interstitials, for example.

------
ilovecomputers
To a lot of you arguing that progressive enhancement doesn't apply to "web
apps": as Jake Archibald argues[1], that really isn't an excuse if you can't
distinguish one from what you consider a "web site." Is your content static?
Is interaction core to your content (like a video game or a data-viz
explorer)? Even if interactivity is core to your content, why not first offer
something like a product page? Describe your interactive content in text, in
images, or in a demo video -- just the minimum static content a user can
quickly download before the rest of the richer content is loaded. It's not
much extra effort; you aren't back-porting your interactivity to less capable
devices. Instead, you're using static content to show what your game could be.

[1] [https://jakearchibald.com/2013/progressive-enhancement-still-important/#app-is-not-an-excuse](https://jakearchibald.com/2013/progressive-enhancement-still-important/#app-is-not-an-excuse)

------
AdmiralAsshat
So for the novice web developer, what _is_ the recommended course of action
for creating a "modern," mobile-friendly website? The options seem to be:

- Code all HTML and internal CSS by hand

- Use a CSS framework

- Use something like Bootstrap or other aesthetic rendering powered by
JavaScript/jQuery

The problem seems to be that less work done by the designer means increased
reliance on JavaScript or external engines, which in turn causes additional
overhead, bandwidth, and possible breakage for people using NoScript or
AdBlock. And to be honest, I'm sorta on their side, as it seems silly that a
single website needs to reach out to three or four outside domains just to
style the page properly.

What's the solution here?

~~~
carsongross
As I've mentioned elsewhere, the best solution that I've been able to come up
with is intercooler:

[http://intercoolerjs.org/](http://intercoolerjs.org/)

You can build dynamic websites with minimal or no JavaScript and just use the
same old techniques you are used to.

------
mind-blight
If web developers want to build applications for anyone outside the West,
they'll need to work on faster page loads. A large percentage of the world
views the internet on mobile devices with a slow connection.

Blocking content until a large JS file is downloaded, at least one more ajax
request is made, and another round of downloads occurs is unacceptably slow in
a developing country.

Isomorphic applications or server-side rendering give a company a competitive
advantage in these markets.

------
Geee
Bullshit. The author is confusing web sites and web apps.

~~~
jimktrains2
No, I think many people confuse them. People are beginning to treat all
websites as if they should be web apps.

Look specifically at Wix sites, Google Groups, and Blogger: these should not
be web applications -- they are standard websites, repositories of information
that need not live-update or run any JavaScript whatsoever for the most basic
use cases -- yet they are entirely, 100% unusable without JavaScript.

Browse the web without JavaScript and you'll find that most things are broken;
even static content just doesn't work. That's the problem. Web apps are cool.
Your blog or a news group _requiring_ JavaScript just to be read is not cool.

~~~
fixermark
Part of the problem is that the web content and security model requires one to
consider a domain somewhat isomorphic to an app.

It wouldn't be so bad if the technology existed to reliably say "foo.com is
serving data that can be rendered via some standard component presentation
layer." Then the browser itself would have the capacity to cache and optimize
the rendering layer for common content schemas. As it stands, your choices are
to code to one schema (the HTML5/CSS/no-JavaScript presentation standard, with
all the restrictions that implies); to write a web app that will be a special
little snowflake different from all the other special little snowflakes on the
web, complete with its own chunk of memory and processing in your browser; or
to throw up your hands and commit your content to one of the existing
solutions that build the app for you (but then also basically control your
data).

We're missing the functional division between data and app that we have in the
desktop model.

------
dsfsdfd
How about we parse the HTML and CSS on the server too, because, you know, the
user might have a browser with a slow rendering engine. Then we just ship them
the images.

~~~
sbruchmann
I hope you’re being sarcastic.

~~~
dsfsdfd
indeed

------
ahallock
Puts a damper on the SPA + AWS Gateway API + Lambda app I was developing.

------
edw519
Here is what I think about this...

    
    
      =============================
      |                         X |
      |                           |
      |  Would you mind taking a  |
      |  survey to give feedback  |
      |  about this Hacker News   |
      |  comment? It will only    |
      |  take 5 minutes and you   |
      |  can win a new Macbook    |
      |  Pro.                     |
      |                           |
      |   YES      NO     LATER   |
      |                           |
      =============================
    

In conclusion, stop breaking HN.

