
Don't tell me what my browser can't do - technojunkie
https://www.christianheilmann.com/2016/01/16/dont-tell-me-what-my-browser-cant-do/
======
jakub_g
(if you don't know the author: he's a web evangelist who previously worked for
Mozilla and now works for Microsoft; that may help explain his frustration in
these times of Chrome monoculture on the web)

I wholeheartedly agree with the author and wanted to post another long rant
here, but I resisted the temptation :)

In this particular example of a WebGL demo, I think it was not exactly
malicious; the issue was IMO most probably due to writing non-future-proof
code and then abandoning it (there was a related post last week about writing
non-future-proof Python code that will break in Python 4; the same applies
here). Lots of early "HTML5" demos no longer work because they used some not-
yet-standardized and/or prefixed syntaxes that have since changed, etc.

Anyway, the link to [1] is very interesting. Many people underestimate the
risk of a third-party JS server going down (or being slow), adblockers
preventing some JS from executing, etc.

Edit: to clarify, I don't mean you should not use JS (heck, I'm a JS developer,
and IMHO the times of "progressive enhancement" are long gone, unless you
serve almost only text content). I don't mean you should have 3 redundant
servers to serve your JS from; just detecting that the thing didn't load, and
not assuming that it will be reliably delivered to the user in 0.1 s, would be
good enough. There are pages out there that will totally break or fail to load
if some unimportant JS file isn't loaded due to a network issue or whatever
else.
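The detection itself can be tiny. A minimal sketch of the classic
CDN-with-local-fallback pattern (the global name and the URLs below are
made up for illustration, not from any real site):

```javascript
// Minimal sketch: after the CDN <script> tag, check whether the
// library's global actually showed up; if not, fall back to a copy
// served from our own origin. Names/URLs here are hypothetical.
function needsFallback(libGlobal) {
  return typeof libGlobal === "undefined";
}

// In the page, right after <script src="https://cdn.example.com/lib.js">:
//   if (needsFallback(window.lib)) {
//     var s = document.createElement("script");
//     s.src = "/js/lib.js"; // local copy on our own origin
//     document.head.appendChild(s);
//   }
```

That's the whole "detect that the thing didn't load" idea: one typeof
check, no redundant servers required.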

It doesn't require an infinite amount of work to handle cases like that
(though, sadly, getting management buy-in for even small things like this is
not easy, and we might sometimes have to go against the flow for the good of
our users).

[1]
[http://kryogenix.org/code/browser/everyonehasjs.html](http://kryogenix.org/code/browser/everyonehasjs.html)

~~~
vonklaus
<sarcasm> hahaha I love it. Better yet:

Do your roads support horses?

Many people own horses in America still[0]

Have they got to the destination yet?

FACT: All people who are not traveling, are not traveling by horseback.
--socrates

If you're on a highway and suddenly the edge case of there being a fuckton of
cars going fast on it occurs, suddenly they have to swerve to avoid a random
cowboy and get in an accident, then they don't make it to their destination.

Do your interstate or inter-province highways prevent horses? Loads of them
do[1]

You wouldn't download a car[2]? </sarcasm> ==================================

Look, I get it, provide a good user experience. That's fine, I appreciate
that. However, javascript is pretty ubiquitous, so what the fuck are people
supposed to do? Check headers and sniff browser on every request (something
the parent article implied was negative), then send 2 sets of javascript to
the user, one to call a cdn, and one to call a local script if download fails
or cdn not cached? Check if cdn is cached every time?

I mean, sure, there are some pretty good practices out there and you should be
cognizant of them, but if you are just sending out a cat picture because
you're a bored 16 year old kid learning how to use a new web technology like
WebGL, are we supposed to hold you to 5 9's worth of uptime and backwards
compatibility all the way back to Mosaic?

Here is the answer (my answer, really): you can do whatever the fuck you want;
to the extent that wtfyw overlaps with what your users want, you can keep
them.

[0][https://en.wikipedia.org/wiki/Horses_in_the_United_States](https://en.wikipedia.org/wiki/Horses_in_the_United_States)

A USDA census in 1959 showed the horse population had dropped to 4.5 million.
Numbers began to rebound somewhat, and by 1968 there were about 7 million
horses, mostly used for riding. In 2005, there were about 9 million horses.

[1][http://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/pub...](http://www.rita.dot.gov/bts/sites/rita.dot.gov.bts/files/publications/journal_of_transportation_and_statistics/volume_07_number_23/html/paper_06/index.html)

[2][http://www.slideshare.net/arnoudengelfriet/you-wouldnt-downl...](http://www.slideshare.net/arnoudengelfriet/you-wouldnt-download-a-car)

------
dahart
Whether or not you like the delivery, it is a reasonably good point.
Personally, I'd have loved to read a little more about how to do it right and
a little less about why.

The how to do it right is important because WebGL adoption is growing, and
plenty of sites actually require it, as opposed to only having a tangential
demo or fancy logo on it that isn't strictly necessary. If your site
absolutely requires WebGL, and you block browsers incorrectly, you're hurting
your site drastically.

I don't know what the best way to do it is, and I'd love to hear what other
people do. My current thinking is: don't screw around with user agents, of
course. This might be where the OP's complaint came from, because if you try
to whitelist compatible browsers, new ones will always break immediately.

A better approach is to put a canvas on your page, try to grab the WebGL
context, and check whether that failed. That way, if WebGL can work, it will,
and you only block people who have it turned off or actually have an old
browser.
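A sketch of that approach (feature detection instead of user-agent sniffing;
the function name is mine, and what you do on failure is up to you):

```javascript
// Ask for a WebGL context directly instead of guessing from the
// user-agent string. In the browser, `canvas` is a <canvas> element;
// anything with a getContext method behaves the same here.
function getWebGLContext(canvas) {
  // Older browsers shipped WebGL under the "experimental-webgl" name.
  var names = ["webgl", "experimental-webgl"];
  for (var i = 0; i < names.length; i++) {
    var gl = null;
    try {
      gl = canvas.getContext(names[i]);
    } catch (e) {
      gl = null; // some browsers throw instead of returning null
    }
    if (gl) return gl;
  }
  return null; // only now show a "WebGL unavailable" message
}
```

If this returns a context, the browser can do WebGL, whatever its user-agent
string claims; if it returns null, WebGL genuinely isn't available.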

The message that announces the reason for failure is important, IMO. You can
put up an explanation of how to fix the problem, because sometimes it's
fixable, or you can do something off-putting that only says 'fail' and chases
people away.

~~~
shockzzz
My objection is this paragraph:

 _You don’t have to support old browsers and terrible setups. But you are not
allowed to block them out. It is a simple matter of giving a usable interface
to end users. A button that does nothing when you click it is not a good
experience. Test if the functionality is available, then create or show the
button. This is as simple as it is._

lolwut

Giving a usable interface for old browsers is definitely a form of support.
You're telling me that my app has to support the Gopher protocol too?

At the end of the day, it depends on your Product and its Users. Some have to
support many browsers. Some don't. No one expects an American vending machine
to accept Euros as payment.

~~~
pdkl95
> Gopher

Nobody said anything about non-HTML markup. This is a straw-man argument.

> form of support

The entire point of implementing a proper progressively enhanced design is
that you _don't_ need to add extra support for older browsers. Handling
missing features is important because "older browsers" aren't the only
situation where errors happen.

Progressive enhancement is mostly good error checking. You should handle
missing JS features (or missing JS entirely) for the same reason you should be
checking calls to fopen(3) for NULL; skipping that check means you simply fail
badly on errors. A webpage that sends an empty body tag is similar to a
traditional program that crashes without any error message because it tried to
use a NULL file handle when opening its config file.

If your tools aren't handling a lot of this for you (Rails has since very
early versions), maybe get better tools, or maybe bug the vendor?

> usable interface

Nobody expects _identical_ functionality when the JS doesn't load. Obviously,
some features won't work, and it may not look as nice. Choosing exactly how to
handle an error condition is one of the many design decisions programmers have
to make. Skipping optional features like dynamic page rewriting (pjax or even
simple remote ajax forms) may be slower, but skipping those features and
leaving links/forms as traditional page-reloads is a better way to handle a
failed JS download than blocking the entire page with an error message (or
worse: leaving the person who visited your site with a blank page).
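The fopen(3) analogy translates almost literally to JS: test for the feature,
enhance if it's there, otherwise leave the plain page-reload behaviour alone.
A sketch (the `env` parameter stands in for `window` purely so the logic can
be exercised outside a browser; that indirection is mine):

```javascript
// "Check before you enhance": if window.fetch is missing (old
// browser, blocked script, failed download), leave the form as a
// normal page-reload submit -- the baseline still works.
function enhanceForm(form, env) {
  if (typeof env.fetch !== "function") {
    return "fallback"; // traditional submit: slower, but functional
  }
  form.onsubmit = function () {
    // would call env.fetch(form.action, ...) and update the page in place
    return false;
  };
  return "enhanced";
}
```

Either branch leaves the visitor with a working form, which is the whole
point: the error path is a design decision, not an afterthought.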

// I _suspect_ that a lot of the "interface" that is apparently so important
to get right is _advertising_.

------
Ianvdl
For some reason this article received a fair amount of negative feedback here.
I quite enjoyed it.

It's good to remind developers that they should ensure that their products
degrade gracefully, and to give a few examples of what could go wrong.
Mentioning the (somewhat basic, but still generally effective) fault-tolerance
of HTML/CSS vs JS along with the user's basic viewpoint also helps to get his
point across further.

Maybe I'm just used to reading longer form content, but I didn't feel that
this brief blog post was a waste of time.

~~~
mmcclure
> It's good to remind developers that they should ensure that their products
> degrade gracefully

I agree! I don't think it's a bad reminder, I think it just rubbed me the
wrong way that he seems to present this whole concept like something new and
novel, when frankly, it's just...not. The entire blog post doesn't have a
single mention of either "graceful degradation" or "progressive enhancement,"
both of which would be useful to hear if a reader wasn't familiar and wanted
to learn more about this topic.

------
alexandrerond
I use Linux+Firefox+Pipelight. Amazon Video used to work well with this, until
some bright mind decided to force Silverlight and to check user-agent strings
beforehand to make sure you are using a "supported browser version".

Of course, this bright mind never thought of people using Pipelight. Well, now
I have to set my user agent to pretend I'm on Windows or Mac to get things
working. Thanks, Amazon, for sucking so much.

------
bsder
Protip for Web programmers: "No, you can't count on Javascript."

Yes, I block your shitty Javascript. If I enable Javascript, I enable for your
domain only. The 18 other domains are not going to load.

Get over it.

~~~
tomjen3
You and 0.0001% of the other users.

That will probably not amount to more than a couple of dollars in total
revenue, so I guess I can get over it much quicker than you can find another
site that does the same thing but works without Javascript.

But then, I am a game dev; it is basically impossible to make an HTML game
work without Javascript, and it is certainly not worth the investment.

Javascript is here to stay. You don't have to run it, but if you expect that
people care you will be disappointed.

------
LouisSayers
The problem is that it takes effort to support both with and without JS, and
effort costs money. Just like many developers I like to do things properly and
support both, however in a business context it doesn't always make sense to do
so. 80/20 rule.

------
justin_vanw
how do you write like 1500 words saying the same thing over and over and over?
Read this article for 1500 simple little examples!

In before "say something positive": If whoever wrote this gets to write a long
rambling complaint about some random website nobody has ever heard of having a
bad error message, I can equally complain about my pet peeve: "people writing
long blog posts whining about things that set them off but aren't the real
reason they are so angry to begin with".

~~~
Dr_tldr
Yeah, my immediate response was "Don't tell me who my customers are!" This is
all a bit like storming into a Chinese restaurant and then berating them for
not having menus written in Slovenian.

His point about faulty sniffers throwing useless warnings and blocking content
that could load is rock-solid, but it's obscured by everything else he's
saying.

------
mindcrash
Seems a lot of people are doing the shit "we" (as in the older, sometimes even
somewhat grumpier developers) saw in the '90s, as in the "best viewed with
Internet Explorer" shit that made websites completely inaccessible to
Netscape, all over again.

But this time with Chrome versus every other capable browser on the market.

Sigh.

------
echochar
OK, so the author purports to be on the side of "end users". And he purports
to share empathy with users who get annoying messages that their browser is
lacking, outdated, etc.

As an "end user" who has gotten hundreds of such messages over the years, I
ask: How about Silverlight? Have you ever gotten an annoying message along the
lines of "Sorry, you need Silverlight"?

I have. And I assure you I do not need Silverlight to watch video.

Speaking of empathy, just this week I believe someone posted to HN about a
still undisclosed 0-day for Silverlight that someone wrote years ago, that
still works flawlessly and recently sold for tens of thousands of US dollars
on the black market.

------
cdevs
I think he's really mad because, when developers write error messages, the
company doesn't like to suck it up and take the blame, i.e. "we don't support
your current browser" vs. "YOU don't have WebGL".

------
vortico
I'm sorry that you wasted your time going to a web toy you were not able to
see! :'(

The developer made a mistake in his error handling routine, so that it is
triggered by a false positive. This is likely because he made the demo on a
time budget; being a demo, it didn't need to be perfect.

------
kazinator
Okay, I won't tell you that your browser can't calculate, for every possible
Javascript function, and every possible input to that function, the Boolean
value indicating whether or not that function will halt. Don't mind me at all!

> _HTML and CSS both are fault tolerant. If something goes wrong in HTML,
> browsers either display the content of the element or try to fix minor
> issues like unclosed elements for you. CSS skips lines of code it can’t
> understand and merrily goes on its way to show the rest of it. JavaScript
> breaks on errors and tells you that something went wrong. It will not
> execute the rest of the script, but throws in the towel and tells you to get
> your house in order first._

HTML (or rather, the processing thereof) being fault tolerant is a complete
misfeature. It's the result of browsers in the 1990s trying to out-do each
other in handling broken web pages in order to look more functional. Broken
HTML should be loudly rejected (at least in pages which declare strict
conformance to a modern dialect).

CSS skipping stuff it doesn't understand is harmless because CSS is a
declarative language for assigning style properties. It doesn't have permanent
effects, like making a call to a server to update a database record.

A program in a general purpose language cannot reliably continue after an
error (without logic to handle that situation). Later steps depend on the
earlier ones having executed correctly in every detail.

~~~
rileymat2
>A program in a general purpose language cannot reliably continue after an
error (without logic to handle that situation). Later steps depend on the
earlier ones having executed correctly in every detail.

I don't understand this statement. Obviously, the program cannot perform
correctly, but that may not matter. For example, suppose a web page caches
content on disk for future use, but a write fails. It may not matter that that
particular disk write failed, even without any additional logic; the error
would just cause a slightly longer load time next time.

~~~
kazinator
> _It may not matter that the particular disk write failed without any
> additional logic._

Only if the disk write completely failed, leaving no traces of a half-broken
cache entry.

If you have a half-written cache entry and don't deal with that situation
somehow, you end up serving the user broken content from the cache.

It usually behooves us to have our code react to write errors.

