
jQuery CDN having SSL issues - justindocanto
https://code.jquery.com/
======
Someone1234
It is too bad that the HTML standard has no built-in way to fall back.

They've added the cryptographic integrity hash and the async/defer attributes
to the script tag, but something as essential as a fallback when a script or
stylesheet fails to load (which the browser is best placed to detect) has no
built-in support.
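
For reference, here's roughly what the script tag already gives you (the
integrity value below is a placeholder, not a real digest):

    <script src="https://code.jquery.com/jquery-3.3.1.min.js"
            integrity="sha384-PLACEHOLDER"
            crossorigin="anonymous"
            defer></script>

If the fetched file doesn't match the hash, the browser refuses to execute it,
but that's where it ends: there's no declarative "try this URL instead".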

Instead you're left doing JavaScript tricks, which for missing CSS get a
little ugly[0]. CDN-with-local-fallback (or vice versa) has been a common
pattern for years, yet there's no official support at all. Honestly, if the
integrity attribute is specified the browser should just be able to fall back
to a cached copy it already has (e.g. jquery-1.2.3.min.js has a crypto hash of
ABC123, and I already have a file with that hash).

[0] [https://stackoverflow.com/questions/7383163/how-to-fallback-...](https://stackoverflow.com/questions/7383163/how-to-fallback-to-local-stylesheet-not-script-if-cdn-fails)
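
The trick from the linked answer boils down to probing whether a rule from the
CDN stylesheet actually took effect, and injecting a local copy if it didn't.
A minimal sketch, assuming a Bootstrap-style sheet that defines a
`.hidden { display: none }` rule; all URLs here are hypothetical:

    <link rel="stylesheet" href="https://cdn.example.com/css/bootstrap.min.css">
    <script>
    window.addEventListener('load', function () {
      // Probe: if the CDN stylesheet loaded, this element is display:none.
      var probe = document.createElement('div');
      probe.className = 'hidden';
      document.body.appendChild(probe);
      if (getComputedStyle(probe).display !== 'none') {
        // The CDN copy never applied -- fall back to the self-hosted one.
        var link = document.createElement('link');
        link.rel = 'stylesheet';
        link.href = '/css/bootstrap.min.css';
        document.head.appendChild(link);
      }
      document.body.removeChild(probe);
    });
    </script>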

~~~
robin_reala
I mean, the fallback mechanism is progressive enhancement. It’s a reliability
mechanism more than anything: if the JS (or part of it) fails to load, the
site should fall back to a version that potentially reduces interactivity but
allows essential functions to continue.

~~~
Someone1234
Progressive enhancement is largely a myth.

A lot of libraries (jQuery, Lodash, Angular, Vue, React, Bootstrap's JS,
module loaders, etc.) aren't simply offering "improved interactivity"; they're
offering core functionality. In essence the site runs on these libraries; if
you remove them there's nothing left to regress to.

I've worked at several companies and never seen progressive enhancement used.
It might have made sense back in the IE6 era, when JavaScript was just for
whiz-bang effects; these days JS libraries hold the whole site's data
context/state and issue Ajax requests as needed (Vue, Angular, React, etc.).
That's core; there's nothing progressive that can be removed from it.

Progressive enhancement only makes sense for small toy sites or for academics
to play with. Even Netflix's famous examples are about web services going
offline, not losing core JavaScript libraries.

~~~
seba_dos1
No. It's just a problem of laziness, and possibly ego, on the developers' part.

There are apps that use the browser as an execution environment, and there are
websites. You wouldn't expect a client-side drawing tool, a WebVR game, or a
real-time visualization of blockchain transactions to work without JS enabled.

However, you can easily expect a social network, mail client, news page, or
task app, and to some extent even things like IM, to work with no JavaScript.
"That's core" is just an excuse for poor architecture; it's only core because
you chose to make it so.

There are apps and there are websites, with only a small grey area in between.
If you're a web developer wanting to use the newest, greatest, trendiest
tools, you see everything as an app, despite common sense suggesting
otherwise, and you end up with no progressive enhancement for no good reason.
Taken to extremes, you end up creating abominations like the old SPA Twitter
frontend, spinning up your laptop's fans for 15 seconds just to display 140
characters of text, because "the core" is implemented as AJAX calls and fully
rendered client-side.

~~~
Someone1234
> No. It's just a problem of laziness, and possibly ego, on the developers' part.

We're talking about large organizations; no single developer is making these
decisions. And the question is one of resource allocation: if the choice is
between improving the core experience and implementing an experience that <1%
of our users will ever see, the choice is easy.

> "That's core" is just an excuse for poor architecture - it's only core
> because you chose to make it so.

It is core because the internet has democratically made it so. You're speaking
for a very vocal minority. We're choosing not to implement a special mode for
people who self-selected into a broken web experience. Fortunately, that same
demographic knows how to fix the issue they caused.

> no progressive enhancement for no good reason.

A richer user experience is a very good reason. If the choice is between
making the site richer and more immersive for 99% of users, and leaving out in
the cold the 1% who wish to be contrarian for no reason? So be it. A worthy
sacrifice, particularly as that 1% selected themselves for it.

You're welcome to pick any arbitrary part of the web to disable: JavaScript,
CSS, font rendering, images. But it gets a little silly when you blame others
for your self-imposed breakage. You don't want it broken? Don't break it.

~~~
robin_reala
I’d dearly love to have stats that aren’t a few years old, but when GOV.UK
last ran an experiment it was 1.1% of people who arrived without JS (~1m page
views a month), split into 0.3% who’d deliberately disabled JS and 0.8% who
had a broken JS environment for another reason. Like I said, it’s a
reliability issue, not pandering to people who choose to go out of their way
to ‘break’ their environment. So yes, by choosing a JS-only environment you’re
prioritising developer needs over user needs. That’s potentially fine if the
cost balance equation works out that way for you, but it’s a specific choice
you’ve made to not support people who through no fault of their own don’t meet
the requirements of the environment you’ve decided to create.

~~~
Someone1234
> So yes, by choosing a JS-only environment you’re prioritising developer
> needs over user needs.

Even according to your own statistics, we're prioritizing 98.9% of users'
needs over 1.1% of users' needs (or, more accurately, 99.2% of users against
the broken 0.8%, since we won't do anything for the 0.3% who decided to break
it on purpose).

Resources allocated to 0.8% of the userbase aren't free; they come out of time
that could be better spent improving the experience for everyone else.

> That’s potentially fine if the cost balance equation works out that way for
> you, but it’s a specific choice you’ve made to not support people who
> through no fault of their own don’t meet the requirements of the environment
> you’ve decided to create.

That's fine. The same 0.8% with a broken browser or proxy won't find anywhere
else on the internet any friendlier to them. The best they can hope for is a
small sliver of sites with fallbacks, but the user experience will be so
terrible that they're better off just fixing the issue than continuing.

I find it funny that people spent years making these same arguments with
assistive technologies as their cornerstone. Now that assistive technologies
(and the ARIA standards) fully support rich JavaScript sites, the argument has
shifted to some hand-waving minority that cannot even be quantified. We both
know this is really about the NoScript crowd (and similar, like
RequestPolicy); the other people with a broken web experience have far more
significant issues that no single site can hope to mitigate.

------
jakobdabo
I always self-host my JS/CSS libraries: the connection is already open (thanks
to keep-alive), so what's the problem with serving a couple more KiB of
compressed data instead of making an additional DNS request and a new
connection to a CDN?

I understand that the CDN version of the library may already have been cached
by the browser while visiting other websites, but does it really save that
much time/traffic compared to self-hosting?

~~~
penagwin
I'm not taking a side, just trying to add some numbers. Let's ignore the
privacy/uptime concerns for the sake of this comment.

If every site you visit has 350kb of stuff that would benefit from a CDN JS
but also some CSS and fonts (google fonts, bootstrap, etc.) If you visit 50
pages a day in a 30 day month, that's a little over 500mb of data.

.35mb x 50sites x 30days = 525mb

That would be a ton of easily avoidable data in regards to mobile plans
depending on where you are. This number isn't 100% accurate though, many
"normal" (read - not techy hackernews readers) might only visit say a dozen
sites a day or less (let's ignore apps like facebook/snapchat/etc). Even that
might be a stretch.

Then again students and other "savy" users might be going across hundreds of
new sites a day.

For you the host? Unless you're a massive beast, most of us "hobbiests" fit
within the free bandwidth of 5$ vps services anyway.

~~~
robin_reala
That’s assuming every site is using the same CDN and the same version of the
library. Seeing as that’s not the case, you can cut that by at least an order
of magnitude. Secondly, most users visiting 50 pages a day will not visit 50
sites a day, but more like 5 pages a site across 10 sites, or even 10 pages a
site on 5 sites. Now add in the external privacy cost to the user of being
tracked across multiple sites, and it starts to look a little less appetising
from the user’s point of view.

------
keane

      Original, jQuery CDN:
      https://code.jquery.com/jquery-X.Y.Z.min.js
    
      Google:
      https://ajax.googleapis.com/ajax/libs/jquery/X.Y.Z/jquery.min.js
    
      Microsoft:
      https://ajax.microsoft.com/ajax/jquery/jquery-X.Y.Z.min.js
    
      Microsoft ASP.NET:
      https://ajax.aspnetcdn.com/ajax/jquery/jquery-X.Y.Z.min.js
    
      jsDelivr:
      https://cdn.jsdelivr.net/npm/jquery@X.Y.Z/dist/jquery.min.js
    
      cdnjs:
      https://cdnjs.cloudflare.com/ajax/libs/jquery/X.Y.Z/jquery.min.js
    
      Yandex.ru:
      https://yastatic.net/jquery/X.Y.Z/jquery.min.js

~~~
zackbloom
Does anyone have a performance benchmark comparing them?

~~~
keane
While Google and Microsoft are not included, jsDelivr
(StackPath+Fastly+Cloudflare+Quantil) compares itself to MaxCDN (now
StackPath, which is the official jQuery CDN) and Cloudflare (the cdnjs
provider) here: [https://www.cdnperf.com/cdn-compare?type=performance&locatio...](https://www.cdnperf.com/cdn-compare?type=performance&location=world&cdn=cloudflare-cdn,jsdelivr-cdn,maxcdn&datefrom=2018-3-19&dateto=2018-4-19)

KeyCDN has an online asset performance tool that we can use to compare the
hosted jquery.min.js files. The numbers included here are the results received
(for the San Francisco location), in ms, as [DNS lookup time] / [time to
connect to server] / [TLS overhead on the individual asset] / [time from
client HTTP request to first byte received from the server]:

Original, jQuery CDN:

[https://tools.keycdn.com/performance?url=https://code.jquery...](https://tools.keycdn.com/performance?url=https://code.jquery.com/jquery-3.3.1.min.js?nocache=1)

8 / 2 / 79 / 85

Google:

[https://tools.keycdn.com/performance?url=https://ajax.google...](https://tools.keycdn.com/performance?url=https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js?nocache=1)

32 / 2 / 132 / 155

Microsoft:

[https://tools.keycdn.com/performance?url=https://ajax.micros...](https://tools.keycdn.com/performance?url=https://ajax.microsoft.com/ajax/jquery/jquery-3.3.1.min.js?nocache=1)

128 / 3 / 122 / 130

Microsoft ASP.NET:

[https://tools.keycdn.com/performance?url=https://ajax.aspnet...](https://tools.keycdn.com/performance?url=https://ajax.aspnetcdn.com/ajax/jquery/jquery-3.3.1.min.js?nocache=1)

128 / 3 / 114 / 120

jsDelivr:

[https://tools.keycdn.com/performance?url=https://cdn.jsdeliv...](https://tools.keycdn.com/performance?url=https://cdn.jsdelivr.net/npm/jquery@3.3.1/dist/jquery.min.js?nocache=1)

64 / 3 / 118 / 129

cdnjs:

[https://tools.keycdn.com/performance?url=https://cdnjs.cloud...](https://tools.keycdn.com/performance?url=https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js?nocache=1)

64 / 2 / 118 / 125

Yandex.ru:

[https://tools.keycdn.com/performance?url=https://yastatic.ne...](https://tools.keycdn.com/performance?url=https://yastatic.net/jquery/3.1.1/jquery.min.js?nocache=1)

32 / 139 / 667 / 993

When I tested them, only jsDelivr and cdnjs/Cloudflare received green results
(under 200 ms time to connect and under 400 ms time to first byte) from all 16
worldwide test locations. Averaging the results between these two across the
16 locations, I would go with jsDelivr, which had the faster average TTFB. The
fact that they combine Cloudflare, Fastly, StackPath, and Quantil (whom I had
never heard of until today) might explain their global results.

------
nwah1
A great opportunity to strip out unnecessary uses of jQuery and move to
vanilla JavaScript.

[http://youmightnotneedjquery.com/](http://youmightnotneedjquery.com/)
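
A taste of the one-for-one substitutions that site catalogues (the selector is
illustrative, and the vanilla versions assume reasonably modern browsers):

    // jQuery, given some callback `fn`:
    $(document).ready(fn);
    $('.item').addClass('active');

    // Vanilla JavaScript equivalents:
    document.addEventListener('DOMContentLoaded', fn);
    document.querySelectorAll('.item').forEach(function (el) {
      el.classList.add('active');
    });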

~~~
sergiotapia
I haven't used jQuery in about two years; just plain JavaScript and maybe some
lodash functions imported into my ES6. Give it a try, you might not need it!

~~~
madeofpalk
youmightnotneedlodash dot com

------
guessmyname
Years ago, I used to link the library from Google [1] or Cloudflare [2].

Nowadays, with all the Node.js tooling around the modern front-end, I don't
see the point of embedding a JavaScript library from a CDN, unless that
library depends on a remote service, e.g. Google Analytics, Google Maps, etc…
That being said, if you are still maintaining a legacy website that depends on
jQuery, you should consider embedding the library like this instead:

    <script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>')</script>

[1] [https://developers.google.com/speed/libraries/](https://developers.google.com/speed/libraries/)

[2] [https://cdnjs.com/libraries/jquery](https://cdnjs.com/libraries/jquery)

~~~
2trill2spill
> Nowadays, with all the Node.js tooling around the modern front-end, I don't
> see the point of embedding a JavaScript library from a CDN, unless that
> library depends on a remote service, e.g. Google Analytics, Google Maps,
> etc… That being said, if you are still maintaining a legacy website that
> depends on jQuery, you should consider embedding the library like this
> instead:

What does Node.js have to do with deciding whether or not to get your static
assets from a public CDN? I hope you're not serving your static assets with
Node.js.

~~~
luhn
I assume he's talking about NPM. Now that everybody's hot new SPA has a few
hundred thousand NPM dependencies, you roll it all up using Webpack and can
just as easily `npm install jquery` as `<script
src="cdn.jquery.com"></script>`.

~~~
2trill2spill
> Now that everybody's hot new SPA has a few hundred thousand NPM
> dependencies, you roll it all up using Webpack and can just as easily `npm
> install jquery` as `<script src="cdn.jquery.com"></script>`.

Just because you can doesn't mean you should. You usually don't want one huge
JavaScript bundle to load; it's bad for page-load performance. Also, Webpack
lets you choose whether to bundle a dependency locally or grab it at runtime
from a CDN or another server.
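
For reference, a minimal sketch of the latter using Webpack's `externals`
option (the entry path and global name are illustrative):

    // webpack.config.js
    module.exports = {
      entry: './src/index.js',
      externals: {
        // `import $ from 'jquery'` resolves to the global `jQuery` provided
        // by a <script> tag (e.g. from a CDN) instead of being bundled.
        jquery: 'jQuery',
      },
    };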

------
908087
One more reason to use Decentraleyes.

[https://decentraleyes.org](https://decentraleyes.org)

~~~
newscracker
I came here to suggest this (I use it on Firefox), but unfortunately it's not
an option for some users on smartphones and tablets.

~~~
908087
It works great on Firefox for Android, but yes, unfortunately that still
leaves iPhone users out in the cold.

------
Murrawhip
It's not an expiry; it's a cert name mismatch. The CN is *.ssl.hwcdn.net.

~~~
calcifer
My guess is that they decided to switch to Highwinds as their CDN (I don't
know what it was before) and didn't plan the cutover correctly.

~~~
johntrimis
FYI: Stackpath owns Highwinds. It was likely an internal configuration issue
at Stackpath.

------
jimaek
[https://www.jsdelivr.com](https://www.jsdelivr.com) is a good alternative. We
actually monitor for HTTPS failures and automatically remove the problematic
CDN.

------
leepowers
This is why you self-host all project dependencies.

~~~
Someone1234
But that costs you cache hits, increases the real-world transfer size of your
site, and slows loading. I'm sure mobile users on metered connections would
prefer you didn't make them download that 100 KB JavaScript library for the
nth time.

~~~
leepowers
It's more complex than that. Mobile networks are mostly hampered by latency:
each additional HTTP request to a different domain requires another DNS
lookup, TCP cold start and handshake, TLS setup, etc.

Often that 100 KB of JavaScript is faster to load when minified, combined with
the rest of the site's code, and served compressed over a single HTTP request
or streamed via HTTP/2. It's almost always faster to use an existing
connection than to start a new one.

Also, there isn't one canonical version of jQuery; there are dozens of
potential versions available[1]. So it's not immediately clear that a user
will have cached the version a site depends on.

[1] [https://mathiasbynens.be/demo/jquery-size](https://mathiasbynens.be/demo/jquery-size)

~~~
Someone1234
It isn't faster than not re-requesting the resource at all; the file was
already downloaded from the CDN on an earlier visit, which is what this
discussion is about.

------
kreitje
Looks like it's working again.

------
justindocanto
Starting to see complaints, questions, etc. about it on Twitter too.

[https://twitter.com/search?f=tweets&q=jquery](https://twitter.com/search?f=tweets&q=jquery)

------
dusan76
Looks like it's working now: [https://code.jquery.com](https://code.jquery.com)

------
rharb
Potentially related to the Chrome 66 update and the Symantec certificate
distrust?

~~~
justindocanto
It's broken on the latest versions of Safari, Firefox, Edge, etc. as well.

------
8bitben
Yep, this just broke my project :/

------
campuscodi
Looks like they fixed it

------
petraeus
Only hobby websites would host jQuery off a CDN.

