

Replaced JS with HTTP request - jimaek
https://github.com/jsdelivr/jsdelivr.com/commit/c958acc742a1bb8d2966ff48a09632284e1dfa05

======
phlo
I'm very wary about loading JS from random CDNs. In my opinion, the negative
aspects outweigh the benefits by far:

      - The CDN gets to decide *what* code is delivered to *which* users. Could be a prime target for, say, another FERRETCANNON.
      - If the CDN is compromised, so is your site.
      - If an attacker on a local network manages to inject poisoned cache data into requests for said CDN, your site is compromised.
      - All of your visitors are disclosed to the CDN owner.
      - If the CDN goes down, your site does too. Note that the inverse doesn't apply: the CDN's superior availability has no positive effect on your site.
      - Loading from another host may cause an unnecessary DNS lookup and will cause an unnecessary TLS connection.

~~~
montag
What is FERRETCANNON?

~~~
zimpenfish
I'll take stupid NSA codenames[1] for 10 points, please, Alex.

[1] [http://www.theatlantic.com/technology/archive/2013/10/how-th...](http://www.theatlantic.com/technology/archive/2013/10/how-the-nsa-thinks-about-secrecy-and-risk/280258/)

------
myhf
The link title is missing the operative word "single". It's about changing the
number of requests, not the type of request.

~~~
jimaek
I know. I'm 100% sure I copy-pasted the title from git, but it somehow changed
by itself.

~~~
dylz
HN changes titles to match the source title.

~~~
lorenzk
But the source title is in fact more precise this time: "Replaced js with 1
single HTTP request".

------
rdw
Doesn't this defeat one of the purposes of JavaScript CDNs, that the user
already has the exact URL cached on their machine?

~~~
mmahemoff
I'd say it's more like a tradeoff than defeating the purpose.

And not as much of a tradeoff if a lot of nearby visitors have also visited
the same site, or users visit it frequently.

------
antihero
What would be cool is <script sha="2afdb28d" name="angular.js"
version="1.2.10" src="xxx"></script>

This would mean that the browser can essentially cache the exact version of
the script from _any_ source, verify it with a hash, and still have a fallback
URL to download it from.

~~~
arcatek
But then the game would be to find a hash collision with malicious code :)

~~~
infogulch
It should definitely require the full hash. And to make it generic...

<script hash="sha-1/fa26be19de6bff93f70bc2308434e4a440bbad02"
name="angular.js@1.2.10" src="xxx"></script>

The name attribute would be purely aesthetic, so there's no reason for a
version attribute. If SHA-1 is good enough for git's content-addressed
storage, it's good enough for browsers.

------
pearkes
In my opinion Google Hosted Libraries[1] are the best way to go if you're
serving "big" commonly used JavaScript libraries to folks.

One of the major upsides is that it's so heavily used that a user probably
already has it cached in their browser. At least that's the idea; not sure of
actual numbers.

[1]:
[https://developers.google.com/speed/libraries/devguide](https://developers.google.com/speed/libraries/devguide)

~~~
josteink
And we add another nail in the "decentralized" web's coffin and give Google
even more data.

Hosting a 32kb library yourself shouldn't be that hard in the days of 50mbps+
LTE mobile internet.

~~~
darklajid
I agree with this sentiment.

For development purposes ('grab the latest version, from Google because it's
convenient') I'd go with Google, for a production deployment not so much.

------
zachrose
Ah, the mythical HTTP batch GET.

Last time I tried this, I ran into the problems of 1) unbounded URL length
breaking down in old browsers, routers, etc. and 2) hobbled caching. (And also
going against the grain of REST.)

I'd be interested if anyone has actually done something like this
successfully. Did you have the issues I did? Was it worth it?

~~~
delluminatus
I don't think getting "batch" resources is against REST, Fielding himself
stated that it's not a problem as long as the URL for the batch is consistent
-- the batches are just separate resources, although they might have special
semantics to the client.

------
goldenkey
This defeats the purpose of defer and async. [1]

[1] [https://developer.mozilla.org/en-US/docs/Web/HTML/Element/sc...](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/script)

~~~
jey
How so? It's the same as the old HTML, but with fewer round-trips.

~~~
pipeep
defer and async are about eliminating round-trip latency by opening multiple
simultaneous HTTP connections. Multiple TCP streams are sometimes faster than
one. (I say sometimes, because negotiating a new connection has overhead.)

~~~
arunoda
But then the scripts are not loaded in order, and that's not the goal here.
Works for some situations, though.

------
thezilch
Great, but I'll take parallel files over SPDY any day.

