JsDelivr – The advanced open source public CDN (hacks.mozilla.org)
230 points by rnyman on March 19, 2014 | 105 comments



When you include 3rd-party hosted content in your program, you give those 3rd-parties:

   1. Access to your users' browser details (via request headers).
   2. Access to your users' habits (via the referrer header). 
   3. The ability to replace executable code. This code can be used to gather more data from the state of the web page (e.g. keystroke logging), or to break out of the browser sandbox and compromise the whole system. (Note: this only applies to JavaScript.)
   4. The ability to break your software, either intentionally or unintentionally, by not serving the resource at all. 
When you host and serve your own resources, these are non-issues.

I think there is a sense among good-hearted programmers that "But gosh, they wouldn't do [1-3]! These CDNs are just a really nice public service by companies that care about keeping the web community healthy and fast. And besides, if they ever abused their access to user data, there would be hell to pay!"

Unfortunately this blasé attitude about user privacy on the part of programmers is why we have such a rough time with privacy in general these days. I believe that if you're a programmer with any interest at all in protecting your users from bad actors, you'll do your best to avoid using 3rd-party hosts, especially for executable code.


Thank you thank you thank you.

What's worse is when you see sites running JavaScript from multiple analytics companies, social media companies, and CDNs - even on pages that should be private (login forms, personal details, etc.). It's even worse when those sites break without those scripts. This is a concern that far too many developers overlook. I made this (harsh) comment [0] on such a site posted to HN a few months ago.

CDNs are way overrated, from what I've seen. I think, for most use cases, you're better off avoiding the multiple HTTP requests and just compressing all your JavaScript into a single file. Serve that from your own server and you'll save the client from another domain name resolution too.

Do yourself a favor and install NoScript. For one thing, it will show you how easily Google can track (or could have tracked) your internet usage - and, as you pointed out, so much more. In my experience, about 50% of sites serve scripts from some Google domain.

[0]: https://news.ycombinator.com/item?id=7187593


You're welcome! And I agree that the upsides of CDNs are overrated - but in fairness they do have the benefit of being very easy to use, performing well, and easing the bandwidth costs of applications. The geographical proximity thing is a nice feature, and expensive to reproduce. Of course, even these upsides come with their own downside: front-end developers have gotten into the habit of including every lib under the sun. Heck, it's just a script tag! This makes the page heavier, which makes the CDN even more indispensable, and it leads to a downward spiral of browser-hosted bloat-ware that requires a CDN to load in a reasonable amount of time. This is, of course, a win for the CDN-sponsoring companies.

What's remarkable to me is how ubiquitous and unquestioned the use of CDNs has become.

BTW, NoScript/Ghostery don't address the CDN issue. They don't actually block CDNs, they just block explicit trackers (mainly in the form of iframes and scripts linked to known-bad hosts).


NoScript is configured by default to only allow scripts and "active content" to run from domains you allow. You can temporarily allow certain domains, or permanently allow them. So yes, it does address the CDN issue. If someone includes a script from apis.google.com, that script won't run until I permit content from that domain to run (and I only ever do that temporarily, and only if the site needs it). Likewise for any other domain: CDN, analytics, whatever.

I don't know much about Ghostery - NoScript addresses most of my privacy concerns around web browsing.


So this is something I wrote last week about how I've come to believe that it's really not appropriate to rely on vendor code on public CDNs for anything more complex than a JSFiddle.

http://daemon.co.za/2014/03/from-trenches-js-cdn-point-of-fa...

Basically, if you need to include something like jQuery, you have to have code that makes use of it. Since, by definition, you can't serve that code from a public CDN, you are going to need to serve those static assets somehow.

I now believe that you should serve the vendor code from the same 'place' as you serve your application's static assets, because by relying on external resources, you are adding more moving parts that all have to be in perfect working order for your app to actually be able to run.

This doesn't really have anything to do with how reliable the CDN itself is, but rather with how reliable the client's connection is.

I did read the jsDelivr post, and it actually looks like a really well-thought-out system. I just don't think I will use something like this for anything where I have the choice not to.

IMO, the possible benefits of using a public CDN don't outweigh the fragility that gets added. It just feels like it is trying to optimize best-case performance when the worst case is far more important.

I'm not against CDNs as a concept, though; I just think you should serve all the code that is needed for normal operation from the same one.


It's very easy to add a fallback, though. If the CDN goes down, load the local version.

> This doesn't really have anything to do with how reliable the CDN itself is, but rather with how reliable the client's connection is.

If they have the CDNed version of jQuery cached, then it doesn't need their connection to do anything.
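
A minimal sketch of that fallback pattern, assuming a jsDelivr-style URL and a self-hosted copy at a path of your own choosing (both are illustrative, not taken from this thread):

    <!-- Load jQuery from the CDN first. If the request fails, window.jQuery is
         undefined and we fall back to a local copy. -->
    <script src="https://cdn.jsdelivr.net/g/jquery@1.10.2"></script>
    <script>
      window.jQuery || document.write('<script src="/js/jquery-1.10.2.min.js"><\/script>');
    </script>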


You could just have loaded the local file the first time and had it work every time, without all those ifs and maybes involved.

It can take several seconds for the request to fail, during which your application won't have been loaded or working.

The fallback adds complexity to your code, which ultimately didn't need to be there.

I just think that it's better to reduce the ways that your application can break, before trying to make it a little bit faster for the cases in which all the conditions are met.

On principle, even in the theoretical best case scenario, I think it's not the right approach to be taking.

I also don't think that the best-case scenario can be reliably enough predicted to be something you should try to optimize for. There are many CDNs and many different versions of many different libraries. Especially on mobile devices, cache size is limited.

IMO, at least.


You're overthinking this. There are benefits to using a CDN beyond caching; you exaggerate the risk of the CDN going down, and the effort it takes to create a fallback for that. For most applications, a widely-used (cached) free CDN is a no-brainer and is nice to your end users.


Not the CDN going down; the client being unable to reach the CDN.

The HTTP request to the CDN to get the file is not a 'free' operation. Network timeouts take time.


I mean the CDN going down from the end user's point of view. Why the host is unreachable is a moot point.


I see this daily, but then I don't have a sub-20ms ping time and a cable modem where I live. The CDN doesn't have to go down. The user's connection doesn't even have to go down. Things just have to get momentarily flaky _somewhere_.

I'm working on a project that uses Google Fonts, and the page loads with fonts timing out multiple times per day. I get to sites where just the HTML loads and none of the CSS (or worse: just some of the CSS) all the time. Especially on 3G, but regularly on ADSL too. Or what if you're sharing a saturated line?

(And waiting for a timeout to kick in before you trigger a callback is just not an option. By then the user probably closed the page anyway. Rather just make it work in the first place.)


You should optimize the common path, not the edge cases. Most of the time the user benefits from using a CDN (cached content = faster load). On the other hand, the CDN being unreachable is not a very common scenario, so it is acceptable to be a little slower in that case.


CDNs are already an edge-case optimization, provided that the number of unique visitors is less than the number of visits.


> it can take several seconds for the request to fail

But if the client already has it cached it doesn't even make a request. CDNs have long expiries because the content of the JS lib will never change.

> There are many CDNs and many different versions of many different libraries.

But you're only using one.

> Especially on mobile devices, cache size is limited.

All the more reason to use a CDN.


The common case is that the user doesn't have the file yet and then has to do another DNS lookup (for the CDN) and establish another HTTP connection (to the CDN).

There are so many public CDNs now, and so many versions of libraries scattered about, that the chances that the user already has the resource cached are rather slim.

Your build tool might as well have combined it all and sent the same "cache this forever" header. And in the common case that would have come from the same host as some other files, so one DNS lookup, one HTTP connection.

I think public CDNs optimise an edge case for little benefit. But maybe it is clearer to me because I always have high latency where I live and my connection speed can be described as "medium" at best. I see sites load only some of their resources, due to files being split across CDNs, on a daily basis.

I can understand that if you have a sub-20ms ping time to your local CDN and a cable modem this might be difficult to understand.

Making everything work or everything fail together is a lesson I've learned many times.


I think your statement is a bit extreme. While relying exclusively on a CDN to serve your static assets is risky, doing it using require.js with a local fallback works very well. You still benefit from CDN caching, while preventing your assets from becoming unavailable (even in China).


Really? Have you ever sat through the experience of waiting for the primary source to time out so that the fallback can kick in?


Ultimately, the timeout issue will be problematic for countries forbidding access to CDN servers (e.g. China).

For the rest of the users, there are three cases:

- The user has accessed the application in the past (or any other web page that uses the CDN): a cached version of the library is served by her browser.
- The user has never accessed the page and the CDN is unavailable:

  * either the Internet connection has failed, and our application won't be available anyway;
  * or the CDN is indeed down, and the user has to wait for the timeout before her browser fetches the local fallback.

The third case sounds extremely rare (look at the statistics of major CDNs out there; they often have better uptime and response time than your own servers). And the advantages provided by the first case do more than compensate, in my opinion.


That's just added complexity, where it could all just work.

Or fail consistently at the same time if your server isn't accessible. In that case it doesn't matter that the public CDN is still accessible.

I just don't think the possible benefit of public-CDN-hosted code can justify how many moving parts it adds.


It does not add complexity if your project already makes use of require.js; it is literally specifying an array of paths for a library instead of a simple string (see the sketch after this comment).

If my server is not available, my web application will ultimately fail anyway.
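
For reference, a minimal sketch of that array-of-paths idea with RequireJS; the CDN path and the local "lib/jquery.min" location are assumptions for illustration, not taken from the thread:

    // RequireJS tries the paths in order: the CDN copy first, then the local
    // copy as a fallback. enforceDefine helps load failures get detected.
    requirejs.config({
      enforceDefine: true,
      paths: {
        jquery: [
          '//cdn.jsdelivr.net/jquery/1.10.2/jquery.min', // CDN copy (no .js extension)
          'lib/jquery.min'                               // local fallback
        ]
      }
    });

    require(['jquery'], function ($) {
      // $ is whichever copy loaded successfully
    });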


If you are worried enough about performance to start using external resources in this way, you are probably going to benefit much more by using r.js and a build step to build a single (or a few partitioned) JavaScript files.

https://developers.google.com/speed/docs/best-practices/rtt#...


We do use r.js and create modules (a main one that gets downloaded systematically, and smaller ones for less-used features). However, as we update often, we don't want our users to re-download the large third-party libraries we use every time, so having them as separate downloads is actually much better.

For the large libraries we use that are hosted on a CDN (with a local fallback), we use the "empty:" value for the optimiser and they are excluded from our modules.
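
A sketch of a matching r.js build profile; the file names are invented for illustration, and the special "empty:" path value is what keeps the CDN-hosted library out of the combined file:

    ({
      baseUrl: 'js',
      name: 'main',
      out: 'dist/main.min.js',
      paths: {
        // jQuery is loaded from the CDN (with a runtime fallback),
        // so exclude it from the optimised bundle.
        jquery: 'empty:'
      }
    })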


If you spread your resources over multiple domain names, the client can download more files in parallel.



You can shard your own domains too.


[deleted]


Funny you mention that! Posted today: http://www.w3.org/TR/SRI/


The same thing that is used for integrity checks could be used as a DHT hash, potentially allowing people to download the file from whichever peers have the lowest-latency connection, or from whichever CDN they find the most reliable.

The same hash could be used similarly to etags, with additional interesting properties. You could, for example, use it to safely retrieve an item that was cached for another website (e.g. the latest jQuery). Not only would it improve performance, it would also make user tracking somewhat more challenging for CDNs, improving privacy.

The potential here is huge, so it's disappointing that these ideas are scattered across multiple unrelated specs, and sometimes missed entirely.
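
For illustration, the kind of markup the spec enables; the attribute syntax shown here is the eventually standardized form and the digest is a placeholder, not something from this thread:

    <!-- The browser fetches the script from the CDN but refuses to execute it
         unless its hash matches the integrity attribute. -->
    <script src="https://cdn.jsdelivr.net/g/jquery@1.10.2"
            integrity="sha256-BASE64_DIGEST_OF_THE_EXPECTED_FILE"
            crossorigin="anonymous"></script>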


You're correct, and all of those use-cases are mentioned in the spec, so I'm optimistic. This is from some of the guys involved in CSP, and that spec has been deployed well. Let's just make sure we voice our support for it.

EDIT: sorry - they mention caching and using fallback sources, they don't mention any DHT.


It's great if they consider those use cases.

I couldn't find anything that mentions using the hash itself to locate the requested document (in DHT network or in local cache) in that spec. Did I miss it?


Yeah, corrected myself just as you replied.


Can I work for Mozilla? I feel like they're one of the few organizations doing cool things for the common good. Keep it up!


They may be, but JsDelivr appears to have no deep affiliation with Mozilla:

> Who is behind jsDelivr?

> Basically just me, Dmitriy A. or @jimaek. But a lot of other people are involved, giving their advice and helping me out. I always refer to jsDelivr in the plural sense because of this.

It is sponsored, amongst others, by CloudFlare and MaxCDN, where @jimaek is also employed. But the only connection to Mozilla seems to be that it's featured on the Mozilla Hacks blog.



I don't understand how this is paid for -- can someone explain?


> I should also point out that MaxCDN, CloudFlare, Cedexis and the rest of the companies sponsor jsDelivr for free.

Personally, I don't trust such services any more than I trust cloud hosting, which means not at all. I have no way to make sure that they serve these files unmodified, that they aren't neglecting their servers to the point where they're hacked and serve malware to my users, etc.


I would be happy to address these issues. You have no reason to believe me, but all custom servers are secured and are regularly updated.

For CloudFlare and MaxCDN I have two-step authentication enabled, and for MaxCDN a single whitelisted IP address.

Regarding unmodified files, I guess we can build an app to monitor them? If you want, drop by https://github.com/jsdelivr/jsdelivr and we can discuss this further.

I would be happy to do anything possible to ease your concerns regarding security.


Do you have any sort of agreement that they will serve these files bit for bit unmodified?

I had a problem with CloudFlare inserting some tracking cookies into my static HTML files. They claimed it was DDoS protection. To me (and the EU cookie directive) it looked no different from the garbage other analytics sites use.

If jsdelivr providers are allowed to modify the files they serve, I won't use it. Got any sort of guarantees?

Thanks.


FWIW, the only "CDN" I've ever heard of to regularly pull stunts like that is CloudFlare; and really, that's their angle: it adds latency (which has been demonstrated in various commentary on the service) with the goal of modifying content to reduce the number of requests or improve client-side rendering times. It is more of a "content optimization" service than a "content delivery" service. If you want a CDN, the tradeoffs (number of edge nodes, latency, cache sizes) are much better with other providers.

Sometimes, the stuff they inject also has horrible bugs ;P. One time, for an entire day, they were managing to lock up Safari entirely. Cydia is mostly a web browser, and one of the companies I work with apparently used CloudFlare, so Cydia suddenly stopped working that day in a way that was pretty catastrophic. I did a writeup on the process of discovering the bug (which I had to report to CloudFlare to get fixed: I don't even think they really had the expertise in-house to figure out what happened).

http://www.saurik.com/id/14


Good stuff! If you write this sort of article every so often, please set up an RSS or Atom feed.


They are not allowed to modify the files. CloudFlare had to deploy a special fix to disable all cookies and security functionality completely on my account. Only after the fix did I enable their CDN.

MaxCDN does not modify the files in any case.


I think with CloudFlare you should really get an agreement that states some kind of penalty if that "special fix" ever accidentally breaks (their engineers seem pretty fast and loose with agile code changes that affect their customers' sites).


I think so too. I'm happy and reassured to hear that assets are served unmodified now, but do you have a guarantee that it will remain that way?


Modifying already hosted files is unacceptable. The fix by CloudFlare was custom code that disables the Control Panel features completely. Even if I or someone else enables them, it won't do anything.

I will be in contact with them to make 100% sure the fix won't be reverted in any case.


To be frank, I'd like to be able to believe and trust you and work myself in a climate where this is generally safe to do. But we're no longer in the early 90's, so we can no longer base technical decisions that concern security on trust in other people's decisions and behavior (esp. when they are not on our payroll). Nor can we trust contracts and legal assessments when even governments are breaking the law.

There are no technical measures that can prevent you or the companies hosting these files for free from serving modified JS arbitrarily, or on some authority's demand, to everyone or to selected individuals. If we could put a checksum in the <script> tag, we'd be fine (to some extent, provided collisions are really hard to find - so no MD5, please), but we can't, so we aren't.

Like many other web site owners, I believe security to be more important than my bandwidth or a few more tens of milliseconds of loading time, so I can't be convinced easily to use a public CDN for JS.

(and yes, I still have to trust other people for not backdooring other software and hardware I'm using, but I try to keep the attack surface as small as possible)


Here's one security concern - What procedure do you follow when accepting files for inclusion in jsdelivr?

I mean, what's to stop this scenario:

1. Attacker uploads a poisoned version (say, with an XSS vulnerability) of a popular library to an official-sounding GitHub repo.
2. Attacker raises a GitHub issue with you, asking you to put it on jsDelivr.
3. You assume the attacker is a legitimate contributor or user of the library and add it to jsDelivr.
4. Other sites start using the poisoned version of the library.
5. Attacker can now carry out XSS attacks on the sites using the library.

I have another security concern about www.jsdelivr.com (which I hope is totally separate from the CDN?) but I'll email that to you.


I validate all submitted libraries. I try to do size and MD5 validation for everybody. The cases where I do minimal validation are when the author himself submits his library, and for trusted people I can skip the MD5 check.

But once the auto-update app comes online, these issues should become obsolete.

www.jsdelivr.com is completely separate from the CDN. Plus, the code is open source, so you can actually see how it works.


Add to this privacy concerns. I un-block such 3rd-party CDNs if and only if it breaks specific web sites which I really want to un-break.


Cedexis, MaxCDN and CloudFlare all offer their services without any limits and for free. Same for all the custom locations we have from hosting companies.

We are on excellent terms with all of our sponsors, so no issues there.

The rest (domains, SSL, hosting, freelancers, and other costs) I pay for myself.


I wonder if you could somehow get Google as a sponsor too?

Maybe someone could convince them to adopt your multi-CDN technology for web fonts? The reason I want this is that in China the GFW periodically hangs connections to Google's IPs for 90 seconds after certain keywords are searched. This causes any site that uses Google web fonts to hang until the connection is resumed.


I would love to have Google join jsDelivr, but it's hard to get in contact.

I guess Google is the one who knocks :p


The problem I have with this is the same one I have with all public JS CDNs: your project doesn't just use public JS files. You are almost guaranteed to also have custom JavaScript (which could itself be open source or not; irrelevant), CSS, possibly fonts, images, etc.

You can't put your static files on that public CDN. So either you have to serve them yourself or you have to put them on your own CDN (either one you run yourself or a managed one).

So you have an additional point of failure (because your static files are coming from at least two places rather than one) which multiplies the chances of something going wrong for little benefit in my opinion.

Deployments are likely to be much more complex too, etc.

And before someone mentions the limit of how many concurrent requests a browser will make to a host: that's also irrelevant. I'm not arguing that you should have loads of little files. You should still be combining things with build tools. In fact, if you host those "vendor" files on your own web server or CDN, you actually have more flexibility around how to combine them. And you can still have multiple subdomains and IP addresses or whatever pointing to the same CDN to try and optimise around that if you want.


This is really cool. Up to the section on Advanced Features, it seemed like a clone of Cloudflare's cdnjs. But the multiple request feature is absolutely terrific, and I can't wait to use it.

EDIT: I wonder if the multiple-request URLs mean that the individual assets won't be cached, which is a big selling point of CDNs.


Grouped URLs are cached as always but for less time. You can check the headers.


I think the parent's point is that people will be less likely to have a warm cache, because it's less likely any other site has requested the exact same group of files.

(Though there's debate as to how much that matters.)


Personally I think the only file that is likely to be already cached is jQuery from Google. So for the rest it doesn't matter. Just my opinion.


There is cdnjs.com, which offers the exact same features and has been available for quite some time.

I regularly submit new libraries and update the current ones through their Github repository.

The multiple-files-per-request feature is nice, however, but contributing to cdnjs might have been a better way of proposing it.


But according to the benchmarks, cdnjs.com is worse than Google and jsDelivr: http://www.cdnperf.com/#jsdelivr,google,cdnjs/http/90


Exactly, we use multiple CDNs. We are the only project to do that. This is why our performance and uptime will always be better.


Better than Akamai? I think not.


Akamai does not offer a public CDN, so they are not a competitor. Apples and oranges.


If by "public" you mean free, correct.

Akamai is the Cadillac of the CDN world, so more like apples and organic gourmet apples.


Just be careful with those benchmarks; cdnperf is run by jsDelivr. An uptime and speed monitor based in one location is a pretty flawed way of analysing the performance of a CDN. As for Jimaek's comment above, "multiple CDNs" do not guarantee better performance either.


I guess they are using Pingdom, which has more than 50 locations for testing, but yes, it would be nice to see per-location data.


This is correct. cdnperf uses multiple locations provided by Pingdom and also makes sure all CDNs are tested from the exact same locations. If you have any suggestions, you can submit an issue over here: https://github.com/bebraw/cdnperf


Thanks for the link; we will be experimenting with cdnperf in the coming weeks and will make the move if we are happy with it.


I'd like to see something like a CDN, but for completely dynamic content. Something like a system that could host instances of a game server, with a framework that ensures clients make connections with very low latency.


Huh? The game server instances would still need to sync between themselves, so this really wouldn't address latency.

Rather than having the latency between the client and the server, you'd instead have latency between the two servers.


> Huh? The game server instances would still need to sync between themselves, so this really wouldn't address latency.

No. If you design the game properly, they actually don't have to coordinate very much at all.


For what kind of game? For something like a FPS this requires significant coordination.


How much coordination is there between, say, different Team Fortress 2 servers?


> There are no popularity restrictions and all kinds of files are allowed, including JavaScript libraries, jQuery plugins, CSS frameworks, fonts and more

So what is 'Js' about it? Has the 'i' prefix become old-fashioned?


jsDelivr is JavaScript-centered, but it allows all kinds of projects too. For example, the combination feature will only work with JS projects, and we have all kinds of plugins for JS developers.

But we still want to help open source projects take advantage of our network. This is why we allow even Windows binaries and other stuff too.


The 'i' prefix would risk unwanted attention from Apple.


I am worrid about wb sits with misspelld nams.

Mistakn usrs somtims correctd missd lettrs, so thy downloadd pags from unrelatd servrs.

Howevr, the spc thy postd mentiond othr use cass, and presentd exampls of anothr systm with enterprisy qualitis to delivr usrs to correctd wb sits with misspelld nams:

A clevr browsr evn maks guesss, which enabls fils with unrequestd but correctd nams to be downloadd in paralll ovr integratd distributd per to per technologis, oftn receivd replis cachd, and latr presentd.

Ys, yt again, browsrs designd by committs enabld Opn Sourcs and Fre Softwars to answr anothr complicatd Internt problm!


Neat, but I couldn't get three libraries with minified JS:

    https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,angularjs@1.2.10
And I couldn't get Bootstrap CSS, even though they say they also host CSS.

Also, it's not clear whether or not you can mix CSS and JS in the query; it would be neat to do that and then insert the CSS with JavaScript (or maybe that would be a bad idea).
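
If you did want to inject the CSS from JavaScript, a minimal sketch would look like this (the URL is illustrative; whether this is a good idea is another matter, since the styles only apply once the script runs):

    // Append a <link> element so the stylesheet is fetched and applied.
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = 'https://cdn.jsdelivr.net/bootstrap/3.0.3/css/bootstrap.min.css';
    document.head.appendChild(link);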


jQuery was deployed from the official source. And it does load a map file, suggesting it's minified. I'm not sure why it looks like this.

You can mix CSS and JS in one query, but it's up to you to make it work.

To load Bootstrap CSS you need to specify the exact file, like so:

https://cdn.jsdelivr.net/g/bootstrap@3.1(css/bootstrap.min.c...

This is because the main file for this project points to a JS file.


Thanks, I didn't see the files were in sub-directories for Bootstrap.

Here is my final Javascript query: https://cdn.jsdelivr.net/g/jquery@1.10.2(jquery.min.js),boot...


You don't need to specify the exact files in this case. This code will load the exact same thing: https://cdn.jsdelivr.net/g/jquery@1.10.2,bootstrap@3.0.3,ang...

You need to specify the files if you are loading files other than the one specified in the "mainfile" parameter, or if you are loading multiple files together.


The original file was not minified: https://github.com/jsdelivr/jsdelivr/pull/478


I currently use cdnjs, but I'm very interested in the single HTTP request and simpler versioning that jsDelivr is offering, so I will start using this for my current project.

Most of the libraries I use are already on there, and I like the fact that at-a-glance information is shown for each library. I also noticed there's a duplicate, misspelt typeahead library listed.


I'd like something like this integrated with a module system. For example, I could just say "var jquery = require("jquery")", and the script would be automatically downloaded from the CDN.


I suppose someone could write something like this on top of the API: https://github.com/jsdelivr/api
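
A rough sketch of the idea, using the cdn.jsdelivr.net/g/ URL format shown elsewhere in this thread; the version map and function name are made up for illustration, and a real implementation would want caching and error handling:

    var versions = { jquery: '1.10.2', bootstrap: '3.0.3' };

    // Resolve a library name to a jsDelivr URL and load it via a <script> tag.
    function requireFromCdn(name, onLoad) {
      var script = document.createElement('script');
      script.src = 'https://cdn.jsdelivr.net/g/' + name + '@' + versions[name];
      script.onload = onLoad; // fires once the library has executed
      document.head.appendChild(script);
    }

    // Usage: requireFromCdn('jquery', function () { console.log(window.jQuery.fn.jquery); });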


browserify-cdn (http://wzrd.in/) and http://requirebin.com/ are heading in this direction.


Isn't that what Google does? https://developers.google.com/loader/


I must say that the process for adding files is too complicated and time-consuming.


I know. I hope the auto-update app will fix this.

We are also thinking about new ways to add files: https://github.com/jsdelivr/jsdelivr/issues/347


Why not implement origin pull?


Origin pull from GitHub?

We need to maintain our own structure to allow features like file grouping and version aliasing.


How is this different from cdnjs? I already use that, and it too is on CloudFlare.


2004 calld. Thy want their missing "e" back.


You can use this domain if you prefer http://www.jsdeliver.com/ :)


Ys, I prefr.


How about not leaking to third-parties? I value my privacy and I wish Mozilla would too.


Sorry, I don't get the comment.

Leaking what? Your publicly available JS libraries? Or is this a more general Mozilla-bashing comment?

The service itself looks great, and the versioning is exactly what I'm after.


Users requesting files from a CDN leak metadata.

I guess browsers could provide a separate cache for CDN domains that's _really_ long-lived (there's a certain guarantee that files won't change there) and also send requests there with no referer at all times.


Sorry, I thought it was obvious.

Requesting assets from third-party websites leaks to that third party the fact that I have been visiting your site. Now imagine having a big CDN that many websites use, and you can spy on a lot of unsuspecting users.


Surely the correct solution to that is to control the information your browser will send to the CDN, not to try and stop web designers from using CDNs for everyone. The way requesting assets leaks the page you're visiting is through the referer header, so if you want to prevent that, use an add-on that lets you control that header, like https://addons.mozilla.org/en-US/firefox/addon/referrer-cont... .


Your point is well-taken, but I suspect 'aw3c2 might be concerned about the CDN "fingerprinting" multiple-resource requests (so that a particular exact combination of files is known to be required by a small [possibly singular] set of sites), which isn't addressed by your referrer suggestions.


"Surely the way to protect your passwords is to use different passwords on multiple websites; ergo I don't need to hash my password database".

Web developers have a responsibility to protect their users from threats they don't even know exist.


I don't think making the entire toolchain more complex with extra edge cases that would need some kind of centrally managed whitelist is more correct than "just don't do that".


Just don't do what? Receive the information you are sending them?

If you don't want someone to have this information, the sensible thing is not to offer it. I cannot conceive of any way in which sending the information and then yelling at the recipient for receiving it is "more correct" than just not telling people things you don't want them to know.


Why don't site operators take responsibility for themselves to not leak their users' information unnecessarily? We've taken it upon ourselves to chastise any web property that doesn't properly hash their passwords; we could just as easily say "if you don't want your password to other sites leaked, use different passwords". But we recognize the unfair burden we are placing on end users in that case. The case with CDNs and information leaks is similar.


Robert here, Editor of Mozilla Hacks. We cover all kinds of resources for developers, especially with a strong focus on open source and open technologies. If you are interested in knowing more about privacy, please get in touch with the people behind jsDelivr, write a comment on the blog post, or similar. Thanks.


Hi Robert, sorry I did not realise that this was not a Mozilla project. The post sounds like it is and the site suggests it is a "hack" by Mozilla.


No worries. The name of the web site is Mozilla Hacks, and we talk about everything open source/Open Web. I've added a note at the beginning of the post to be more clear.




