
Pika CDN – A CDN for Modern JavaScript - vvoyer
https://www.pika.dev/cdn
======
devalnor
Very nice service, but how do you manage subresource integrity (SRI) if the
packages are different for each individual user?

It's possible to check a subresource with ES6 modules, but only if you know the
hash first
([https://stackoverflow.com/questions/45804660/is-it-possible-to-use-subresource-integrity-with-es6-module-imports](https://stackoverflow.com/questions/45804660/is-it-possible-to-use-subresource-integrity-with-es6-module-imports)).

Even webpack won't handle this case with webpack-subresource-integrity
([https://www.npmjs.com/package/webpack-subresource-integrity](https://www.npmjs.com/package/webpack-subresource-integrity)).

Of course HTTPS is strong, but it's not a foolproof solution against
man-in-the-middle attacks.
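
For context, SRI today only covers the top-level script tag; there is no standard way to attach a hash to a nested `import`. A sketch (the hash is a placeholder, not a real digest):

```html
<!-- The browser verifies this file against the hash before executing it. -->
<script type="module"
        src="https://cdn.pika.dev/preact/v8"
        integrity="sha384-PLACEHOLDER"
        crossorigin="anonymous"></script>

<!-- But any import statements inside that module fetch further files with
     no integrity metadata at all, which is exactly the problem when the
     CDN serves a different build per user. -->
```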

~~~
tenaciousDaniel
I could probably google this but I'm not well versed in security stuff. What
would a MITM attack against HTTPS look like?

~~~
colde
There would be a number of ways to do this:

\- Strip SSL by, for instance, blocking port 443 and hoping the client falls
back to HTTP.

\- Get your own root certificate installed on the equipment of the user you
are attacking. This is fairly common in corporate environments, for instance.

\- MD5 collision attacks (although almost every certificate would be
SHA-signed these days)

~~~
allset_
HSTS prevents the first of these if the client has connected to the server
previously.

Chrome also hasn't trusted certs with MD5 since version 65.

~~~
snug
HSTS Preload prevents that from happening even if the user has never visited
the site.
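
For reference, preload is opted into via the same response header (the site must also be submitted to the browsers' preload list at hstspreload.org, which requires `includeSubDomains` and a max-age of at least a year):

```http
Strict-Transport-Security: max-age=63072000; includeSubDomains; preload
```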

------
no_wizard
My company would buy into this if there were some kind of boilerplate contract
you sold to a business/institution vs the Patreon page you have right now.

Just FYI, you're missing out on some dollars because of that. For better or
worse, the bean counters at my work place won't approve anything less. I have
a feeling I'm not alone.

If you can quickly whip up some boilerplate business checkout with an invoice,
you'd make more than a few dollars today.

~~~
Spivak
Forget the bean counters, the technical people _shouldn't_ approve of this
without some sort of SLA or business agreement. How could you feel comfortable
depending on a 3rd-party service you have no actual business relationship or
SLA guarantees with?

They're seriously leaving money on the table, businesses would have no problem
dropping 50k/yr on a service like this.

~~~
no_wizard
That's a good point, actually.

This is something we've been looking for at my job, but we don't have the
technical expertise to do it ourselves (a CDN is tricky business, and not in
our core competency). If you offered an SLA that included support and/or
customization, as well as a direct line to high-level support for feedback,
you could easily net 50K or more.

Seriously. I know my organization would be willing to pay even more than that.
If you're reading this Pika founders, you should really give this some
thought.

------
mrspeaker
This is fantastic - exactly what I've been looking for but didn't know I was
looking for it! I've been re-importing libraries by doing something weird
like: `import './three.js'; export default window.Three;` so I can use it as a
normal module.

I love not having to use build tools for my personal projects anymore -
everything feels so light and "old school". Here's my Minecraft-ish clone in
native modules and WebGL2:
[https://github.com/mrspeaker/webgl2-voxels](https://github.com/mrspeaker/webgl2-voxels).
No dot files, nothin' to build... just view source!

~~~
Ruphin
That's a really neat project! Thanks for sharing

~~~
MentallyRetired
Indeed, thumbs up. This should be a post of its own.

------
tthisk
This is a very good development for frontend work. Build systems like webpack
were useful technology in earlier days, but they present a big hurdle for
newer, less experienced developers entering the frontend space today. I would
love to see a future where we can again serve a frontend in development by
simply running a webserver from a folder.

I do wonder how modular CSS fits into the picture of ES modules, though.

~~~
giancarlostoro
I have been coding frontend, backend, and other system things since at least
2007, and the state of web development scares me. I shouldn't need a build
tool to get JS on a website; I should be able to just run JS on a website. The
web went from extremely simple and usable to "oh my, what the hell is this?"
It gets worse when you code frontend on proprietary systems that are hard to
extend. Ever had to override something in Bootstrap with your own CSS files?

~~~
asdkhadsj
I mean, you don't have to use it, right?

I think the reason it is like that is because there were _many_ problems with
JS "back in the day", and so people felt they had to come up with solutions.

Just look at Svelte. It's a library about compiling JS to get back to the
"just JS" days. I mean, it's more than that, but it's about reducing runtime
complexity. So another way to think about it is that writing "old style simple
JS" is so convoluted that the author felt the need to write a translation
layer from modern frameworks to old style JS. Sort of a mindblower to me haha
_(though, I love and agree with Svelte, to be clear)_.

The great thing though is that you can still use plain old JS, right? Nothing
has changed for you if you don't want it. So is there really a problem?

This "modern web" stuff is just people solving problems. Some of these
problems are the fault of old JS/web. Some of them are problems of our own
making. Remember how amazing modern UI frameworks were? It's because we had
PTSD from horrible jQuery codebases. A problem of our own making.

So stick with what you like, and other people can use the more complex stuff.
It's a win win, no?

~~~
z3t4
I did classic server-side rendered apps, but experimented with pure JS apps
early on, when the only way to get persistence was to store things in cookies.
The "app" was just an HTML file on the desktop that you double-clicked to open
in the browser. Then came AJAX, and during the last fifteen years more and
more pieces have fallen into place to make web apps viable, like service
workers, local storage, add-to-desktop, etc. But JS apps are much harder to
develop than server-rendered apps, because you have to manage state, while
server-rendered apps are just a snapshot of the database.

------
youngtaff
If you've got a site with decent levels of traffic, host libraries yourself
rather than using a JS CDN.

Retrieving critical content from a 3rd-party CDN has a number of issues:

\- A new TCP connection has to be created, with the added cost of TLS
negotiation and its own slow-start phase

\- If you're using HTTP/2, then prioritisation only occurs over a single
connection, so it can't be prioritised against other content
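
If you do use a third-party CDN, one standard mitigation for the connection-setup cost is to start the TCP/TLS handshake early with a resource hint (the host here is just the CDN under discussion; `crossorigin` matters because module scripts are fetched in CORS mode):

```html
<link rel="preconnect" href="https://cdn.pika.dev" crossorigin>
```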

~~~
anderspitman
There are tradeoffs either way. For popular modules, it might already be in
your browser cache if another app used the module recently. Plus, being a CDN,
it's probably being served by a machine closer to the user. And if your whole
site is running off a CDN, then you're relying on a 3rd party anyway.

Benchmark for your users.

~~~
youngtaff
Based on
[https://andydavies.me/blog/2018/09/06/safari-caching-and-3rd-party-resources/](https://andydavies.me/blog/2018/09/06/safari-caching-and-3rd-party-resources/)
it seems highly unlikely that a popular module will be in the cache:

\- Safari double-keys its cache to prevent 3rd parties tracking users across
sites

\- Usage of common libraries is just too low for there to be a critical mass

A whole site running on a CDN still involves only one connection, will make
use of the throughput gained as the TCP congestion window grows, and a decent
CDN is likely to be more reliable than the origin.

------
unilynx
`curl -i https://cdn.pika.dev/preact` redirects me to a dist-es2019 package (I
assume because it detects my user-agent supporting that) but isn't showing
anything like a `Vary: User-Agent` header.

Won't this break for any situation in which users with different browsers
share a proxy server?

(also tried with Chrome, didn't see a Vary there either)
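
For a shared cache to store per-browser variants safely, the redirect would need to be keyed on the browser, roughly like this (illustrative response; the path and cache lifetime are made up, not what cdn.pika.dev actually sends):

```http
HTTP/1.1 302 Found
Location: /preact/v8/dist-es2019/preact.min.js
Vary: User-Agent
Cache-Control: public, max-age=86400
```

In practice `Vary: User-Agent` nearly defeats shared caching anyway, because UA strings are so diverse that almost every client gets its own cache entry, which may be why CDNs tend to avoid it.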

~~~
e12e
Don't worry - between TLS and HTTP/2 you don't get to use a proxy anymore
anyway... :/

------
bufferoverflow
So how is that different from any other CDN?

CDNJS is actually orders of magnitude more likely to have a cache hit than
this new offering.

~~~
manigandham
It's described in the first paragraph: it acts as a proxy to serve ESM JS
files (for any top-level package that's already in ESM syntax) that modern
browsers can use natively without a build/transpile step. Looks like they
recently added automatic polyfills for older browsers too.

CDNJS is just a standard CDN that serves up files as they're packaged, but if
you want a high hit ratio then [https://jsDelivr.com](https://jsDelivr.com)
currently has the most market share, and more features.

~~~
bufferoverflow
"CDNJS" returns 5X more results in Google compared to "jsDelivr", so you're
more likely to get a cache hit.

Are you sure it has the most marketshare?

~~~
manigandham
A Google search for the name has nothing to do with the CDN's network coverage
or traffic volume.

CDNJS only uses Cloudflare; jsDelivr has a bigger network with more partners,
and supports both npm and GitHub.

~~~
bufferoverflow
Google search tells you how popular it is, and that's what matters for getting
a cache hit. Network coverage doesn't affect that.

You can also try Google Trends, which tells you CDNJS is around twice as
popular:

[https://trends.google.com/trends/explore?geo=US&q=CDNJS,jsde...](https://trends.google.com/trends/explore?geo=US&q=CDNJS,jsdelivr)

~~~
penagwin
Google search can only tell you how popular something is to search for and how
much content there is _about it_. They aren't going to show you a website just
because there's a src="cdn.example.com" in the code, only if it's in the text.

To actually know, you either need them to report their number of users and
trust them, or scrape tons of websites and check their source for which CDN
they use.

~~~
bufferoverflow
You can also search Github.

CDNJS: 18M references

jsDelivr: 1M references

I showed you 3 different sources that prove CDNJS is much more popular. You
showed zero so far.

~~~
IanCal
You may be right that CDNJS is more likely to result in cache hits, it's just
that _none_ of these figures actually help in finding out the answer.

CDNJS being in 1000 small github blogs could easily be less impactful than a
single large website using jsDelivr. We have no idea if the github projects
are even used.

Again, it may well be the case that it's better in this way, but these figures
show nothing really either way.

Here's a bit of an attempt at looking into it more, though I don't know their
methodology:

[https://w3techs.com/technologies/details/cd-jsdelivr/all/all](https://w3techs.com/technologies/details/cd-jsdelivr/all/all)

[https://w3techs.com/technologies/details/cd-cdnjs/all/all](https://w3techs.com/technologies/details/cd-cdnjs/all/all)

Ones that jump out there are dailymail and yelp; depending on your expected
users, you might expect one or the other to do better for you.

~~~
bufferoverflow
These links also support my claim. Again, you posted zero evidence to the
contrary.

~~~
manigandham
The thread is about cache hit ratio which is not that simple. A file is either
cached at an endpoint or not, regardless of whether it's downloaded once or a
billion times. CDNJS only supports ~3k libraries and gets most of its usage
from jQuery and FontAwesome.

jsDelivr has an automated backend proxy to support any NPM package, Github
repo, or Wordpress.org plugin. It also uses Cloudflare as one of its backends,
so at worst it's at parity with CDNJS in cache hits or far better due to more
network partners, more global regions, and more packages from more origins.

Anything on CDNJS is also likely cached by jsDelivr, but most of everything
cached on jsDelivr is not even available on CDNJS.

~~~
IanCal
The other side, though, is whether something is cached at the client, right? I
thought that was what people were talking about with cache hits.

------
santialbo
Does it inspect your code for potentially needed polyfills or do you need to
specify a list of polyfillable features that you have used?

~~~
JasonSage
It inspects the code of the package a request was made for and polyfills the
features it uses. You get the module + polyfills in one response.

------
raxxorrax
The differential serving sounds like a neat idea. Naturally, everyone not
using the newest version of Firefox or Safari will go to hell eventually, but
until then it could really improve the web for a lot of people.

------
anderspitman
This is way cool. I recently started a new app and decided to see how far I
could get without a build tool. My early impressions left me wanting to write
a blog post titled "ES Modules Make JavaScript Fun Again." The whole
development cycle felt clean and simple. Ultimately, though, I got hung up on
dependencies. For a while I was just including things directly from
node_modules/. But npm flattens things so that library locations are not
predictable (this crops up when an ES module dependency tries to look in its
own node_modules/ directory for another ES module dependency, but that
dependency has actually been flattened to the top level). So you're basically
stuck downloading all your dependencies (and their dependencies) manually.
This isn't 100% a bad thing: it pushes you to use smaller dependencies with
fewer sub-dependencies. You're also stuck using libraries that export an ES
module. Pika could be just the ticket to bridge these gaps.
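
One build-free workaround for the bare-specifier/flattening problem described above is an import map, which tells the browser where to resolve module names from. A sketch (the paths and the lodash-es package are illustrative, and import maps aren't supported in every browser yet):

```html
<script type="importmap">
{
  "imports": {
    "lodash-es": "/node_modules/lodash-es/lodash.js",
    "lodash-es/": "/node_modules/lodash-es/"
  }
}
</script>
<script type="module">
  // The bare specifier below is resolved via the map above,
  // so the dependency's physical location no longer matters.
  import { debounce } from 'lodash-es';
</script>
```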

------
cjblomqvist
How many bytes are typically spent on compatibility with older browsers? Has
anyone done any research into this?

~~~
manigandham
It varies a lot depending on the browser features you're using, total script
size, minification vs gzip compression, cache hit rate, etc.

My company is in adtech so our final bundles are 14kb (single TCP congestion
window) for modern browsers, 30kb for Safari/iOS 10, and 75kb for IE11/IE10.
We've seen similar doubling-of-size in other libraries for backwards
compatibility, although we can probably drop IE10 soon and cut IE11 down by
half.

------
ollerac
This wouldn't work with a standard React project though, right? Because you
still need to transpile JSX. You could, I guess, use the development version
of React, which is slower but can understand JSX; that's not something you
want to ship, though.

I'd love to use something like this for teaching, tutorials, and even small
projects, but there are some things I still need a transpiler for.

I also realize I could use the `htm` package instead of JSX, which has a lot
of benefits over JSX, including not requiring transpiling, but since it's not
widely used by the wider ecosystem, I'd be a little hesitant to include it in
my projects.

~~~
sdegutis
Check out
[https://www.pika.dev/packages/htm](https://www.pika.dev/packages/htm) which
solves exactly that problem. Personally I like it so far.
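
For anyone wondering why htm needs no build step: JSX such as `<p class="greeting">Hello</p>` must be transpiled into a call like `h('p', {class: 'greeting'}, 'Hello')`, whereas htm uses standard tagged template literals, parsed at runtime. The toy below is NOT the real htm API or parser; it's a minimal sketch that only handles one static element, just to show the idea:

```javascript
// Toy hyperscript factory: returns a plain vdom-style node object.
const h = (type, props, ...children) => ({ type, props, children });

// A deliberately tiny tagged-template "html" function. Real htm is a full
// incremental parser with interpolation support; this only understands
// <tag attr="value" ...>text</tag> with no nesting or substitutions.
function html(strings) {
  const src = strings.join('').trim();
  const m = /^<(\w+)((?:\s+[\w-]+="[^"]*")*)>([^<]*)<\/\1>$/.exec(src);
  if (!m) throw new Error('toy parser: unsupported markup');
  const props = {};
  for (const [, k, v] of m[2].matchAll(/([\w-]+)="([^"]*)"/g)) props[k] = v;
  return h(m[1], props, m[3]);
}

const node = html`<p class="greeting">Hello</p>`;
// node: { type: 'p', props: { class: 'greeting' }, children: ['Hello'] }
```

Because template literals are plain standard JavaScript, the browser parses this directly; there is nothing for a transpiler to do.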

------
ktpsns
[https://www.pika.dev/search?q=jquery](https://www.pika.dev/search?q=jquery)
\-- so jQuery is not "modern" any more? That's quite surprising, given, for
instance, the dependency of the
[http://semantic-ui.com/](http://semantic-ui.com/) framework on jQuery
([https://github.com/Semantic-Org/Semantic-UI/issues/1175](https://github.com/Semantic-Org/Semantic-UI/issues/1175)).

~~~
cdata
jQuery was originally designed and built in a bygone era (on the web/front-end
timescale), includes many features that have since landed in and/or been
normalized by the web platform, and doesn't leverage the modern JavaScript
module system.

Let's call it "retro"? :)

~~~
ktpsns
Unfortunately there is still a need for libraries like underscore.js, which
also covers some jQuery functions (those that are not DOM-related).

------
mfer
What is the business model? Where does the money come from to pay for the dev
and hosting? This is the question I'm left with.

Nothing is free, and I didn't find this on Crunchbase.

Something is paying for it. Is it tracking people and selling the data?

~~~
eternalny1
It says it right at the bottom of the page!

> Love Pika? Go Pro! Pika CDN will always be free, but you can support the
> project with a Pro Membership donation on Patreon. Get early access to
> upcoming production-only features.

~~~
pier25
This is wishful thinking, IMO.

What if in 5-10 years the volume is too big to be funded by donations? Will
Pika sell to a malicious company? Will it shut down and break every app that
depends on it?

~~~
true_religion
Well, yes, it will probably shut down, just like the official Python package
repo will shut down if its sponsors can't meet the budget. Nothing is
guaranteed to last forever.

~~~
pier25
The difference being apps are actually pointing to the packages in realtime.
If the CDN falls the app stops working.

------
playpause
Looks great, but I think the homepage should do more to convince me that I can
trust it. Who runs it, how is it funded, is there any guarantee they won't run
out of money and shut down, etc.

------
zimbatm
Can anyone explain how the differential serving works?

I get that they might have a User-Agent mapping to features. But how do they
know which features are needed by the loaded modules?

~~~
atonse
Probably based on the polyfill dependencies that are included in the packages.

------
neilv
Pika CDN seems to facilitate user tracking by the CDN better than the current
JS CDNs can (with simple browser privacy features that browsers should be
doing already).

Also, it wasn't clear to me whether they support SRI or an equivalent
supported by the browser. If they don't, it could also be a centralized
vulnerability for user-targeted injection.

(Solution: the best sites will pay to serve their own JS.)

------
codezero
I suggest you register all the bit-flipped domains. This is a must for all
CDNs, given the possibility of serving malicious JS from a bit-flipped domain.

------
wcdolphin
I love the idea of a more efficient CDN for JS (and code overall!), but it
isn’t clear to me how this handles the multitude of versions. None of the
examples seem to include versioning, which is a huge oversight IMO. A future I
see is IPFS for this sort of thing. All objects identified uniquely, but
cacheable by multiple entities.

~~~
justinrlle
Well, the first example looks like it includes a version:

    import {Component, render} from 'https://cdn.pika.dev/preact/v8';

------
z3t4
I built a repo like this but for require (CommonJS), where package
dependencies were sent along with the first request using HTTP/2. The only
problem was that browsers didn't cache the preloaded files and re-requested
them. Hopefully browsers will fix this, or latency will be a huge problem with
dependencies several layers deep.

------
indigochill
Just a comment on the name: as a Python dev, when I saw Pika I immediately
thought of the RabbitMQ Python package:
[https://pypi.org/project/pika/](https://pypi.org/project/pika/)

May or may not be an issue for this project. Just bringing it up for
visibility.

~~~
isubasinghe
Yeah, that's what came to my mind too, but the CDN part helped me realize that
this was something else.

------
itsbits
When I use a URL like that in an import, will bundlers like webpack
automatically download the JS modules?

~~~
marksomnian
From what I understand, the idea is that you don't use a bundler, but let the
browser download all the modules that your app needs - hence features like
"differential serving" (a.k.a. polyfills added if necessary, based on UA
sniffing).
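
Concretely, the no-bundler workflow is just a module script tag; the browser resolves and fetches the rest of the dependency graph itself (the preact URL is the one from the example elsewhere in the thread):

```html
<script type="module">
  // No webpack step: the browser downloads this module and anything it
  // imports, applying HTTP caching to each file individually.
  import { Component, render } from 'https://cdn.pika.dev/preact/v8';
</script>
```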

------
skybrian
There's a reference to the "browser's shared module cache". Anyone know what
that is?

------
symlinkk
Their example doesn't work for me - it's just blank. Looks like CORS issues?

[https://pika-cdn-example.glitch.me/](https://pika-cdn-example.glitch.me/)

------
Something1234
Isn't pica where you're eating things you're not supposed to? So by using this
CDN, your computer is eating things it's not supposed to?

------
snug
It would be great to make the packages immutable, so the maintainer of a
package can't change code already shipped to a website, malicious or not.

------
tzfld
If I were Google, I would create a service like this and slowly inject
tracking code into every package served.

~~~
mrspeaker
Well, that's exactly what _every_ CDN does: if you're a CDN, you don't need to
inject anything - people give away all their users' browsing histories by
making them download files from your servers... the analytics are just your
log files.

