This site explains how to do web development without any tools or frameworks, relying only on the browser and web standards. I made it because I couldn’t find a single place online that put together all the right information in the way that I liked. The site itself is, of course, also vanilla and open-sourced on GitHub.
I've been thinking about doing something like this for the past few months, kudos for giving it a shot!
I disagree with your statement: "Don't use the plain vanilla approach until you've tried some of the popular web development frameworks and have learned why you might want to do without, and until you're confident on how to structure a codebase without the help of a framework."
I was writing vanilla JavaScript for years before I began using frameworks, so when I encounter a new one, I understand exactly what problem it's trying to solve. For beginners, I personally recommend just building websites directly, without any framework assistance.
It is a tricky balance, to be sure. Building familiarity with HTML, CSS and JavaScript first is necessary, and I do link out to tutorials that teach that on the welcome page. But beyond the basics, how do you learn good practices quickest? Do you start trying things and learn what works and what doesn’t through experience? Or do you follow the learning track of popular frameworks and pick up good practices through example and osmosis?
On balance I think it’s better to learn how to build with frameworks first to learn all the right concepts in the guiding structure that they provide, and only then to walk away from frameworks and have complete freedom by going vanilla. That’s why that paragraph is there. YMMV.
Critique time. This site is more enterprise-vanilla. Like many others, you have fallen into the trap of reaching for every DOM API feature without stopping for a second to ask, "Do I really need this? Does it actually make my life/code easier?" The answer is no. 26 lines of code to add a single avatar image with alt text? That should have been a one-liner.
You don't have to use every feature out there. Getters that return one element that will never change. Splitting what should have been one style.css into five different files. Too many "best practices" for "complex" software.
I've actually been struggling with a problem related to this. A first-time page load with a 2000-LOC JavaScript file + index + CSS + favicon requests nothing but those 4 resources, which is very quick on a keep-alive HTTP 1.1 server.
I've written all of that from scratch because I got tired of maintaining Node.js.
But when splitting the JS file into pieces and using ES modules, say 12 different files, Chrome makes 8 TCP connections on 8 different sockets, and each connection has its own TCP handshake (and TLS handshake for HTTPS). How do you bundle things without a build system or a bundler? Import maps help, and it's not difficult to simply hash each asset and copy it to a "dist/" folder with the hash appended, but it's still slow on first page load.
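For what it's worth, an import map plus hashed file names can stand in for a bundler's manifest. A minimal sketch (the file names and hashes below are made up for illustration):

```html
<!-- Hashed copies of the modules live in dist/; the import map
     translates stable bare specifiers into the hashed URLs, so
     source files can keep importing "app" and "utils" unchanged. -->
<script type="importmap">
{
  "imports": {
    "app": "/dist/app.3f2a9c.js",
    "utils": "/dist/utils.91be04.js"
  }
}
</script>
<script type="module">
  import { init } from "app"; // resolved via the import map
  init();
</script>
```

Since the hashes change only when the files do, the hashed assets can be served with long-lived cache headers for cheap cache busting.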
I'm not a web developer (or network engineer) professionally, so I'm learning about web networking for the first time. It might be helpful to add a section about "traffic shaping"? I've slapped together a service worker that handles caching well enough for now, but I'm definitely doing something rather strange and reinventing something. My page loaded significantly faster when it was just one JS file, no caching needed.
The number of connections is a bit of a red herring here; the problem is (typically) that the browser loads one module, which then tells it to load another module, and so on. Each round trip is wasted time. Preloading gives the browser a flat list of all required dependencies right away. By the way, this applies to everything else too: having all required resources declared at the top of the page makes things faster. You can even preload some less-obvious things, like background images referenced by CSS files.
This might achieve the "bundling" you want, in the sense that all the preloaded resources can be multiplexed into a single connection. But again, the number of connections is almost nothing compared to the number of round-trips required.
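As a sketch of that flat list (the file names are placeholders), modulepreload hints at the top of the page let the browser fetch every module in parallel instead of discovering them one import at a time:

```html
<!-- Declare the full module graph up front so all files are
     requested immediately, not one round-trip at a time. -->
<link rel="modulepreload" href="/js/main.js">
<link rel="modulepreload" href="/js/store.js">
<link rel="modulepreload" href="/js/router.js">
<!-- A less-obvious preload: a background image that would
     otherwise only be discovered after the CSS is parsed. -->
<link rel="preload" as="image" href="/img/hero.webp">
```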
This is my understanding of HTTP versions and when bundling is necessary:
On HTTP 1, Chrome will make up to 6 TCP connections to a host and serialize requests one by one over those connections. This suffers from the waterfall effect: you have to wait for the first 6 requests to complete, then the next batch, and so on. On lossy connections this can also lead to head-of-line blocking on each of those connections, where the browser has to wait for a packet of one file to arrive before the connection frees up for the next request. On HTTP 1, bundling becomes necessary pretty quickly to guarantee good performance.
On HTTP 2 it will make 1 TCP connection to a host and multiplex requests over it, so files may all download in intermixed chunks. This has less connection negotiation overhead, and it does not suffer from the waterfall effect to the same degree. However, it still has head-of-line blocking on lossy connections, and this is in fact made worse because there is only one connection: if it's blocked at the TCP level on a single packet of a single file, every request behind that packet is blocked as well. I've done some tests, and on reasonable-quality connections there is not much overhead involved in requesting a hundred files instead of one. The caveat is "on reasonable-quality connections".
On HTTP 3, the QUIC protocol runs over UDP, which is connectionless at the transport level. Connection setup and the TLS handshake are combined into a single round trip (or zero for resumed connections), and streams are delivered independently, so there is no waterfall effect and no transport-level head-of-line blocking. However, because UDP provides so little, a lot of TCP's concerns (like dealing with packet loss) become concerns of the QUIC/HTTP layer, and this complicates implementation. Browsers already support it, but it is especially an issue at the web server level, so it will take a while for this feature to roll out to servers everywhere. Once the web moves over to HTTP 3, the performance advantage of bundling should largely disappear.
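If you want to see which version a given server actually negotiates, curl can report it (this needs network access, example.com is a placeholder, and --http3 only works in curl builds compiled with HTTP/3/QUIC support):

```
# Print the negotiated HTTP version for a HEAD request.
curl -sI --http2 https://example.com -o /dev/null -w '%{http_version}\n'

# Same, but offering HTTP/3 (requires a QUIC-enabled curl build).
curl -sI --http3 https://example.com -o /dev/null -w '%{http_version}\n'
```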
A service worker and/or careful use of caching can be used as a workaround to lessen the impact of requesting many files over HTTP 1, but this adds implementation complexity to the application, and a bug may leave clients in a semi-permanently broken state.
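A minimal cache-first sketch of that workaround, assuming hashed, immutable assets under a dist/ folder (the cache name, paths and extensions are made up). Keeping HTML itself uncached is one way to reduce the "semi-permanently broken" risk, since a fresh page can always reach clients and pull in a new asset set:

```javascript
// sw.js — cache-first service worker sketch. Bump CACHE_NAME on each
// deploy; forgetting to do so is the classic way to strand clients
// on stale assets.
const CACHE_NAME = "static-v1";

// Pure helper: only cache immutable, hashed assets, never HTML,
// so navigations always hit the network.
function isStaticAsset(pathname) {
  return pathname.startsWith("/dist/") &&
         /\.(js|css|webp|woff2)$/.test(pathname);
}

// Guarded so the helper above can also be exercised outside a
// service worker context (e.g. in a plain Node test).
if (typeof self !== "undefined" && typeof caches !== "undefined") {
  self.addEventListener("activate", (event) => {
    // Drop caches left over from previous deploys.
    event.waitUntil(
      caches.keys().then((keys) =>
        Promise.all(
          keys.filter((k) => k !== CACHE_NAME).map((k) => caches.delete(k))
        )
      )
    );
  });

  self.addEventListener("fetch", (event) => {
    const url = new URL(event.request.url);
    if (!isStaticAsset(url.pathname)) return; // fall through to network
    event.respondWith(
      caches.open(CACHE_NAME).then(async (cache) => {
        const hit = await cache.match(event.request);
        if (hit) return hit; // serve from cache, no request at all
        const res = await fetch(event.request);
        if (res.ok) cache.put(event.request, res.clone());
        return res;
      })
    );
  });
}
```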
HTTP 2 and above will use one connection to retrieve several files.
Caddy [1] can act as a static file server that will default to HTTP 2 if all parties support it. No configuration required.
If you allow UDP connections in your firewall, it will upgrade to HTTP 3 automagically as well.
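A minimal Caddyfile for that setup might look like this (the domain and path are placeholders):

```
# Static file server with automatic HTTPS. HTTP/2 is on by
# default, and HTTP/3 is advertised as long as UDP 443 is
# reachable through the firewall.
example.com {
    root * /var/www/site
    file_server
}
```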
I'm totally into vanilla web development. Almost all frameworks are either trying to solve problems caused by the browser, or trying to force developers to follow their style so that they make fewer mistakes.
But the browser evolves:
- Convenient and flexible browser APIs totally replaced jQuery
- HTTP2's HPACK, multiplexing and connection coalescing make SPAs and bundlers unnecessary
- Web Components and Shadow DOM help you write modular HTML and CSS
etc etc
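As a sketch of the Web Components point (the element name is invented), a custom element with a shadow root keeps its HTML and CSS scoped:

```html
<user-card name="Ada"></user-card>
<script type="module">
  class UserCard extends HTMLElement {
    connectedCallback() {
      const shadow = this.attachShadow({ mode: "open" });
      // Styles inside the shadow root don't leak out,
      // and page styles don't leak in.
      shadow.innerHTML = `
        <style> p { font-weight: bold; } </style>
        <p>Hello, ${this.getAttribute("name")}</p>
      `;
    }
  }
  customElements.define("user-card", UserCard);
</script>
```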
And developers need to evolve, too! These frameworks introduce too many leaky abstractions that developers still have to think about. It turns out that if they don't encounter the problems at the beginning, they will meet them later, with interest.
I was going to write a similar document, but thankfully you've written it so well!
P.S. I'm also not against using frameworks to solve engineering problems like testing or CI/CD.
Consider adding a track where the application is built without JavaScript. There has been a lot of interesting development on the HTML side, with the dialog and popover elements making no-JS websites viable and even more convenient to use.
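For example, the popover attribute now gives you a dismissible overlay with zero script (a minimal sketch):

```html
<!-- The popovertarget button toggles the popover declaratively;
     no JavaScript at all. -->
<button popovertarget="menu">Open menu</button>
<div id="menu" popover>
  <p>Hello from a popover, closed with Esc or a click outside.</p>
</div>
```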
Nice work. I haven't tried web components, but support is better than I expected per caniuse[0]. This weirdly reminded me of a time 12 years ago[1] when I was encouraged to avoid the term "vanilla" because it sounds like another framework.
I have noticed the events complexity, and I have actually developed an easy workaround for it. The version on this website is a bit outdated and has a bunch of bugs, but the demos work: https://js-effects.netlify.app/todo
Love it! Have you seen the signals proposal? I just learned no-build-step Preact, but between signals and these examples, I might not bother for my next project.
I think your adblocker is injecting this into the HTML files that are requested from the server for display in the code viewer. I don't know of a way to prevent it from doing that.
This is so damn cool. Thank you for building this.
I last did hands-on (professional) development back in the jQuery era, but have amassed a certain degree of familiarity with modern frameworks by contracting / PM-ing various projects others chose to build with them.
To be honest, I'd no idea how far "vanilla" capabilities had progressed. My last couple of personal projects I built with jQuery, because I still know it, a framework would be way overkill, and, you know, why the hell not? Your presentation has me excited to learn something new. Again, thanks.