This is getting a bit out of control. I'd like to see stringent requirements for the content/size ratio. I could put up a bare HTML page and claim membership in this so-called elite club!?
The author should allow visitors to vote on the available sites. Then the sites can not only be ranked by size but also by value.
It doesn't make sense to call it a 250kb club if it is all about minimal size. Compared to the 1MB club, there is no difference in content because both lists have the same minimal sites at the top.
It is only a 250kb club if a 240kb site is treated equal to a 10kb site. Otherwise, it's a size contest with a 250kb cut off.
One of the worst things you can do to your website's responsiveness is ads. Bloated ads downloaded from overloaded ad servers may take ten times as much bandwidth and time as whatever the visitors came to see.
The whole model of having content free but showing ads to visitors has many problems and this is one of them. Substack-like models seem to address this.
> The whole model of having content free but showing ads to visitors has many problems and this is one of them.
Many problems indeed, but also advantages. The model is not new. Broad distribution of newspapers only became possible after publishers realized they could harvest attention cheaply and resell it to advertisers. Before that, newspapers were something only for the elites.
I feel like there was a missed opportunity for the '250kb.club' site itself to be hundreds of megabytes, filled with React dev dependencies and other useless things.
Oh, your all text no image blog doesn't exceed 250kb?! No way!! \s
Some websites rely on images; they could never really be under 250kb. And in my experience the download size doesn't matter. What matters is the usability of the website, something Google tries to capture with their new page speed metrics. If the website is optimized nicely (e.g. lazy loading), I can use it in under a second while it continues to load things as I go.
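A rough illustration of what I mean by lazy loading (the file name and dimensions here are just placeholders): modern browsers support it natively with a single attribute, no script required.

    <!-- Images marked loading="lazy" are only fetched as they approach
         the viewport, so the page is usable before every picture has
         downloaded. Explicit width/height reserve space and avoid
         layout shifts while the image loads. -->
    <img src="/images/article-photo.jpg"
         width="800" height="600"
         loading="lazy"
         alt="Photo accompanying the article">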
Most websites, however, host stupidly large uncompressed images that are multiple MB in size. Simple compression and appropriate file formats solve most of this, but a lot of developers do not seem to care.
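A sketch of the "appropriate file formats" part (file names are hypothetical): the picture element lets modern browsers take a well-compressed format while older ones fall back to JPEG.

    <!-- The browser picks the first source type it supports, so modern
         browsers get the smaller AVIF/WebP file and everything else
         gets the JPEG fallback. No multi-MB original ever ships. -->
    <picture>
      <source srcset="/images/hero.avif" type="image/avif">
      <source srcset="/images/hero.webp" type="image/webp">
      <img src="/images/hero.jpg" width="1200" height="675"
           alt="Hero image" loading="lazy">
    </picture>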
I would guess that at least some of those developers are opposed to lossy compression or using whatever format Chrome supports this week that gives a whatever percent savings. The rest are probably unaware of the issue.
I wonder why serving a small JPEG up front and only loading the full-resolution picture when it is clicked isn't more common. With all the tooling we have nowadays, it should be a no-brainer for almost every kind of website.
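It really is just a link around a thumbnail; something like this (paths made up) keeps the page light while still offering the original on demand:

    <!-- Only the small thumbnail loads with the page; the full-resolution
         file is downloaded only if the visitor clicks through to it. -->
    <a href="/photos/sunset-full.jpg">
      <img src="/photos/sunset-thumb.jpg"
           width="320" height="180"
           alt="Sunset over the harbour (click for full resolution)">
    </a>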
If you have a list of posts, low-resolution thumbnails can be as small as 20-40kb apiece. Sure, if you're going to have a lot of them you're not going to get the page under 250kb, but for some reason a lot of people still serve the high-quality photo when a thumbnail suffices.
Anything I go to for information will always be more usable if it is faster. If I need to download a large image I will, but that should be a separate download step.
250kb should allow you to use a couple of webp images to illustrate your article or show off your product.
Thing is, plenty of "all text, no image" blogs far exceed that because they completely unnecessarily load bizarre amounts of CSS/JS framework code they often don't even use.
I find that JavaScript is fine when it adds features, but it should not break a webpage when disabled, or hard-require JS execution just to render simple text and images, something the web has been good at for decades.
My own site is like that. It has JavaScript that runs if you have it enabled, but the site still loads and works great even if you don't.
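A minimal sketch of that pattern (the URL and element ids are made up): the page is plain HTML that works everywhere, and the script only upgrades it when it actually runs.

    <!-- With JavaScript disabled the link is a normal page navigation.
         When the script runs, it intercepts the click and loads the
         comments in place instead, assuming the same URL returns an
         HTML fragment. -->
    <a id="show-comments" href="/post/42/comments">Show comments</a>
    <div id="comments"></div>

    <script>
      const link = document.getElementById("show-comments");
      link.addEventListener("click", async (event) => {
        event.preventDefault();                  // only runs when JS is enabled
        const response = await fetch(link.href); // same URL as the plain link
        document.getElementById("comments").innerHTML = await response.text();
      });
    </script>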
I agree with the idea. I've expressed my fanatical hatred towards the modern web before, and I'm not sure when or how it happened. Every website I open comes with "compressed" JavaScript weighing several megabytes. And yes, having a good internet connection is a given in my case; it has nothing to do with that.

Truthfully, I have no issue with images on the web that are several megabytes. A good comparison, in my view, is the nautical mile vs. the statute mile: they are both called "miles" and they both measure distance, but that is where the similarities end. All that JavaScript (regardless of which "modern framework" we are talking about) eats your CPU time for the sake of flashiness and makes no real contribution. All the spinners and preloaders these days have less to do with transition effects and more to do with hiding the bloat the web has become. I'd be much happier to see a blank screen for 20 milliseconds and fully rendered content a few milliseconds later, rather than a grey page with 20 different empty containers full of preloaders, waiting 10 seconds for each of them to load while that eats up one of my CPU cores.

While I abhor JavaScript as a language, I don't mind its use to some minimal extent. But at this point, if you are using anything less than an 8th-gen i7 with at least 8gb of RAM, the web is pretty much unusable. That is absurd considering that the web, by design, was meant to be a simple and fast way of transferring documents. Hence why I ended up making this[1]. Now that I've opened it, I realize I've messed up the favicon and it's larger than the rest of the content. Apart from that, it does serve the purpose of <250kb with moderate interactivity...
Question about these sites: are they 250 KB for the entire site (that is: all the pages, images, assets combined) or 250 KB for just the site’s homepage? Because the former basically rules out any blog with enough content or a couple of images, and the latter seems almost trivial to achieve, maybe even if you have a picture of some sort on that page…
> Websites listed here are downloaded and analyzed with Phantomas. The total weight is counted and then the size of actual content is measured and shown as a ratio.
> For example: If a website has a total weight of 100kb and 60kb are the documents structure, text, images, videos and so on, then the content ratio is 60%. The rest are extras like CSS, JavaScript and so on. It is hard to say what a good ratio is but my gut feeling is that everything above 20% is pretty good already.
From a bit of experimenting, I don't even know what it's counting. I thought it was total transmitted bytes, since the numbers seemed pretty low…but then I checked and danluu's site is like 20 KB on my computer because of Google Analytics but it's listed as 3 (TIL it's five times heavier than my site, though I write much less: https://saagarjha.com/). Anyone know what is going on here? Is it "transferred bytes with JavaScript off", which is a rather strange metric?
The top site ironically abuses the standards. It entirely omits the HTML doctype and the head and body tags. They can do it because browsers correct developer errors, but is this something that should be recommended just to save a few bytes?
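For reference, here is roughly the smallest page that still avoids quirks mode: in HTML5 the html/head/body start and end tags are genuinely optional per the spec, but dropping the doctype changes the rendering mode.

    <!-- The doctype keeps the browser in standards mode; the charset
         declaration and title are what validators generally expect.
         The html/head/body tags are implied and can be omitted. -->
    <!doctype html>
    <meta charset="utf-8">
    <title>Hello</title>
    <p>Hello, world.</p>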
Can we auto-generate these clubs? One could probably set up a wildcard CNAME record for a domain and then parse the subdomain (from the HTTP Host header) as the size limit (e.g. `10kb.website.club`, `100b.website.club`).
We can do better and set the standards high.