Hacker News
250kb Club (250kb.club)
40 points by lcnmrn on Nov 22, 2020 | 51 comments



This is getting a bit out of control. I'd like stringent requirements on the content/size ratio. I could put up a bare HTML page and claim membership in this so-called elite club!?

We can do better and set the standards high.


The author should allow visitors to vote on the available sites. Then the sites can not only be ranked by size but also by value.

It doesn't make sense to call it a 250kb club if it is all about minimal size. Compared to the 1MB club, there is no difference in content because both lists have the same minimal sites at the top.

It is only a 250kb club if a 240kb site is treated as equal to a 10kb site. Otherwise, it's a size contest with a 250kb cutoff.


A good way to go about this would be to compare the page's footprint in bytes with the size of its rendered text:

    w3m -dump "$SITE" | wc -c
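
A rough sketch of that check, assuming curl and w3m are installed. Note it only looks at the HTML document itself, not CSS/JS/images, so it won't match what Phantomas reports:

    #!/bin/sh
    # Compare the size of the HTML payload with the rendered text it delivers.
    SITE="https://example.com"            # hypothetical target
    total=$(curl -sL "$SITE" | wc -c)     # bytes of the HTML document
    text=$(w3m -dump "$SITE" | wc -c)     # bytes of readable, rendered text
    echo "total: $total  text: $text  ratio: $(( text * 100 / total ))%"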


Holy hell this one was fast.

http://minid.net


Indeed, it's a joy. I have 200~300ms of latency to it and it still loads freaking fast, probably because the website doesn't have any javascript or images.


I think it's because it appears fully styled in one go, with nothing still spinning or moving around after you first see it.


One of the worst things you can do to your site's responsiveness is ads. Bloated ads downloaded from overloaded ad servers can take ten times as much bandwidth and time as whatever the visitors actually came to see.

The whole model of having content free but showing ads to visitors has many problems and this is one of them. Substack-like models seem to address this.


> The whole model of having content free but showing ads to visitors has many problems and this is one of them.

Many problems indeed, but also advantages. The model is not new: broad distribution of newspapers only became possible after publishers realized they could harvest attention at a low price and resell it to advertisers. Before that, newspapers were something only for the elites.


The second worst thing you can do is analytics.


The third is Javascript.


Often all three show up together!


> The whole model of having content free but showing ads to visitors has many problems and this is one of them.

"Content behind paywall, didn't read." must be in the top 5 complaints about posts surfaced here on HN.


I feel like there was a missed opportunity for the '250kb.club' site itself to be hundreds of megabytes filled with react dev dependencies and other useless things.


Oh, your all-text, no-image blog doesn't exceed 250kb?! No way!! \s

Some websites rely on images; they could never really be under 250kb. And in my experience the download size doesn't matter; what matters is the usability of the website, something Google tries to capture with its new PageSpeed metrics. If the website is optimized nicely (e.g. with lazy loading), I can use it in under a second while it continues to load stuff as I go.


Most websites, however, host stupidly large uncompressed images that are multiple MB in size. Just by simple compression and appropriate file formats, most of this is solved (but a lot of developers do not seem to care about it).
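
For what it's worth, a couple of one-liners along those lines, assuming ImageMagick, cwebp and jpegoptim are installed; the quality settings are plausible starting points, not recommendations:

    # Re-encode an oversized PNG photo as JPEG or WebP at a sane quality.
    convert photo.png -strip -quality 82 photo.jpg
    cwebp -q 80 photo.png -o photo.webp

    # Squeeze existing JPEGs in place without changing their dimensions.
    jpegoptim --strip-all --max=85 *.jpg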


I would guess that at least some of those developers are opposed to lossy compression or using whatever format Chrome supports this week that gives a whatever percent savings. The rest are probably unaware of the issue.


I wonder why using a JPEG up front and serving the full-resolution picture only when clicked is not more common. With all the tooling we have nowadays it should be a no-brainer for almost every kind of website.
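
One sketch of that pattern, assuming ImageMagick; the markup in the comment is the usual "thumbnail links to the original" trick:

    # Generate a small preview; the '>' flag only shrinks images wider than 640px.
    convert full.jpg -resize '640x640>' -strip -quality 75 thumb.jpg
    # In the page: <a href="full.jpg"><img src="thumb.jpg" alt="..."></a>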


Yes but even those will usually not fit in the 250kb or 1mb limit. Especially if your homepage is a list of posts and not just links to them.

That being said, nobody is forbidding such pages to make their own club.


If you have a list of posts, low-resolution thumbnails can be as low as 20-40kb apiece. Sure, if you're going to have a lot of them you're not going to get under 250kb, but still, for some reason a lot of people serve the high-quality photo when a thumbnail would suffice.


Anything I go to for information will always be more usable if it is faster. If I need to download a large image I will, but that should be a separate download step.

250kb should allow you to use a couple of webp images to illustrate your article or show off your product.
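
As a rough budget, with all numbers purely illustrative:

    HTML + CSS:            ~20 kB
    2 WebP illustrations:  2 x ~80 kB = 160 kB
    -------------------------------------------
    Total:                 ~180 kB  (well under the 250 kB cap)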


Thing is, plenty of "all text, no image" blogs far exceed that because they completely unnecessarily load bizarre amounts of CSS/JS framework code they often don't even use.


I propose that someone starts a 'nojavascript.club'


I find that javascript is fine when it adds features, but a page should not break when it's disabled, or hard-require JS execution just to render simple text and images, something the web's been good at for decades.

My own site is like that. It has javascript that runs if you use such things, but it still loads and works great even if you don’t.


I agree. JS is good for things like nicer footnotes.


A non-exhaustive list of good practices for webpages that focus on text:

- Final page weight under 128kb, without compression

- Works in Lynx, w3m, Elinks, Netsurf, Dillo, and most HTML-to-markdown converters

- No scripts or interactivity (preferably enforced at the CSP level)

- No cookies

- No animations

- No fonts–local or remote–besides sans-serif and monospace.

- No referrers

- No requests after the page finishes loading

- No 3rd-party resources (preferably enforced at the CSP level)

- Supports dark mode and/or works with most “dark mode” browser addons

I practice all of the above on my website, and encourage everyone to do the same on theirs; a few of these rules are easy to spot-check from the command line, as sketched below.

Also, literally every site on Gemini/Gopher has all of the features above (except "browser support" applies to Gemini/Gopher clients).
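
A rough sketch of those spot-checks, assuming curl and grep, and assuming the site sets its policy via response headers rather than meta tags:

    SITE="https://example.com"                            # hypothetical target
    curl -sI "$SITE" | grep -i 'set-cookie'               # no cookies: expect no output
    curl -sI "$SITE" | grep -i 'content-security-policy'  # CSP: expect script-src 'none' or similar
    curl -s  "$SITE" | grep -ci '<script'                 # no scripts: expect 0
    curl -s  "$SITE" | wc -c                              # weight of the HTML itself, uncompressed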



Yeah, a bit ironic that a site lamenting the bloated web is dependent on javascript. But I appreciate the overall sentiment.


JavaScript itself isn't even that bad. Somehow people still manage to blow it up beyond any reason.

This site here uses JavaScript, but so little it's hard to even notice it.


I agree with the idea. I've expressed my fanatical hatred of the modern web before, and I'm not sure when or how it happened. Every website I open comes with "compressed" javascript that is several megabytes, and yes, a good internet connection is a given in my case, so it has nothing to do with that. Truthfully I have no issue with images on the web that are several megabytes. A good comparison, in my view, is the nautical mile vs. the statute mile: they are both called "miles" and they both measure distance, but that is where the similarities end.

All that javascript (regardless of which "modern framework" we are talking about) eats your CPU time for the sake of flashiness and makes no real contribution. All the spinners and preloaders these days have less to do with transition effects and more to do with hiding the bloat that the web has become. I'd be much happier to see a blank screen for 20 milliseconds and fully rendered content a few milliseconds later, rather than a grey page with 20 different empty containers filled with preloaders, waiting 10 seconds for each of them to load while that eats up one of my CPU cores. While I abhor javascript as a language, I don't mind its usage to some minimal extent, but at this point, if you are using anything less than an 8th-gen i7 with at least 8GB of RAM, the web is pretty much unusable. That is absurd considering that the web, by design, was meant to be a simple and fast way of transferring documents.

Hence the reason I ended up making this[1]. Now that I've opened it, I realize that I've messed up the favicon and it's larger than the rest of the content. Apart from that, it does serve the purpose of <250kb with moderate interactivity...

[1] https://rorigami.site/


Question about these sites: are they 250 KB for the entire site (that is: all the pages, images, assets combined) or 250 KB for just the site’s homepage? Because the former basically rules out any blog with enough content or a couple of images, and the latter seems almost trivial to achieve, maybe even if you have a picture of some sort on that page…


Based on the doc:

> Websites listed here are downloaded and analyzed with Phantomas. The total weight is counted and then the size of actual content is measured and shown as a ratio.

> For example: If a website has a total weight of 100kb and 60kb are the documents structure, text, images, videos and so on, then the content ratio is 60%. The rest are extras like CSS, JavaScript and so on. It is hard to say what a good ratio is but my gut feeling is that everything above 20% is pretty good already.
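
For the curious, Phantomas is an npm package, so the same analysis can be reproduced locally. A sketch from memory, assuming npx and jq; the report layout and metric names may differ between phantomas versions:

    # Run phantomas against a page and list its size-related metrics.
    npx phantomas "https://example.com" > report.json
    # Assumes the report has a top-level "metrics" object; adjust to your version.
    jq '.metrics | with_entries(select(.key | test("Size|Count")))' report.json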


Right, but this doesn't tell me what "total weight" is.


It only loads the URL specified.


Good question. It also means that these sites need to be continuously reviewed, e.g., updating the site with new content may push it over the limit.


From a bit of experimenting, I don't even know what it's counting. I thought it was total transmitted bytes, since the numbers seemed pretty low…but then I checked and danluu's site is like 20 KB on my computer because of Google Analytics but it's listed as 3 (TIL it's five times heavier than my site, though I write much less: https://saagarjha.com/). Anyone know what is going on here? Is it "transferred bytes with JavaScript off", which is a rather strange metric?
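
One way to narrow it down is to measure the document transfer directly. A quick curl check (this only covers the HTML itself, not subresources like the analytics script, which is probably where part of the gap comes from, and it ignores whatever compression a browser would negotiate):

    # Body bytes received for the document, headers excluded.
    curl -so /dev/null -w '%{size_download} bytes\n' https://danluu.com/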


The only interpretation that makes sense to me is 250 kB per page with loaded assets.

Interpreting it as the sum of all the pages on the site, or as only the homepage, makes no sense to me at all.


What's the number next to the website?


The top site ironically plays fast and loose with the standards: it entirely omits the HTML doctype and the head and body tags. Omitting the head and body tags is actually permitted by the spec (the parser infers them), but dropping the doctype throws browsers into quirks mode. Is that something that should be advised just to save a few bytes?

(no)
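
For scale, the omitted boilerplate is tiny. A quick byte count of a minimal, standards-valid skeleton comes to roughly a hundred bytes:

    # Count the bytes of a minimal valid HTML5 document; the savings from
    # omitting the doctype and the head/body tags are negligible.
    printf '%s\n' \
      '<!DOCTYPE html>' \
      '<html lang="en">' \
      '<head><meta charset="utf-8"><title>x</title></head>' \
      '<body></body>' \
      '</html>' | wc -c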


Creating things within size limitations is cool, but it's entirely possible to create a bloated website experience within the 250kb or 1MB limit.

I think there's more to "bloat" than file-size!


Sure, but file-size usually correlates with bloat.


That would be a great website.

cantbelieveitsbloated.com: sites under 250kb that load slower than sites of 2MB.


while (1) { document.getElementById('pcontainer').innerHTML = '<p>new paragraph</p>' + document.getElementById('pcontainer').innerHTML; }


What’s next “nowebsite.club”? ;)


One of the businesses I founded and run has a one page, black-on-white (probably ~10kB?) website that says “<companyname> does not have a website.” :)


https://www.berkshirehathaway.com/ is about 8k and pretty close.

That Geico gif is a whopping 1.1kb; they could probably optimize that into an SVG or something...


The Gemini and Gopher communities welcome you!


nobrowser.club? nointernet.club?...


no.club


Can we have a 10kb Club next?


Can we auto-generate these clubs? One could probably set up a wildcard CNAME record for a domain and then parse the subdomain (from the HTTP Host header) as the size limit (e.g. `10kb.website.club`, `100b.website.club`).
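
A hypothetical sketch of that Host-header parsing in shell (the function name and the suffix convention are made up for illustration):

    # Turn the size prefix of the subdomain into a byte limit.
    # Order matters: match "kb" and "mb" before plain "b".
    limit_from_host() {
        case "$1" in
            *kb.*) echo $(( ${1%%kb.*} * 1024 )) ;;
            *mb.*) echo $(( ${1%%mb.*} * 1024 * 1024 )) ;;
            *b.*)  echo "${1%%b.*}" ;;
            *)     echo "no size prefix in host: $1" >&2; return 1 ;;
        esac
    }

    limit_from_host "10kb.website.club"    # -> 10240
    limit_from_host "100b.website.club"    # -> 100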


That’s a challenge! I was going to propose a 100kb club first, but we might as well skip that I guess.



