The 512KB Club (512kb.club)
19 points by kevq on Dec 16, 2020 | 20 comments



After 1mb.club a week or two ago, should I just beat the rush and register 1kb.club?


Right, that was here: https://news.ycombinator.com/item?id=25151773

There's a power law dropoff in how interesting things like this are as they get followup/copycat treatment (https://hn.algolia.com/?query=follow-up%20by%3Adang&dateRang...). It's a bit like telling the same joke several times in a row.


Big laugh here... meanwhile, I just can't wait for 1byte.club. That will be quite an achievement.


!


Sorry, ruined by the intra-comment span surrounding the exclamation point.


I feel this site shows why websites need more than 512kb these days... Most of these are just unaesthetic, simple personal sites with a few blog posts. Nothing wrong with that but nothing to glorify either.


512kb is still a lot of space to play with. My website - the one I make my living from - is about half of that, except on longer articles. That's despite large header images and quality-of-life JavaScript. It wasn't very hard to keep it that way.


> Most of these are just unaesthetic

Huh, I didn't think it costs kilobytes to buy aesthetics.


Custom fonts, images that look nice on retina/HiDPI screens, and complex responsive layouts can all add up. Images are obviously the worst offender, but the rest all contribute.

Of course, aesthetics are a matter of subjectivity, so what is pleasing to one might be gaudy or bare to another. It is largely a matter of picking your audience, I think.


If a site uses good CSS and SVG graphics, it's still possible to get nice layouts and high-res graphics without bloat.
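
For example (a hypothetical sketch, not taken from any of the listed sites), a simple inline SVG logo is a few hundred bytes and stays crisp at any pixel density:

    <!-- roughly 100 bytes, resolution-independent -->
    <svg viewBox="0 0 100 100" width="48" height="48">
      <circle cx="50" cy="50" r="40" fill="#4a90d9"/>
    </svg>

Compare that to shipping a 2x PNG of the same icon just to survive retina screens.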


Agreed, this site needs tags like "only text", "app", "simple blog", "list of links" ...

Clicked on about 4 pages and all were "Hi this is a simple site of links I like!" type pages.


I agree with the general sentiment about size, but page size isn't everything; it's really more what that size is contributing towards. If a multi-megabyte page consists of many high-resolution images, that clearly gives far more value to the reader than if the same amount of data was spent on tons of JS of which next to none actually gets executed. Likewise, the huge amount of video data on a YouTube page is of definite value to the viewer (perhaps excepting the ads).


The Guardian is 4MB because the front page has a dozen+ hi-res images. Turn them off if you want to optimize for size. Not every site has to be a static wall of text.

And why is 512KB acceptable but 4MB a problem? Yes the web is a bloated mess, but there's no point drawing random lines on meaningless metrics. Just focus on providing a good user experience. A few bytes of bad JavaScript can mess up your site's performance, while several well-optimized MBs can make it accessible and useful.


> The Guardian is 4MB because the front page has a dozen+ hi-res images.

That's incorrect. If you analyse the structure of the site (link below), ~800KB of the 4.4MB uncompressed homepage is images, 1.3MB is JavaScript, and another 1.2MB is HTML (how is that even possible?).

https://gtmetrix.com/reports/www.theguardian.com/xSnIUn3X/

> And why is 512KB acceptable but 4MB a problem?

Um, because it's an eighth of the size, and there are still many people in the world who don't have fast internet access.

> A few bytes of bad JavaScript can mess up your site's performance, while several well-optimized MBs can make it accessible and useful.

Agree with the JS, but not the images. Well-optimised images shouldn't hit multiple megabytes unless it's some kind of gallery with many images. Using srcset to load resized images is a much better way of doing this.
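
Something like this (filenames hypothetical) lets the browser download only the smallest candidate that fits the layout and screen density:

    <img src="photo-800.jpg"
         srcset="photo-400.jpg 400w, photo-800.jpg 800w, photo-1600.jpg 1600w"
         sizes="(max-width: 600px) 100vw, 800px"
         alt="Descriptive text">

A phone on a narrow viewport pulls the 400w file; only a large hi-DPI screen ever fetches the 1600w one.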


Another important metric is the Time to First Byte (TTFB).

How about 10ms club?
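
For anyone curious, you can get a rough TTFB number with curl (the URL is just a placeholder):

    # %{time_starttransfer} is curl's time-to-first-byte, in seconds;
    # run it a few times to average out network jitter
    curl -o /dev/null -s -w '%{time_starttransfer}\n' https://example.com/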


TTFB is pointless if you throw a cookie notice, a newsletter prompt and a bunch of ads in your users' face. It's not much better if you hide the answers your users come for in a sea of SEO keyword spam. You just annoy your users really efficiently.

I'd rather wait 5 seconds for a straight answer than deal with a really fast, but really annoying website.


What site achieves 10ms? Even HN is 70ms for me when I disable caching. And that's ignoring the HTTPS handshake.


It would be better to evaluate the sites based on the size of non-image resources. E.g., load the site but discard all image files when computing the size, so only text, CSS, JS, etc. are counted. Then add the images in separately. That would allow sites with useful images, like a photo gallery, to still be listed on this page, while still rewarding well-designed sites that load the text and CSS instantly so the reader doesn't need to wait for image loading.
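
A rough sketch of how that split could be computed, assuming a HAR file exported from the browser's network tab (my own untested approximation, not how 512kb.club actually measures anything):

    // har-size.js: sum response bytes, images vs. everything else
    const fs = require('fs');
    const har = JSON.parse(fs.readFileSync(process.argv[2], 'utf8'));

    let imageBytes = 0, otherBytes = 0;
    for (const entry of har.log.entries) {
      const mime = entry.response.content.mimeType || '';
      const size = Math.max(entry.response.content.size || 0, 0);
      if (mime.startsWith('image/')) imageBytes += size;
      else otherBytes += size;
    }
    console.log('images:', (imageBytes / 1024).toFixed(0), 'KB');
    console.log('everything else:', (otherBytes / 1024).toFixed(0), 'KB');

Run it as: node har-size.js page.har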


OK. The Guardian's uncompressed homepage with images removed is 3.6 MB. That's still way too big.

https://gtmetrix.com/reports/www.theguardian.com/xSnIUn3X/


I think there also needs to be another metric, like Alexa rank (or something more reliable), which shows the traffic/popularity of the website relative to its size.

That would be more interesting to me.



