10 KB Club: With links to popular HN, Reddit, Lobsters threads for each website (10kbclub.com)
123 points by susam on Dec 28, 2020 | hide | past | favorite | 48 comments



https://john-doe.neocities.org/ is a wonderful site IMO. Amazing; I didn't know that much was possible without JS. Seems very good for a personal site.


Here's an image gallery with modals using just CSS. The CSS :target selector is the magic ingredient.

https://madmurphy.github.io/takefive.css/
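
For anyone curious how the trick works, here is a minimal, untested sketch of the :target idea (my own illustration, not code from takefive.css; the id, class name, and image file are made up). The link changes the URL fragment, the matching element becomes the :target and appears as an overlay, and the "Close" link clears the fragment again:

  <style>
    /* Hidden by default; shown only while the URL fragment targets it. */
    .modal { display: none; }
    .modal:target {
      display: flex;
      position: fixed;
      top: 0; right: 0; bottom: 0; left: 0;
      justify-content: center;
      align-items: center;
      background: rgba(0, 0, 0, 0.7);
    }
  </style>

  <a href="#pic1">Open picture 1</a>

  <div class="modal" id="pic1">
    <figure>
      <img src="pic1.jpg" alt="Picture 1">
      <!-- Clearing the fragment removes :target and hides the modal. -->
      <a href="#">Close</a>
    </figure>
  </div>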


yes


Is this technique SEO friendly?


Isn't this the exact sort of question that made the web the broken, soulless thing it is today?


The question is a symptom, not the cause. It's a reasonable question to ask.


The Google SEO plus points are multiple pages with "high quality" non-repeated content, a good reputation score, and linkbacks. Having a site all on one page can hurt SEO if you don't have any of the other criteria met.


Which search engine?


Hilariously enough both https://1mb.club/ and https://10kbclub.com/ itself are included in the listing. This means that they will eventually have to retire as the clubs grow...


Good point! https://1mb.club/ has a compressed transfer size of 7.6 KB and https://10kbclub.com/ has a compressed transfer size of 6.1 KB right now. I did a few quick tests and found that https://1mb.club/ can take about 50 more entries and https://10kbclub.com/ can take 10 more entries without exceeding 10 KB compressed transfer size with the current page layout.
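
For anyone who wants to repeat such a quick test, something along these lines should give the compressed payload size of the HTML document alone (a rough sketch, assuming the server honours the Accept-Encoding request header; response headers and subresources are not counted, and the numbers will change as the pages grow):

  # Request the page with gzip but do not decompress it, so wc -c
  # reports the compressed size of the HTML document as transferred.
  curl -s -H 'Accept-Encoding: gzip' https://10kbclub.com/ | wc -c
  curl -s -H 'Accept-Encoding: gzip' https://1mb.club/ | wc -c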


I see two possible optimizations for the 10kbclub.com index page. First, you could rely on Brotli for compression (a gain of ~1300 bytes on the current page), as you seem to rely on headless Chrome, which I assume has br in its Accept-Encoding. Second, you could try removing a few response headers, which take up a full 664 bytes.
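
A quick way to check both points (just a sketch; the exact header set and the negotiated encoding depend on the server or CDN serving the page):

  # Fetch as a br-capable client, discard the body, and print the
  # response headers to see which Content-Encoding was negotiated.
  curl -s -o /dev/null -D - -H 'Accept-Encoding: gzip, br' https://10kbclub.com/

  # Rough size of the response headers alone, in bytes.
  curl -s -o /dev/null -D - https://10kbclub.com/ | wc -c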


Solution: don't display all entries, club rules, and membership info on the same page.


There is another one: https://250kb.club/


They can just paginate.


This is awesome. I have really noticed how much I like reading simple, text-focussed websites recently, mostly since consent forms started appearing everywhere.

I wrote about it:

Web design that focusses on text content is the best

https://blog.markjgsmith.com/2020/12/24/web-design-that-focu...

In the post I list what I think are some of the qualities that make a good text focussed website.

Yey for boring text focussed websites!


Reminds me of some old book collections on The Pirate Bay. They had all the famous authors and then one or two really shitty books from some unknown, never-published writer. This self-promotion technique seems to work somewhat, because one Swede eventually got published, and thanks to native English editors the book was not so bad anymore.


> timonoko: Reminds me of some old book collections on The Pirate Bay. They had all the famous authors and then one or two really shitty books from some unknown, never-published writer. This self-promotion technique seems to work somewhat, because one Swede eventually got published, and thanks to native English editors the book was not so bad anymore.

This comment sounds like an insinuation that I or others have deceptively added our own websites to this list. I apologize if that is not the case. However, let me clarify the criteria I used to select the websites. I initially took the list of websites already available from https://1mb.club/, https://512kb.club/, and https://250kb.club/, and then fed them to a script to select the websites that consume less than 10 KB compressed transfer size and that have received at least 100 upvotes on Hacker News.

Later I relaxed the rules to include upvotes on Reddit and Lobsters in the eligibility criteria. There is no self-promotion going on here. Doing so, and especially insinuating that others are doing so, would be taking this tiny project too seriously. Indeed there are many who rightfully argue that keeping a website under N KB (for arbitrarily chosen N) is not an interesting goal by itself. This is just a hobby project made for fun over a weekend because of https://news.ycombinator.com/item?id=25176794.

Disclosure: My own website is on this list.


10kb.club only has 51 votes as of now though?

Shouldn’t it have 100? ;)


10kbclub.com was added to the list two days ago after it received 550+ points at https://www.reddit.com/r/webdev/comments/kkcoko.

See https://10kbclub.com/#club-rules for the inclusion criteria. Also, click on any row of the data table to see the Hacker News, Reddit, and Lobsters discussion threads about the website along with how many upvotes each thread has received. If there are more websites like this, please suggest them for addition by creating a new issue at https://github.com/susam/10kbclub.


Love this; submitted my site. The guidelines for noteworthiness help keep mostly-blank pages and other low-hanging fruit from being submitted.

Homepages minus favicons/webmanifest-icons don't need to be over 5kb without images, IMO. Personally, I'd rather put profile pics in an "about" page.

There is a lot you can do with 1kb of gzip-compressed, minified CSS.
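
As a rough illustration (my own ballpark, not a club guideline), a perfectly usable default stylesheet can be a couple of hundred bytes minified, before gzip even enters the picture:

  /* Centred column, readable line length, system fonts, dark-mode aware.
     Roughly 200 bytes when minified; comfortably under 1 kB even with
     many more rules. */
  body {
    max-width: 40em;
    margin: 0 auto;
    padding: 0 1em;
    font: 1.1em/1.6 system-ui, sans-serif;
  }
  img { max-width: 100%; }
  @media (prefers-color-scheme: dark) {
    body { background: #111; color: #ddd; }
    a { color: #8cf; }
  }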


Berkshire Hathaway could have made the cut if they hadn't chosen to use Google Analytics.


We need to diversify:

  100mbapp.club
  100mbram.club
  VanillaJS.club
  NoCSSFramework.club
And for physical stuff:

  AnalogElectronics.club
  AllKnobsAndButtons.club
  EasyOpenPackaging.club
  DumbElectronics.club
  NoPlastics.club
  Repairable.club
  Durable.club
  Wordofmouth.club


The no plastics club would be amazing. Plastics have invaded every part of our lives; it's nearly impossible to find anything that isn't made of or wrapped in plastic these days.

Even in clothes, every major brand I looked at has at least some, if not all, of its material made from polyester.


Plastic is an incredible engineering material though. It's just that it's commoditized to the point where nothing can compete with its properties at a given cost. So, everything is now made of plastic.

I wish plastic was an expensive material because it really is incredible and there are thousands of different blends.


100.kb.club

200.kb.club

Etc.


10240 bytes makes it the 10 KiB club, not kB.


I appreciate the attention to detail. This is something I wondered about too while deciding the transfer size limit. Finally, I decided that this website is neither 10 KiB Club nor 10 kB club but instead 10 KB Club where "KB" colloquially means 1024 bytes.

From https://en.wikipedia.org/wiki/Kilobyte,

  ---------------------------------------------------------
        Decimal       ||              Binary
  ---------------------------------------------------------
  Value |   Metric    || Value |     IEC      |    JEDEC
  ---------------------------------------------------------
  1000  | kB kilobyte || 1024  | KiB kibibyte | KB kilobyte
Also,

"In the International System of Units (SI) the prefix kilo means 1000; therefore, one kilobyte is 1000 bytes. The unit symbol is kB."

"The binary meaning of the kilobyte for 1024 bytes typically uses the symbol KB, with an uppercase letter K."

"In December 1998, the IEC addressed such multiple usages and definitions by creating prefixes such as kibi, mebi, gibi, etc., to unambiguously denote powers of 1024. Thus the kibibyte, symbol KiB, represents 210 bytes = 1024 bytes."


In my opinion you're hurting clarity. Continuing to use 1024 for "K" is, to me, clearly being on the wrong side of history.


Point taken and I will reconsider the choice of the unit. By the way, I got curious about how various tools deal with the KiB vs. kB vs. KB situation.

On macOS Catalina 10.15.7:

  $ head -c 92160 /dev/urandom > foo

  $ ls -l foo
  -rw-r--r--  1 susam  staff  92160 Dec 29 15:04 foo

  $ ls -lh foo
  -rw-r--r--  1 susam  staff    90K Dec 29 15:04 foo
On Debian GNU/Linux 10.6 (buster):

  $ head -c 92160 /dev/urandom > foo

  $ ls -l foo
  -rw-r--r-- 1 susam susam 92160 Dec 29 09:35 foo

  $ ls -lh foo
  -rw-r--r-- 1 susam susam 90K Dec 29 09:35 foo

  $ ls -lh --si foo
  -rw-r--r-- 1 susam susam 93k Dec 29 09:35 foo

  $ sudo cp foo /var/www/html/foo

  $ wget http://127.0.0.1/foo -O /dev/null
  --2020-12-29 09:41:05--  http://127.0.0.1/foo
  Connecting to 127.0.0.1:80... connected.
  HTTP request sent, awaiting response... 200 OK
  Length: 92160 (90K) [application/octet-stream]
  Saving to: '/dev/null'

  /dev/null           100%[===================>]  90.00K  --.-KB/s    in 0s

  2020-12-29 09:41:05 (845 MB/s) - '/dev/null' saved [92160/92160]
This 92 kB (90 KiB) file is also hosted temporarily at http://172.105.48.21/foo . When Firefox downloads it, it shows 90 KB in the Developer Tools > Network tab as well as in the download box [1]. However, after the file is downloaded, macOS Finder shows the file to be 92 KB [2].

[1] https://i.imgur.com/Y0q5anu.png

[2] https://i.imgur.com/muZKcZa.png


Some tools would break compatibility if fixed. That doesn't mean new tooling should also be broken.

Note that on OpenBSD "df" shows space in "blocks" by default, which pretty much nobody wants.

Hard drives have been sold in real units for decades.

And floppies of "1.44MB" were always 1000 times 1024 times 1.44 bytes. Madness, eh?


Peak HN pedantry. It doesn't roll off the tongue as easily as KB. :-)


"Shilling" rolls off the tongue better than "12 pence", but England still switched to decimal currency.

So make it 10000 bytes, then. That way it's easier to calculate how long it takes to transmit on the wire, since network speeds have always been 1000-based.

Edit: yes, I'm probably getting the pre-decimalization math wrong above, since I'm likely confusing pre-decimalization pence with post.


IIRC in pre-decimal Britain we used 'pence' a lot in spoken names for the discrete sub-shilling coins, but the names of the coins themselves all had distinct colloquial abbreviations for the amount, notably 'tuppence', 'thruppence' and 'ha'penny' for 2, 3 and half a penny respectively. 'Sixpence' - the half shilling - was pronounced as you might expect. The 3 penny coin (yes!) was colloquially called a 'thruppenny bit'. The pre-decimal abbreviation for a penny was 'd' (denarius) rather than 'p' (pence), so you would never say '2p' or '5p' as we would today. Decimal quantities such as 5 or 10 pennies didn't have dedicated coins and there was no special name for these amounts.

In 1971, a cheap paperback book was sold for two shillings and sixpence, which was said as 'two-and-six'. Decimalisation [0] was my first experience of price gouging, when the same paperbacks were suddenly repriced at 15p (~three shillings) instead of the 12.5p which was the decimal equivalent of their old price.

[0] https://en.wikipedia.org/wiki/Decimal_Day


You're right, this[0] is kb

0: wikipedia.org/wiki/KB_(rapper)


I see the value of these sites loading almost instantaneously on any device and connection, but other than the novelty value, I don't see anything else.

I'm not digging into the actual load speed in milliseconds or even the number of bytes loaded. Just using my human perception of loading times, I cannot see any difference between apple.com (which I assume is one of the worst offenders, or at the very least doesn't make the 10 KB club) and the sites listed on this site.


Is there any text-only Reddit API wrapper? https://teddit.net/ almost makes it but it has:

* images

* some custom font (just why...)

* and a slightly too bloated stylesheet

At work we have one huge React app with a JS bundle of multiple megabytes. I have wanted to make a text-only version, or one similar to teddit in size, for a long time, but management is not keen to allocate time for this...


There isn't really that much interesting you can do with 10 kB, it seems.


Here's chess in 1kb of Javascript: https://nanochess.org/archive/tiny_chess_1.html (from here: https://nanochess.org/chess.html)

In fact, I just started a resource limited chess program competition: https://rlc-chess.com


This is a feature, not a bug. Sites should be made "interesting" with content rather than form, IMO.


Well, just adding one single image would exceed the 10kB page weight.


IMO, a profile pic is better for an "about" page. The 10 KB Club excludes favicons and the like from page size, since they're non-blocking and not needed to read an article's text.


All images are non-blocking, not just favicons.

If a site uses the loading=lazy attribute they won't even be loaded until they're on the screen.

Background images in in-line CSS can block rendering in some older browsers but that's really the url() bit that's blocking rather than the image.


For what it's worth, even a large 256x256 PNG favicon can fit within 1 KB to 2 KB size if properly optimized, say, using ImageOptim or another similar tool.


> For what it's worth, even a large 256x256 PNG favicon can fit within 1 KB to 2 KB size if properly optimized, say, using ImageOptim or another similar tool.

This is a heavily optimised PNG favicon [0], which is only 192x192. But without trashing the palette, I can't really reduce it further, and it's already 2.7K. Neither pngcrush nor imageoptim can do much with it.

[0] https://shatterealm.netlify.app/android-chrome-192x192.png


I tried zopflipng [1] with the very expensive option suggested in its usage text and got 2,493 bytes (original 2,740 bytes; nowhere near practical, as it took 3 minutes), so you don't have to sacrifice the palette, though I'm not sure if a 16- or even 4-color palette would significantly alter the visuals.

[1] https://github.com/google/zopfli
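
For reference, the kind of invocation meant above looks something like this (the flag name is from memory, so check zopflipng's own usage output before relying on it):

  # Many iterations make the search much slower (hence the 3 minutes)
  # but squeeze out a little more compression without touching the palette.
  zopflipng --iterations=500 android-chrome-192x192.png optimized.png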


It's already an 8bit grayscale image - significantly less than a 4 colour palette.

And whilst the server might use state of the art compression... There's still no way a much larger image is going to fit in the proposed 1-2K limit.


> if properly optimized

It's more about the content of the image; optimization can only do so much. If it's pixel art or the palette is reduced to hell, then yeah, probably; otherwise I don't think you can display a 256x256 photo in 2 KB.


You can do many interesting things with 10 kB. One of the sites is a blog, which may be interesting, depending on the content.



