The main CSS file they're using is https://www.w3.org/2008/site/css/minimum. If you download it with `curl 'https://www.w3.org/2008/site/css/minimum' -H 'Accept-Encoding: gzip, deflate, br'`, you'll see that it starts with 03 D7 86 1F 8B 08 08 and ends with 25 10 91 4D 7B 30 00 00 03. That's a brotli metablock header (03 D7 86: length = 3503 bytes, stored uncompressed), followed by a gzip header (1F 8B 08 08: magic bytes, compression method = deflate, FNAME flag set, i.e. a filename follows), and it ends with a gzip trailer (25 10 91 4D 7B 30 00 00: CRC-32, then the uncompressed size) and finally an empty brotli metablock header (03).
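If you want to reproduce this, here's a minimal Python sketch that peels the two layers apart (assuming the double-encoded response is still being served, and assuming the third-party `brotli` package):

```python
import gzip
import urllib.request

import brotli  # third-party: pip install brotli

req = urllib.request.Request(
    "https://www.w3.org/2008/site/css/minimum",
    # urllib sends no Accept-Encoding by default and never decodes the
    # response body, so this gives us the bytes exactly as sent on the wire.
    headers={"Accept-Encoding": "gzip, deflate, br"},
)
raw = urllib.request.urlopen(req).read()

print(raw[:7].hex(" "))        # expect: 03 d7 86 1f 8b 08 08
gz = brotli.decompress(raw)    # peel the outer brotli layer...
assert gz[:2] == b"\x1f\x8b"   # ...which leaves a gzip file
css = gzip.decompress(gz)      # peel the inner gzip layer
print(css[:40])                # plain CSS at last
```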
So, what's happening is that they're serving gzip files from their server (hinted at by the "content-location: minimum.css.gz" response header), and those are being compressed again with brotli somewhere else (e.g. at a reverse proxy).
My guess is they'd been serving their CSS file straight from disk with no Content-Encoding header, but browsers rolled their eyes and decoded it anyway (do they do that?). Now it's served behind a reverse proxy which isn't so forgiving and recompresses it with brotli (does it even reduce the size?), and that's one layer too many for the browser to infer implicitly. Fin.
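Actually, the bytes in the top comment suggest an answer to the size question: no. The brotli layer stored the gzip data in an uncompressed metablock, i.e. it added a few bytes of framing without saving any. A quick sketch to check, reusing `gz` (the inner gzip payload) from the snippet above:

```python
import brotli

# gz is the inner gzip payload from the earlier sketch; the proxy's encoder
# evidently gave up and stored it in an uncompressed metablock.
recompressed = brotli.compress(gz)
print(len(gz), len(recompressed))  # already-deflated data barely shrinks, if at all
```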
Worth noting: if you're not seeing the brotli header, it's probably because something (e.g. your browser) transparently decoded the declared Content-Encoding (which is `br`, for brotli). That leaves you with the raw gzip data: the user agent has already undone the Content-Encoding once, and it isn't going to guess that the result needs decoding again.
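For example, Python's `requests` behaves this way (it decodes `br` transparently when a brotli package is installed; without one, you'd see the raw brotli bytes instead):

```python
import requests  # pip install requests brotli

r = requests.get("https://www.w3.org/2008/site/css/minimum")
print(r.headers.get("Content-Encoding"))  # "br"; the header survives decoding
print(r.content[:2] == b"\x1f\x8b")       # True while the bug is live: the body,
                                          # with br undone, is still a gzip file
```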
Interestingly, at least with Firefox on Linux, it seems to decompress the .css files OK... but it borks on the JavaScript file that has the same incorrect Content-Encoding header.
Responsive? I have to scroll past the left column ("Site Navigation") to read anything. That's the opposite of responsive (e.g. 2 columns on wider screens, 1 on mobile).
There's ... something ... to be said for a site with minimal markup which works in text-mode browsers.
Though direct usage of those is probably a tiny sliver of all online usage, it also means that automated tools can work with the website. Scraping has both dark and light use-cases.
I'd hardly call it a trend, because it's been a thing for as long as I've used the web; but some 'classic'-style programmers seem to opt out of CSS for their personal sites. One such example is Dan Luu [0], who wrote in a post [1] that styling decreased his viewership.
Additionally, it can emphasize good structure that was concealed by the CSS.
By the way, has anyone used the eaf-browser? I use the other eaf applications like PDF viewer, but for some reason, the eaf-browser seems to eat up a lot of resources.
That's because eaf-browser just uses Chrome in a Docker container, does screencaps -> OCR, then runs spellcheck, plus ChatGPT on top for sanity checks, then copies it all back from an AWS instance.
As always when styling comes up, there are a few people on HN who claim to prefer "no CSS" because they "always" use Lynx, or deeply customize all content on their own from scratch, or are using a 1G flip phone, or care about milliseconds of loading time, or they're RMS (oh, wait, stallman.css exists!).
Sure, some idiosyncratic blogs can get away with that CSS-free "look" (e.g. https://danluu.com/). The W3C? You'd think not, but now I guess so!
But with a couple of keystrokes that are in my muscle memory I can narrow my browser window to any reading width I want. So there’s no problem, and the site is easier to read than almost all modern websites.
As someone with a visual impairment, when I open that website I see someone who wants to put weird internet idealism above not giving me eye strain. This attitude is a bit of old-school internet culture that I'm glad to see die off.
Originally it was the job of browsers to provide good, readable, non-eye-straining, user-controllable default styles. During the browser wars it was decided that few websites "wanted" good defaults; they'd rather have "no defaults", so that every website could be styled as its own special snowflake (to meet brand goals or design-team whims), and default styles became a lowest common denominator trapped in the '90s.
Eventually browsers rediscovered the need to provide good defaults, and now call it "Reading Mode", forcing users to opt in to that behavior on a page-by-page basis.
Something I've been wondering lately is whether there should be a way to signal to a browser: "this is an intentionally minimal HTML page; please style it by default as if the user had chosen Reading Mode".
It's a testament to good structure that the site is legible without its styling.
... legible, not good. If I had to read documentation that looked like that all day I'd consider a career change (or, perhaps, building an infrastructure to improve the legibility of web pages...).
* Focus indicators
* Text block width
* Target sizes
Apart from that, even if it were there, there's some trouble with the markup:
* Missing <abbr> tags
* No landmark elements
* Bloody tabindex
* The same div repeated over and over
* Search input with no visible label
* No context-sensitive help or whatnot
It would've been the last thing WWW Inc. still has influence over on the web.
And even that has been a joke (of the consider-your-career-choices variety) from the beginning, while today it just hangs on, annoyingly, for job security.
At least their complaining about being financially dependent on Google, and about being part of the web-standards circus that prevents real independent standardization, makes for an entertaining read lately [1], in case you had any doubt that Google is the one calling the shots.
I like it "better this way" conceptually, in that I like browsing with w3m/lynx/links/eww/etc but the content structure is bad "this way" (ex: scrolling past site nav links too much before I reach actual content; most of the vertical nav should and would be horizontal if this weren't a bug; etc).