> - Less verbose than HTML / JSX, resulting in smaller file sizes
I always wondered: if you use reasonable compression (what are CDNs reasonably expected to provide these days? brotli? I'm so out of the loop..), how much does file size actually matter? Isn't it about entropy, rather than raw size? Say every HTML tag required 136 <'s to open and 136 >'s to close. How would that actually affect different compression algorithms?
My intuition says that we're all on this wild goose chase for smaller file size while it may (should) not matter at all.
For example, I took Wikipedia's page on entropy and replaced each < with 136 <'s, and likewise for >. Here are the file sizes for different algorithms:
I don't know what compression algo is de rigueur these days, and 136 <'s is obviously not the same as substituting paired tags for s-exprs. Still, I hope we can one day put this file size boondoggle in the perspective of entropy, instead of just mindlessly chasing the character count dragon.
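In case anyone wants to reproduce this, here's a rough sketch in Node (zlib ships deflate, gzip and brotli; "entropy.html" is just my stand-in for a locally saved copy of the Wikipedia page):

```ts
import { readFileSync } from "fs";
import { deflateSync, gzipSync, brotliCompressSync } from "zlib";

const html = readFileSync("entropy.html", "utf8");

// inflate every tag delimiter to 136 copies, as described above
const bloated = html
    .replace(/</g, "<".repeat(136))
    .replace(/>/g, ">".repeat(136));

// compare raw vs. compressed sizes for both versions
for (const [name, doc] of [["original", html], ["bloated", bloated]]) {
    console.log(
        name.padEnd(8),
        "raw:", doc.length,
        "deflate:", deflateSync(doc).length,
        "gzip:", gzipSync(doc).length,
        "brotli:", brotliCompressSync(doc).length
    );
}
```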
EDIT: turns out you can convert HTML to pseudo s-exprs with some regex. Here are more realistic numbers (less radical, but still following the trend):
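The regex pass was something along these lines (my guess at the approach, not the exact regexes used). It's lossy and the output isn't real, parseable s-exprs; it only rewrites tag syntax enough to make the size comparison meaningful:

```ts
// naive, lossy conversion used only for the size comparison:
// closing tags become "]", self-closing tags "[tag]", opening tags "[tag ..."
const toPseudoSexpr = (html: string) =>
    html
        .replace(/<\/[^>]+>/g, "]")
        .replace(/<([^>]+)\/>/g, "[$1]")
        .replace(/<([^>]+)>/g, "[$1 ");
```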
Agreed. I also don't see it as a key feature, but listed it more as a nod to the fact that the hiccup format is less verbose. Smaller file size also means less for humans to read (although, in general, that by itself doesn't mean better legibility :)
It may be less verbose, but it fails completely to render anything if javascript is not enabled (in the browser) on the domain the scripts are to be loaded from. Bad accessibility, bad degradation, bad practice for the web.
I'm sure it's fine for commercial JS web apps, but those are bad for the web too.
HTML is the web. All this pure es6/dom stuff is a cancer.
@danpeddle already pointed out the important part, but just as an addendum...
It all depends what one wants to do with a browser, doesn't it? For some of my projects the browser is merely a sandbox for delivering design tools and I really don't care about the HTML aspects of it at all.
For other projects, I only care about HTML generation and use hdom's sister library to generate static HTML from the same components:
If a user has JS disabled in the browser, fine. If not, then the browser app can hydrate the static HTML and add interactive features and cause cancer :)
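For reference, a minimal sketch of what that looks like, assuming @thi.ng/hiccup's serialize() on the server and @thi.ng/hdom's start() in the browser (check the current docs for exact signatures and options):

```ts
import { serialize } from "@thi.ng/hiccup"; // server-side HTML generation
import { start } from "@thi.ng/hdom";       // browser-side updates/hydration

// a component is just a function returning a hiccup-style array
const app = (name: string) => ["div#app", ["h1", `hello, ${name}`]];

// on the server: emit static HTML from the component
const html = serialize(app("world"));
// -> <div id="app"><h1>hello, world</h1></div>

// in the browser (if JS is enabled): run the same component,
// letting hdom take over the DOM and apply future updates
start(() => app("world"), { root: "app" });
```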
> It may be less verbose, but it fails completely to render anything if javascript is not enabled (in the browser) on the domain the scripts are to be loaded from. Bad accessibility, bad degradation, bad practice for the web.
I call BS on that. The accessibility tools that people actually use only work with "proper" browsers (counting IE here) that all support Javascript. Modern browsers don't even officially support disabling Javascript. For most web apps, the only feasible way to degrade from "no Javascript" is to display "this page requires Javascript". Spending any more effort here because it's "good practice" is a complete waste of time. Put that effort into testing with screen readers instead.
> HTML is the web.
Nonsense.
> All this pure es6/dom stuff is a cancer.
Hey, who are you to tell people what to use the web for? If somebody wants to serve as static HTML page, nobody is stopping them. Nobody is forcing people to use some Javascript framework for that.
> Modern browsers don't even officially support disabling Javascript.
That's right. The big browsers have all been co-opted (even Firefox) and now target Grandma browsing Facebook and other SPAs as their demographic. This is bad for the web but very good for corporations making money. These two things aren't very compatible, no matter how much most web devs are invested in not realizing there's a difference.
I'm telling you how I see it and how I design my sites. None of them require javascript to function (even my comment system). I can do this because I'm not being paid; I'm just doing it for fun. I understand that people being paid have to make bad websites, but that doesn't make it okay.
just wanted to point out that the joy of these approaches is that you can render on the server with the same components. the method for feeding it data obviously differs, but that's (when done right) pretty minimal.
my favourite way of demonstrating this was actually to turn js off and still see everything working. the progressive enhancement flag is still flying, and tools like this make it easier by far.
in this case, the server could be (soon!) java/clojure, which is insanely hard to do in an idiomatic, pleasant way with react components. really looking forward to getting this all wired up, and if anyone is interested we're discussing how to get hdom talking to CLJS here: https://github.com/thi-ng/umbrella/issues/36
I recall doing a benchmark of the Clang compiler that showed that parsing the source files takes much more time than any other stage of compilation.
Decompressing a larger amount of content on the client side, and then parsing it, is going to take more resources. It may be a minuscule amount for one page load, but multiplied by hundreds of requests and thousands of users, I can see why smaller file sizes can still be important.
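A quick way to get a feel for the browser-side parsing cost (numbers are obviously machine- and document-dependent, so this is only a rough sketch):

```ts
// rough micro-benchmark of HTML parsing cost in the browser
const html = "<p>hello</p>".repeat(100_000);

const t0 = performance.now();
const doc = new DOMParser().parseFromString(html, "text/html");
console.log(
    `parsed ${html.length} chars into ${doc.body.childElementCount} nodes`,
    `in ${(performance.now() - t0).toFixed(1)} ms`
);
```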
For me generally too, but I was afraid of the typical counterexamples: Perl one-liners, size coding in general...
I think there's a fine line between aiming to be lightweight and aiming for the smallest-whatever by sacrificing all other aspects. I'm definitely in the former camp...