
Optimizing a Static Site - NicoJuicy
https://hackernoon.com/optimizing-a-static-site-d5ab6899f249?source
======
frfl
I have a hard time understanding why it is justifiable to use 1.8mb of
JavaScript to produce a static website. The author's site is plain text,
stylized, and opens a modal (with animation) when you click on a link. It
looks great, but I don't see why 1.8mb worth of JavaScript is required to
produce this (of which there's a good chance <5% is actual code for the site's
functionality/styling).

When it comes down to it, what you should be optimizing for is efficiency of
delivering information to your users. Animations don't help there in cases
like this. And I doubt the JavaScript is helping either.

EDIT: the 1.8mb figure is from the bundle.js file on the author's live website

~~~
jonluca
The reason there was so little effort on optimization in the first place is
that this isn't meant to be a revenue-generating site, or one that focuses on
"ultimate performance" - it's more of just a personal online resume. Any time
I needed something new, I'd just throw it in, because "it worked and looked
good on my machine". If I found a cool js snippet that needed an entire
library, I'd just throw it in. An argument could be made that that's part of
what's wrong with modern webdev, and I wouldn't necessarily disagree, but this
wasn't trying to be the fastest or most efficient site out there. I think
there's a pretty distinct difference between making a scalable, well-optimized
webapp and just making a personal site as an online resume.

This was me documenting my updates - I just revised the article with a few
more changes, and the total site size is now 250kb (99kb without images). That
still isn't great, but total page speed is pretty quick (js execution takes
~70ms on my 2016 MBP).

Thanks for the feedback!

~~~
frfl
I know JavaScript is cool and I'm not saying it's not a good idea to try
things out and experiment -- it is! But I personally feel an individual's
resume/personal site isn't what should stand out. Make it a true static site,
lose the bloated JavaScript, add some basic styling, and it will do what this
site is doing -- maybe even a little more, since people won't have to click on
things, wait, read, close, click, read, etc. Let your projects speak for your
skill set, not a fancy resume.

All the best.

------
iUsedToCode
"However, it’s only 829kb, and that includes every single non-image asset
(fonts, css, all libraries and dependencies, and js)."

Only 829KB? ONLY?

This[1] is a (dynamic) online course for beginner sport shooters in Poland;
each page takes about 15KB. And I still sometimes want to rewrite the CSS, as
it's currently 3 requests. It shouldn't make a difference on HTTP/2, but
that's a lame excuse.

It's not very fast (100-200ms) and maybe I should get a faster server --
there's room for improvement. But in my view, if it's static, it's supposed to
be as fast as a native app on your smartphone. 1MB of bullshit js that nobody
wants is not "ONLY".

Btw, maybe mine is lame and ugly. But I couldn't sleep if I made people load
1MB of fonts, JavaScript and CSS. It's just a shameful waste.

[1] [https://patentstrzelecki.eu/](https://patentstrzelecki.eu/) (Polish)

~~~
sbinthree
Agreed. I tried taking this to the extreme with my personal site. It's down to
~30ms on free Netlify for first and subsequent loads, always <10kb including
content, with one HTTPS request per page load. No external dependencies,
reasonably readable, 100% passing in Lighthouse. The highest-leverage things
are usually deleting anything font-related, reducing or removing media, and
GZIP/minifying. More obscure approaches include base64ing a fake favicon to
get rid of the extra request, eliminating all requests but the core HTML
resource, inlining everything, moving to a faster host with lots of CDN edges,
and testing aggressively on slow connections, eliminating until <100ms. Mostly
because it's fun, though it's also useful if you serve a user base in the
developing world.
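
A rough sketch of the "inline everything" end state (this is a made-up
minimal page, not my actual site): the icon is a data URI so the browser
never fires a request for /favicon.ico, and the CSS sits in a style tag so
the whole page is a single HTML response.

    <!doctype html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <!-- empty data URI suppresses the implicit /favicon.ico request -->
      <link rel="icon" href="data:,">
      <!-- all styling inlined: no separate stylesheet request -->
      <style>body{max-width:40em;margin:2em auto;font-family:sans-serif}</style>
    </head>
    <body>
      <h1>Example page</h1>
      <p>Entire page delivered in one request.</p>
    </body>
    </html>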

~~~
davewasthere
I also moved to Netlify (from a gh-pages/cloudflare setup). And while I'm a
huge fan, and loved the process, I'm curious why the Google crawler seems to
have found it a lot slower.

[https://i.imgur.com/RUssvAA.png](https://i.imgur.com/RUssvAA.png)

I have SSL enabled (as well as HSTS), and yes, the initial redirect seems to
take a while. But equally, the very first page request to the site feels quite
slow. Considering it's a static page of less than 4kB, I'd expect it to be a
bit snappier...

I have other reasons for trying Netlify, and I'm not too bothered about the
slowdown (my site has bugger-all traffic tbh), but I would like to get to the
bottom of it.

------
NightlyDev
I'm ashamed to be calling myself a web developer... What's up with everyone
including tons of code from others, code they don't even know?

"Wooah, we need to toggle a class here!" "We got jQuery for that!"

"We also need to.." "Oh, we got this for that!"

Take this page as an example. It's freaking huge, and the writer seems to
think he has done a good job optimizing it now.

The page is really simple, yet a 1.66 MB JS bundle is still required before
the page can even begin rendering.

And the page still doesn't work properly if you do something as simple as
open an external link and press the back button to get back.

I know of a JS framework that might fix it, and it's only 7.6 MB. /s

Seriously, people should write about stuff they know.
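
For what it's worth, the class-toggle case above doesn't need a library at
all. A one-line sketch, using a hypothetical .menu element as a stand-in for
whatever needs toggling:

    // plain-DOM equivalent of $('.menu').toggleClass('open')
    document.querySelector('.menu').classList.toggle('open');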

~~~
jonluca
Thanks for the reply. I agree that 1.66mb is insane for what my site does, and
I'm working on minifying it as much as possible. I got it down to ~90kb, which
is still not amazing, but is significantly better than 1.6mb. Fonts were
taking up more than 90% of that, which I just stripped.
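
One common way to strip web fonts entirely (a sketch, not necessarily exactly
what I did) is to fall back to a system font stack, so no font files are
downloaded at all:

    /* no webfont request: use whatever the OS already ships */
    body {
      font-family: -apple-system, BlinkMacSystemFont, "Segoe UI",
                   Roboto, "Helvetica Neue", Arial, sans-serif;
    }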

I do disagree with the idea that including code from others is a bad
practice/smell. The added load time of using a library like jQuery is well
worth the functionality it provides, IMO.

Thanks for the comment and for pointing out a problem with the article, though
- I've revised it!

~~~
tinymollusk
Just a quick vote to reinforce what you did. You made something, and asked for
the most absurd audience to critique it.

Keep being you. HN hated Dropbox, of all things. I am hesitant to announce my
product here because I struggle with the kind of widespread negativity I
perceive here.

If someone isn't going to buy what you're selling, I feel they should be
silent and just not buy. If they would buy with a small tweak, there are ways
for Bad PMs to put up their credit card. Otherwise, ignore their advice,
ignore mine, and keep pushing.

For the people who ask for more, yeah. There's always room for improvement.
But if you take down people within our community, you can go sit and spin.

------
atonse
For a static personal site like this one, I'd recommend using Caddy, which
would've done most of these things out of the box.
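
A minimal Caddyfile sketch of what that looks like (Caddy v2 syntax; the
domain and path are placeholders) - automatic HTTPS comes for free, and
compression and caching headers are one line each:

    example.com {
        root * /var/www/site
        encode zstd gzip                              # compress responses
        file_server                                   # serve the static files
        header Cache-Control "public, max-age=3600"
    }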

~~~
dotsh
nginx + ngx_pagespeed module also.

------
mlok
Talking about site speed comparisons: in order not to have results dependent
on one's internet connection speed, would the Firefox Dev Tools > "Responsive
Mode" > throttling tool be a reliable way to benchmark sites?

Anyone using this?

(For example, test all sites on a "regular 2G network" simulation and compare
the times.)

~~~
QasimK
That's pretty much what it's designed for.

I think I remember reading about some esoteric differences from real life -
related to dropped packets vs. merely rate-limiting the connection with
HTTP/2.

------
pepe56
How would you actually strip unneeded JavaScript anyway? Let's imagine I have
a setup like the OP's and want to optimize it. Is there something that would
be able to strip out all the JS functions that aren't needed by the site?

~~~
jonluca
Webpack does dead code elimination through UglifyJS; see
[https://webpack.js.org/guides/tree-shaking/](https://webpack.js.org/guides/tree-shaking/)
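
In practice, tree shaking kicks in when you use ES module imports and build
in production mode. A minimal sketch (the file names are illustrative):

    // math.js - cube() is exported but never imported anywhere
    export function square(x) { return x * x; }
    export function cube(x) { return x * x * x; }

    // index.js - only square() is pulled into the bundle
    import { square } from './math.js';
    console.log(square(4));

    // webpack.config.js - production mode enables minification and
    // dead code elimination, so cube() is dropped from the output
    module.exports = {
      entry: './index.js',
      mode: 'production'
    };

Marking the package "sideEffects": false in package.json also lets webpack
prune whole unused modules.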

------
uluyol
I've found link preload headers to be pretty effective at reducing the number
of round-trips necessary to load my website.
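
For example, a preload hint can be sent as a response header or placed in the
HTML head (the file names here are placeholders):

    Link: </css/style.css>; rel=preload; as=style

    <link rel="preload" href="/fonts/body.woff2" as="font" type="font/woff2" crossorigin>

The browser starts fetching the asset as soon as it sees the hint, instead of
waiting to discover it while parsing CSS or executing JS.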

