
How we cut our site’s load time by 87% - bubble_boi
https://medium.com/@david.gilbertson/how-we-cut-our-sites-load-time-by-87-383d2f43d6c3#.n4sihdet2
======
jamesroseman
This article mostly covers a pretty impressive load-time reduction through
caching, but I was left a bit worried by its implication. I've always been
taught that you build your app and if the load times are prohibitively
expensive, investigate where you're hanging and then maybe involve caching.

It seems like the article is just blanket recommending caching as a load-time
fix, which can introduce overhead. Anecdotal case in point: I used to run a
web app for fellow college students to get their cheapest textbooks. I saw
that the load times for the page were so long (2+ seconds) that people were
navigating off before it had even loaded. And, most of the time, they were
just trying to go back to see what they had looked up for their courses. So I
started using localStorage to hold onto loaded data on the client. It
went great... until the next semester when everyone had the old cached data. I
misappropriated localStorage and, because of it, screwed up royally.

I'm hardly advocating against caching, obviously; I just hope there's nobody
currently working on an unfinished app who implements caching far before it's
needed (and in my case, incorrectly).
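For what it's worth, the stale-data problem above is avoidable by stamping each cached entry with a version and treating mismatches as cache misses. A minimal sketch (the key names and the `SEMESTER_VERSION` constant are hypothetical, not from my actual app):

```javascript
// Hypothetical version tag; bump it whenever the underlying data
// changes (e.g. each semester) so old entries stop being served.
const SEMESTER_VERSION = '2017-spring';

function saveLookup(key, data) {
  // Wrap the payload with the version it was written under.
  localStorage.setItem(key, JSON.stringify({ v: SEMESTER_VERSION, data }));
}

function loadLookup(key) {
  const raw = localStorage.getItem(key);
  if (raw === null) return null;
  const entry = JSON.parse(raw);
  // Entries written under an older version count as cache misses.
  return entry.v === SEMESTER_VERSION ? entry.data : null;
}
```

That way a new semester invalidates everything automatically instead of requiring users to clear their storage.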

> So how do I measure?

> Stopwatch. Multiple runs. Median.

> If it gets to the point where you can’t press the stopwatch buttons quickly
> enough, it’s time to go home.

This part confused me. Isn't it fairly trivial to use something like
`selenium-webdriver` to drive the page and time the runs from a script,
rather than a physical stopwatch?
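Even without a browser driver, the stopwatch/multiple-runs/median procedure is easy to automate. A sketch, where `loadPage` is a stand-in for whatever actually drives the browser (e.g. a `selenium-webdriver` `driver.get(url)` call):

```javascript
// Time `loadPage` over several runs and return the median, in ms.
// Median rather than mean, so one slow outlier run doesn't skew the result.
function medianLoadTimeMs(loadPage, runs = 5) {
  const samples = [];
  for (let i = 0; i < runs; i++) {
    const start = Date.now();
    loadPage();
    samples.push(Date.now() - start);
  }
  samples.sort((a, b) => a - b);
  const mid = Math.floor(samples.length / 2);
  // Even run count: average the two middle samples.
  return samples.length % 2 ? samples[mid] : (samples[mid - 1] + samples[mid]) / 2;
}
```

For async page loads you'd `await` inside the loop instead, but the median-of-runs idea is the same.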

