

Show HN: PerfAudit – Performance audit of caniuse.com - chinchang
http://perfaudit.com/case-study/caniuse.com/

======
strommen
Cool to see someone else doing this.

If I were doing this myself, I'd start with js.php. It's taking over 1s
to get that single file, which appears [1] cacheable. An nginx reverse proxy
or something comparable would make this way faster. If there are uncacheable
parts of this file, those could be separated out.
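
A minimal sketch of what that nginx reverse-proxy cache could look like (the upstream name, cache path, zone, and TTL are all illustrative assumptions, not taken from caniuse.com's actual setup):

```nginx
# Illustrative cache zone; path, size, and TTL are guesses, not measured values.
proxy_cache_path /var/cache/nginx/js levels=1:2 keys_zone=js_cache:10m max_size=100m;

server {
    listen 80;
    server_name caniuse.com;

    location = /js.php {
        proxy_pass http://backend;            # hypothetical upstream app server
        proxy_cache js_cache;
        proxy_cache_valid 200 10m;            # serve cached copies for 10 minutes
        add_header X-Cache-Status $upstream_cache_status;  # HIT/MISS for debugging
    }
}
```

With something like this in place, only the first request per TTL window hits PHP; everyone else gets the cached copy straight from nginx.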

It is interesting that the script is loaded async with "defer", so the page is
functional (but not ajax-y) before it's loaded. So the long load time isn't
quite so bad. But I'd be interested to hear why the whole file isn't just
cached.

Also, for a single-page website like this, where all navigations are loaded
with AJAX, I'd suggest inlining all the CSS and JS.

Finally, I'd think about making the entire page itself cached. There doesn't
appear to be any sort of login, so it's probably feasible. But I'm not sure
how the Settings stuff is implemented; if that's persisted server-side then
caching the page wouldn't work.

[1] The Cache-Control header specifies a max-age but not public/private. I
honestly don't even know what semantics this has - but it's probably best to
make it explicit.
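
For example, an explicit header could look something like this (the max-age value is just illustrative, not what the site actually sends):

```
Cache-Control: public, max-age=86400
```

With `public` stated explicitly, shared caches (CDNs, proxies) are unambiguously allowed to store the response.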

~~~
apoorvsaxena
@strommen, [1] Agreed on making static resources cacheable, and even serving
them from a CDN.

[2] Inlining all CSS and JS might not be a good idea, as it brings several
problems of its own: the inlined resources can't be cached independently, you
lose parallel resource requests, etc.

There are several other optimizations that could reduce the page load time,
and we could share those too. But the issue at hand that required immediate
attention, and that seemed to be doing serious damage to the user experience,
was rendering performance, which is what we focused on in our discussion.

------
axemclion
You could also try [http://perfmonkey.com](http://perfmonkey.com) for
automating these perf audits. It uses the same tools used by Chromium for
collecting perf telemetry -
[http://axemclion.github.com/browser-perf](http://axemclion.github.com/browser-perf).

Here is a run for canIuse using the automated tool -
[http://www.perfmonkey.com/#/trynow/results/travis/53663113](http://www.perfmonkey.com/#/trynow/results/travis/53663113)

~~~
laacz
Wanted to try it on one of my projects. Sadly, it requires write access
to my public repositories. Sorry - no.

------
roothacker
I like the concept of performance-auditing websites. Are you guys working as
consultants?

~~~
chinchang
Right now we are just looking for awesome websites to audit as case studies
and share with developers. Open to audit requests.

~~~
sqs
We'd love an audit of [https://sourcegraph.com](https://sourcegraph.com)! :)

~~~
axemclion
You could try using [http://perfmonkey.com](http://perfmonkey.com) to run a
simple scroll test on your webpage. If you are looking for more powerful tests,
try out
[http://github.com/axemclion/browser-perf](http://github.com/axemclion/browser-perf).

Here is the result I got -
[http://www.perfmonkey.com/#/trynow/results/travis/53664194](http://www.perfmonkey.com/#/trynow/results/travis/53664194)

Here is a test I ran on the Bootstrap CSS library over all its versions -
[http://axemclion.github.io/bootstrap-perf](http://axemclion.github.io/bootstrap-perf)

~~~
sqs
Cool, thanks! We get a thumbs down for mean frame time. It'd be great to know
where we fit (%ile). Do you think it's because we use Bootstrap and SVGs?

~~~
chinchang
This is what we are trying to solve. With an automated tool you learn that
"your site is painting slowly or loading slowly", but the "why" is still
missing from such tools.

