Hacker News
Show HN: PerfAudit – Performance audit of caniuse.com (perfaudit.com)
38 points by chinchang 717 days ago | 12 comments



Cool to see someone else doing this.

If I were doing this myself, I'd suggest working on js.php. It's taking over 1s to get that single file, which appears [1] to be cacheable. An nginx reverse proxy or something comparable would make this way faster. If there are uncacheable parts of this file, those could be separated out.
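A minimal sketch of what that nginx caching layer could look like (the location path, backend address, and cache settings are assumptions for illustration, not caniuse's actual setup):

```nginx
# Hypothetical cache zone for the dynamic-but-cacheable js.php response
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=jscache:10m max_size=100m;

server {
    listen 80;

    location = /js.php {                    # assumed path of the slow script
        proxy_pass http://127.0.0.1:8080;   # assumed PHP backend
        proxy_cache jscache;
        proxy_cache_valid 200 10m;          # serve cached copies for 10 minutes
        add_header X-Cache-Status $upstream_cache_status;
    }
}
```

With something like this, only one request every 10 minutes actually hits PHP; everything else is served from the cache.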

It is interesting that the script is loaded with "defer", so the page is functional (but not ajax-y) before it has loaded. That makes the long load time not quite so bad. But I'd be interested to hear why the whole file isn't just cached.

Also, for a single-page website like this, where all navigations are loaded with AJAX, I'd suggest inlining all the CSS and JS.

Finally, I'd think about making the entire page itself cached. There doesn't appear to be any sort of login, so it's probably feasible. But I'm not sure how the Settings stuff is implemented; if that's persisted server-side then caching the page wouldn't work.

[1] The Cache-Control header specifies a max-age but not public/private. I honestly don't even know what semantics this has - but it's probably best to make it explicit.
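For example, being explicit would mean sending something like the following (the max-age value here is just a placeholder):

```
Cache-Control: public, max-age=86400
```

"public" tells shared caches (proxies, CDNs) they may store the response; "private" would restrict it to the user's browser cache.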


@strommen, [1] agree with making static resources cacheable, and even serving them from a CDN.

[2] Inlining all CSS and JS might not be a good idea, as it has several downsides, e.g. the inlined assets are no longer cacheable across pages and can't be fetched in parallel.

There are several other possible optimizations to reduce page load time that we could share, but the issue at hand that required immediate attention, and seemed to be doing serious damage to the user experience, was rendering performance, which is what we focused on in our discussion.


You could also try http://perfmonkey.com for automating these perf audits. It uses the same tools used by Chromium to collect perf telemetry - http://axemclion.github.com/browser-perf.

Here is a run for caniuse.com using the automated tool - http://www.perfmonkey.com/#/trynow/results/travis/53663113


Wanted to get an example on one of my projects. Sadly it requires write access to my public repositories. Sorry - no.


Like the concept of performance-auditing websites. Are you guys working as consultants?


Right now we are just looking for awesome websites to audit as case studies and share with developers. Open to audit requests.


We'd love an audit of https://sourcegraph.com! :)


You could try using http://perfmonkey.com to run a simple scroll test on your webpage. If you are looking for more powerful tests, try out http://github.com/axemclion/browser-perf.
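For reference, a browser-perf run from the command line looks roughly like this (the URL and Selenium endpoint are placeholders; check the project README for the exact flags supported by your version):

```shell
# browser-perf drives a real browser through Selenium, so a Selenium
# server must already be running at the given endpoint
npm install -g browser-perf
browser-perf https://sourcegraph.com \
  --browsers=chrome \
  --selenium=http://localhost:4444/wd/hub
```

It then reports metrics like mean frame time and frames per second for the scroll.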

Here is the result I got - http://www.perfmonkey.com/#/trynow/results/travis/53664194

Here is a test I ran on the Bootstrap CSS library across all its versions - http://axemclion.github.io/bootstrap-perf


Cool, thanks! We get a thumbs down for mean frame time. It'd be great to know where we fit (%ile). Do you think it's because we use Bootstrap and SVGs?


This is what we are trying to solve. With an automated tool you know that "your site is painting slowly or loading slowly", but the "why" is something that is still missing from such tools.


Not as detailed as a manual one, but here is a good start: https://www.dareboost.com/en/report/550828b4e4b074fd5732f006


Thanks @sqs, we'll get back to you soon with a #perfaudit of sourcegraph.com




