If I was doing this myself, I'd suggest working on js.php. It's taking over 1s to get that single file, which appears [1] cacheable. An nginx reverse proxy or something comparable would make this way faster. If there are uncacheable parts of this file, then those could be separated out.
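A minimal nginx proxy-cache sketch of what that could look like (assuming the app listens on port 8080 behind nginx; the zone name `app_cache` and all paths/sizes are illustrative, not taken from the actual site):

```nginx
# Define an on-disk cache zone (path and sizes are illustrative).
proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=app_cache:10m
                 max_size=1g inactive=60m;

server {
    listen 80;

    # Cache only the slow, cacheable script; everything else passes through.
    location = /js.php {
        proxy_pass http://127.0.0.1:8080;
        proxy_cache app_cache;
        proxy_cache_valid 200 10m;  # serve cached copies for 10 minutes
        # Expose HIT/MISS so you can verify the cache is working.
        add_header X-Cache-Status $upstream_cache_status;
    }

    location / {
        proxy_pass http://127.0.0.1:8080;
    }
}
```

With this in place, only the first request per cache period pays the 1s+ generation cost; subsequent requests are served from disk by nginx.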
It is interesting that the script is loaded async with "defer", so the page is functional (but not ajax-y) before it's loaded. So the long load time isn't quite so bad. But I'd be interested to hear why the whole file isn't just cached.
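For reference, `defer` (unlike `async`) downloads the script in parallel with HTML parsing but delays execution until parsing finishes, which is why the page renders before the script runs. The markup presumably looks something like:

```html
<!-- defer: fetched in parallel with parsing, executed in document order
     after parsing completes (just before DOMContentLoaded fires).
     async, by contrast, executes as soon as the bytes arrive. -->
<script src="js.php" defer></script>
```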
Also, for a single-page website like this, where all navigations are loaded with AJAX, I'd suggest inlining all the CSS and JS.
Finally, I'd think about making the entire page itself cached. There doesn't appear to be any sort of login, so it's probably feasible. But I'm not sure how the Settings stuff is implemented; if that's persisted server-side then caching the page wouldn't work.
[1] The Cache-Control header specifies a max-age but not public/private. I honestly don't even know what semantics this has - but it's probably best to make it explicit.
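For what it's worth, per the HTTP caching spec (RFC 9111), a response with `max-age` and no `private` directive is already cacheable by shared caches by default, but spelling it out removes the ambiguity. Since the file is PHP, a sketch of the explicit version (the one-day lifetime is an assumed value, not taken from the site):

```php
<?php
// Explicitly mark the response as cacheable by shared caches for a day.
// max-age alone already permits shared caching per RFC 9111, but being
// explicit makes the intent unmistakable.
header('Cache-Control: public, max-age=86400');
```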
@strommen, [1] I agree with making static resources cacheable, and even serving them from a CDN.
[2] Inlining all CSS and JS might not be a good idea, as it comes with several drawbacks, e.g. the assets can no longer be cached separately, and you lose parallel resource requests.
There are several other optimizations we could share to reduce page load time. However, the issue at hand, the one that required immediate attention and seemed to be seriously damaging the user experience, was rendering performance, and that is what we focused on in our discussion.
Cool, thanks! We get a thumbs down for mean frame time. It'd be great to know where we fit (%ile). Do you think it's because we use Bootstrap and SVGs?
This is what we are trying to solve. With an automated tool you learn that "your site is painting slowly or loading slowly", but the "why" is still missing from such tools.