
Optimizing our website at the millisecond level - vvoyer
https://blog.algolia.com/improving-web-performance-to-mirror-engine-speed/
======
prophesi
I'd add one more section to this blog post: Disable Trackers.

It's a bit funny how they go to such lengths to cut down their page load,
only to have it doubled by third-party analytics. Regardless, a great blog
post on best optimization practices.

~~~
Jonas_ba
Indeed, right now our trackers represent quite a big chunk of the website
size, but we made sure they are all loaded async so they don't impact the
site that much. It's something we are aware of and have pushed for a bit
internally, but since we are still evaluating tools and learning, we decided
to keep them for now. I'm sure that once we have the right one, we'll
decrease that part quite a bit :)

~~~
jacquesm
I've gone down this same optimization route for some of my sites, and disabling
trackers was one of the biggest contributors. If you have to have trackers,
consider loading them only for a small percentage of your users to get a
statistically significant sample. This will at least give the bulk of your
users a vastly better experience. You'll need to scale the percentage of
users you do put the trackers on with your traffic.
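A minimal sketch of this sampling idea, assuming a persistent user or session id is available (all names and the bucketing scheme here are hypothetical, not from the post). Hashing a stable id, rather than calling Math.random() on every page view, keeps each user consistently in or out of the sample:

```javascript
// Map an id string to a stable bucket in 0-99 using FNV-1a.
// Not cryptographic -- just a cheap, deterministic hash.
function hashToPercent(id) {
  let h = 0x811c9dc5;
  for (let i = 0; i < id.length; i++) {
    h ^= id.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h % 100;
}

// Load trackers only for `samplePercent` percent of users.
function shouldLoadTrackers(userId, samplePercent) {
  return hashToPercent(userId) < samplePercent;
}

// Usage sketch: only ~5% of users would get the tracker scripts.
// if (shouldLoadTrackers(sessionId, 5)) { /* inject tracker <script> tags */ }
```

One caveat, echoing the reply below: the sampled users still pay the tracker cost, so the sample measures the slower experience, not the typical one.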

~~~
boreas
That's an interesting idea. It seems like it would be hard to get a
representative sample, though, if the trackers themselves affect the user
experience.

------
tschellenbach
Great to see them mentioning
[https://www.webpagetest.org/](https://www.webpagetest.org/) one of my
favorite tools for optimizing frontend performance.

~~~
xcasperx
Surprised they didn't mention [https://gtmetrix.com](https://gtmetrix.com).

The problem I have with lighthouse is that it emphasizes total page load,
instead of first paint.

~~~
magicalist
> _The problem I have with lighthouse is that it emphasizes total page load,
> instead of first paint._

Not sure what you mean about lighthouse? First Meaningful Paint is featured
prominently in the performance scoring.

~~~
xcasperx
You're right, a couple of months ago it didn't have that.

------
tabeth
Nice post, but one has to wonder if it is self-defeating. In the end, "speed"
is just a matter of expectation management. Suppose you had a website that
could complete all actions instantaneously. At this point I imagine speed
would simply be redefined as the amount of time it takes you to find the
desired action.

Surely there's a point where it's easier to just create a culture of patience
than to pursue the diminishing returns of shaving off milliseconds.

TLDR: Is the better experience carefully crafted expectation management or
simply being faster?

~~~
wongarsu
In terms of workflow, faster always beats patience. Above one second it feels
like waiting on the computer, which impedes workflow. Below ~200 milliseconds
it feels instant.

In most applications, going from 200 milliseconds to 100 milliseconds doesn't
accomplish anything, but going from 1 second to 500 milliseconds allows for a
better workflow and more exploration (waiting is a huge discouragement to
exploration).

~~~
tabeth
I'd argue that "exploration" is a failure to identify intent for most
applications. This is probably the hardest UI/UX problem.

Would you rather select a series of instantaneous actions, or select a single
"Do" button that, within 5 seconds, completes whatever you want, magically?

It obviously depends on how long it takes you to select the actions, but I
think it's an interesting thought experiment.

~~~
wongarsu
I would agree that most exploration is a failure to identify intent. But in
many cases, even the user would be unable to express his intent. Trying out
various actions/settings and seeing the result is a powerful tool to let the
user realize what he even wants.

