
Never-Slow Mode a.k.a. Slightly-Fast Mode - espeed
https://github.com/slightlyoff/never_slow_mode
======
Felz
This seems a bit like a complication that browsers shouldn't be handling? Like
really, do we need this feature baked into the web forever, just to make it
easier for websites to audit their performance?

A cynical part of me notes that adding more complexity to browsers just makes
Chrome monoculture easier: I have no doubt that Google can win any war of
"throw developers at it". But it feels like they're building a complexity debt
nightmare that we'll all suffer from.

Edit: I guess the other and more direct angle for why Google would push this
is that they can give "fast" sites a ranking boost in search, i.e. dictate
what web pages do via soft power to make their job easier.

~~~
wmf
I think the real motivation is to allow Google to use guaranteed never-
slowness as an unspoofable ranking signal, essentially creating an alternative
to AMP that doesn't require special tags or libraries.

~~~
Felz
Yea, that's kind of what I meant to say, although I wasn't thinking about AMP.
Can Google really not use the vast amounts of data it ingests to deduce which
sites have bad user experiences and are slow, though? Shouldn't that be their
core competency?

~~~
wmf
Yes, Google's crawler already calculates page speed, yet they've observed that
few popular sites are fast. Another aspect of AMP and NSM that I didn't
mention is that they are binary; this prevents sites from adding just a little
more bloat, and then a little more after that, until they're quite slow.

------
stefan_
This does not remotely belong in an HTTP header. DevTools or CI is more like it.

In any case, it is impossible to prove that your JS code will not somehow
exceed these limits on a rarely exercised execution path. So I recommend
turning this on in production as soon as the halting problem is solved.

~~~
hermanradtke
Isn’t it also impossible to prove that your code won’t violate CSP on a rarely
exercised execution path?

~~~
paulddraper
CSP has the report-uri directive, which helps mitigate that... and it's pretty
clear how something violates CSP.
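
For reference, reporting is just one more directive on the header; the policy
and endpoint path below are illustrative, not from the thread:

```http
Content-Security-Policy: default-src 'self'; report-uri /csp-reports
```

When the policy is violated, the browser POSTs a JSON violation report to that
path, so regressions on rarely exercised paths still surface in production.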

~~~
hermanradtke
I am pretty sure I saw the same reporting behavior with this idea.

------
eridius
What's the likelihood that a mode like this will lead to sites starting to
change their design to encourage more scrolling/tapping/clicking, simply so
they get a larger resource budget? "Oh we could just show this information,
but if we put it behind a 'tap here to see more' link we'll be able to load 10
more images!"

Also, if scrolling increases the budget, doesn't that penalize sites for being
loaded in larger browser windows as the user has to scroll less?

------
danShumway
Why have a standardized definition of what a "fast" website is, when Google
can just impose and publish its own standards for Search, or even implement
its own nontransparent standards in the back end if it's worried about metrics
becoming targets? I don't mean that as a sarcastic question, I don't see any
problem at all with a private approach.

This isn't really what people like me meant when we said we wanted Google to
use performance metrics in their ranking. We wanted Google to come up with its
_own_ internal metrics, not standardize them across the entire web. Because
performance isn't standard, it's extremely context-specific, and it's
important that sites like search engines and browsers be able to iterate
quickly on their metrics and to change them depending on what the ecosystem
currently looks like.

I genuinely just don't understand what value standardization has here other
than to make the system easier to game and less flexible in the future.

What positive, consumer-friendly, non-scammy things are possible for us to do
with a universally standardized set of performance criteria that we couldn't
do with privately maintained, non-standardized sets of criteria?

------
underwater
The austerity approach to page performance works for Google because they
control their whole stack, they have world-class devs who can devote time to
fine-tuning products, _and_ their bread and butter is in small, focused tools
(search is an input and a list).

Other big tech companies like Facebook or Netflix don't follow Google's
approach, because most products are sprawling, and built by many teams and
many engineers. Keeping a small, tight codebase under those conditions
doesn't work.

Contrast Never Slow mode with what the React folk are doing with React Fiber.
They have rewritten their engine to allow JS execution that doesn't block the
main thread. This is much more pragmatic than setting budgets and simply
telling devs to try harder.

------
londons_explore
I think this proposal looks good.

Too many people shovel on slow JavaScript libraries and make the user
experience worse. Speed is a tragedy-of-the-commons issue - it's time serious
action was taken to resolve it.

------
JoachimSchipper
It’s unfortunate that, under this proposal, one may end up splitting an
application into multiple iframes to get “enough” budget (or, somewhat more
nasty, an embedded iframe could decide to add a second internal iframe).

I could imagine an <iframe allow-slow-performance-shares=33%>, and leaving the
iframe to figure out how to subdivide this among itself and its children. (And
something similar for the main document.)

Deploying this will be hard, since the slowest devices also tend to run
outdated browsers (and thus won’t enforce limits anytime soon).

------
csande17
This proposal looks pretty great; it's what AMP should have been!

------
zzo38computer
I think such features should instead be configurable only by the user, per
URL, with customizable thresholds, and that no reporting should be
implemented.

------
londons_explore
What about a revised proposal:

No line of JavaScript may execute more than 200 milliseconds after the last
user touch/tap.

That would still let video sites play videos, but everything else would have
to be 'ready to view' within 200ms. Let the developer decide if they want to
achieve that by making resources smaller, having fewer resources, getting rid
of js frameworks, etc.

Sites which _need_ more time could demand the user hold a finger on the screen
while they load up their heavy frameworks, but that would be a clear "we are
slow" indicator the site would want to avoid.
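
As a sketch of how a page might self-enforce such a rule (my own illustration,
not part of any proposal; all names are invented), queued work could be gated
on the time elapsed since the last user interaction:

```javascript
// Sketch: only run JS inside the 200 ms window that follows the most
// recent touch/tap; anything scheduled outside the window is deferred
// until the next interaction reopens it.
const BUDGET_MS = 200;

function makeScheduler(now = Date.now) {
  let lastInteraction = now();
  const deferred = [];

  return {
    // Call from a touch/tap handler: opens a fresh 200 ms window and
    // flushes work that was deferred while the previous window was shut.
    onInteraction() {
      lastInteraction = now();
      while (deferred.length && now() - lastInteraction <= BUDGET_MS) {
        deferred.shift()();
      }
    },
    // Run the task immediately if we are inside the window; otherwise
    // hold it for the next interaction. Returns whether it ran now.
    run(task) {
      if (now() - lastInteraction <= BUDGET_MS) {
        task();
        return true;
      }
      deferred.push(task);
      return false;
    },
  };
}
```

Under this sketch, the "hold a finger on the screen" trick amounts to calling
onInteraction repeatedly so the window never closes.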

~~~
gioele
> No line of JavaScript may execute more than 200 milliseconds after the last
> user touch/tap.

This is how Opera Mini "implements" JavaScript. The Opera Mini server will run
whatever JavaScript code there is in the page for ~2 seconds. Whatever can be
seen on the screen at that point is sent to the Opera Mini client.

It works well enough for things like SPAs and "click here to show more".

