
Passive event listeners - franze
https://github.com/WICG/EventListenerOptions/blob/gh-pages/explainer.md
======
Jonas_ba
While passive event listeners are an improvement, they are not a complete
solution. It's more like the "least of evils". There is a new primitive
being implemented called IntersectionObserver which solves most of the
problems of detecting when something is in view, and the intersection
calculations are done by the browser off the main thread. So you don't need
to do any computations, window-size reading, or other magic to detect
whether an element is inside the viewport. We are currently using this in
production for our Algolia static websites and it has worked very well for
the majority of our use cases.
[https://github.com/WICG/IntersectionObserver/blob/gh-pages/explainer.md](https://github.com/WICG/IntersectionObserver/blob/gh-pages/explainer.md)
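
As a sketch of the "detect when something is in view" use case the explainer describes (the `data-src` convention, the 200px margin, and the `lazyLoadImages` name are all assumptions for illustration, not part of the API):

```javascript
// Lazy-load images when they approach the viewport.
// Assumes each <img> keeps its real URL in a data-src attribute.
function lazyLoadImages(selector = "img[data-src]") {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target;
      img.src = img.dataset.src; // start the actual download
      obs.unobserve(img);        // one-shot: stop watching this image
    }
  }, { rootMargin: "200px" });   // begin loading a bit before it's visible
  document.querySelectorAll(selector).forEach(el => observer.observe(el));
  return observer;
}
```

No scroll listener, no `getBoundingClientRect` reads; the browser tells you when the element crosses into view.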

~~~
rjbwork
Why do you need to detect whether something is in the viewport?

~~~
SimeVidas
For lazy-loading stuff like images and ads. Or the technique where you have an
infinite list, and the elements that are scrolled off-screen are recycled on
the other end.

~~~
esailija
You can easily scroll fast enough that the intersection-observer sentinels are
never seen, which causes bugs. Disappointing.

------
samwillis
You can achieve smooth scrolling animations by using requestAnimationFrame and
watching the scroll position within it, bypassing the blocking nature of the
scroll event handler.
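
A minimal sketch of that pattern (the `watchScroll` and `onChange` names are made up here): poll the scroll position once per frame rather than reacting to every event.

```javascript
// Poll the scroll position inside a requestAnimationFrame loop; the
// browser never has to wait on our code before scrolling.
function watchScroll(onChange) {
  let lastY = null;
  function frame() {
    const y = window.scrollY;
    if (y !== lastY) {     // only do work when the position changed
      lastY = y;
      onChange(y);
    }
    requestAnimationFrame(frame);
  }
  requestAnimationFrame(frame);
}
```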

~~~
paulirish
`scroll` listeners are not blocking, however `wheel` ones are. If you've tried
to implement a parallax effect in JS you may have noticed the subtle
differences in timing between these two (wheel fires before the scroll
position is updated, scroll afterwards). That said, yielding immediately
within a listener (like you recommend) is indeed a best practice.

------
twiss
> Many developers are surprised to learn that simply adding an empty touch
> handler to their document can have a significant negative impact on scroll
> performance.

Maybe browsers could do some static analysis for the simple cases?

Or, would it be possible to fire all events asynchronously, and if the event
cancels scrolling, revert the scroll position to where it was when the event
happened? (And possibly mark the event handler as canceling scrolling.) That
could sometimes cause jankiness when scrolling is canceled, in exchange for
smooth scrolling in other cases.

~~~
syberspace
how's this for an idea: maybe developers could write code that isn't shitty

~~~
Piskvorrr
Why didn't anyone think of that before?! I mean, what a breakthrough - think
of the time that could be spent elsewhere, _if only everyone got everything
perfectly right_. Coming up with the idea was the hardest part, implementing
it should be easy, right?

In other words: "We won't allocate any time for testing and debugging; just
don't write any bugs into the code" - a PHB I used to know

(Do you think people write shitty code _on purpose_?)

~~~
syberspace
I don't think I said people write shitty code on purpose, I'll have to go back
and read my own comment to make sure though.

You might find that it is an even better option to do _more_ testing and
debugging as opposed to cramming yet another pointless feature into already
bloated browsers. And as a sibling to my previous comment mentioned: this is
rife with "unpredictability and interoperability issues".

Maybe create a `static js analysis` browser extension for developers to aid
them in their endeavours to stop writing shitty code, but don't force that
down everybody's throat.

~~~
connor4312
I'd hardly call this feature pointless. Scroll events, with cancellation
support, have been a thing for well over a decade. This solves a very real
problem of having to execute single-threaded application code that blocks
scrolling. Particularly on mobile, where resources are constrained and smooth
scrolling is expected, this is a big problem.

Having developers hint to the browser that the event will not be cancelled
solves it very simply. Far more simply and predictably than any kind of static
analysis could.
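
The hint itself is just the options argument to `addEventListener`. The feature-detection dance below is the commonly used pattern for browsers that predate the options object (the `addPassiveListener` wrapper name is made up for this sketch):

```javascript
// Register a listener as passive where supported, falling back to the
// old boolean useCapture argument elsewhere.
function addPassiveListener(target, type, handler) {
  let supportsPassive = false;
  try {
    // Browsers that understand the options object will read .passive,
    // triggering this getter; old browsers never touch it.
    const opts = Object.defineProperty({}, "passive", {
      get() { supportsPassive = true; return true; }
    });
    target.addEventListener("test", null, opts);
    target.removeEventListener("test", null, opts);
  } catch (e) { /* options object unsupported */ }
  target.addEventListener(type, handler,
    supportsPassive ? { passive: true } : false);
  return supportsPassive;
}
```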

~~~
syberspace
You seem to be misunderstanding my point. I'm definitely all for passive
listeners. But I am very much against putting more workarounds for shitty
developers directly into the browser.

~~~
onion2k
The change we're discussing here is something that the developer will need to
explicitly add to their code:

 _By marking a touch or wheel listener as passive, the developer is promising
the handler won't call preventDefault to disable scrolling._

Shitty developers who are lazy aren't likely to add that to their code if it
'works' without it. They are shitty after all. They won't care about a janky
scroll on their webpage. Heck, they probably won't even test different ways of
scrolling. Therefore it's fair to think this change is for _awesome_
developers who test things and find that there's a problem, and need a way to
fix it.

~~~
syberspace
yes, passive listeners are great. What I don't want in my browser is a
static-analysis tool, as the first comment I replied to suggested

~~~
Piskvorrr
Well...that's a bummer, as modern JS virtual machines used in browsers all use
some sort of predictive code path model, and such static analysis is an
integral part of this.

------
yoshuaw
The Intercept is a fun example of where passive listeners would help. If you
check out this article on mobile, you'll see the header jump around; passive
listeners would fix that:
[https://theintercept.com/2017/08/15/fearful-villagers-see-the-u-s-using-afghanistan-as-a-playground-for-their-weapons/](https://theintercept.com/2017/08/15/fearful-villagers-see-the-u-s-using-afghanistan-as-a-playground-for-their-weapons/)

~~~
whipoodle
Hmm, I couldn't get that to happen. Seems fine to me.

~~~
yoshuaw
Hah, maybe you have a better device than I do. If you toggle the "verbose"
log level on in the Chrome console you'll see a warning being emitted
specifically about the scroll listener not being passive.

------
iainmerrick
It seems like the root of this entire problem is that Javascript is single-
threaded, but current browsers want to be multithreaded (and specifically to
do rendering and scrolling in a separate thread).

Does JS need to be single-threaded? I wonder if browsers could speculatively
run JS event handlers in multiple threads, then block and transfer to a
central "main" thread iff they mutate any global state.

~~~
robocat
The issue is that the browser is waiting for a return value from a called
event function. The browser changes its behaviour depending upon the return
value. Single-threading versus multithreading is irrelevant.

~~~
iainmerrick
It's not irrelevant, it's right at the heart of the problem!

Modern browsers use separate threads for Javascript and rendering. But if you
have a (non-passive) scroll event, the rendering thread has to block on the
Javascript thread, and that's why scrolling is janky.

------
weeksie
Shouldn't calling `preventDefault` on an event marked passive throw an
exception? It seems sloppy not to but maybe I'm missing something.

~~~
ChrisSD
It should be logged as a warning but I don't think it should throw. Throwing
obviously stops execution, which is unlikely to be what the programmer
intended. JavaScript is, for better or worse, a permissive language. It
doesn't like to break code when it doesn't have to.
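
That is indeed what the explainer specifies: inside a passive listener, `preventDefault()` is a no-op and the browser only logs a console warning. A sketch (the `attachPassiveWheel` helper name is made up):

```javascript
// Inside a passive listener, preventDefault() is silently ignored
// and defaultPrevented stays false; the browser logs a warning.
function attachPassiveWheel(target) {
  target.addEventListener("wheel", (e) => {
    e.preventDefault();                // no-op: listener is passive
    console.log(e.defaultPrevented);   // stays false
  }, { passive: true });
}
```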

~~~
gpvos
It's truly the Visual Basic of our times. On Error Resume Next...

------
josephwegner
> For instance, in Chrome for Android 80% of the touch events that block
> scrolling never actually prevent it

I'm curious how they get at this data. I haven't fully thought it through, but
something about this raises the "creepy flag" in my head.

Does this data come through the "Automatically send usage statistics and crash
reports to Google" setting? Google's help page [1] regarding that doesn't seem
to indicate it's recording information about websites' JavaScript.

[1]:
[https://support.google.com/a/answer/151135?hl=en](https://support.google.com/a/answer/151135?hl=en)

EDIT: I'd also be curious if they have done any analysis to see if the
mechanism that gathers this data is related to the performance implications of
touchEvent listeners.

~~~
twiss
They measure lots of client-side feature usage, in part so that they know when
they can deprecate something. Here's a list and short explanation [0].

Firefox has something similar [1] as part of Telemetry, although they measure
fewer things [2].

[0]:
[https://chromium.googlesource.com/chromium/blink/+/master/Source/core/frame/UseCounter.h#47](https://chromium.googlesource.com/chromium/blink/+/master/Source/core/frame/UseCounter.h#47)

[1]:
[http://gecko.readthedocs.io/en/latest/toolkit/components/telemetry/telemetry/collection/use-counters.html](http://gecko.readthedocs.io/en/latest/toolkit/components/telemetry/telemetry/collection/use-counters.html)

[2]:
[https://dxr.mozilla.org/mozilla-beta/source/obj-x86_64-pc-linux-gnu/dom/base/UseCounterList.h](https://dxr.mozilla.org/mozilla-beta/source/obj-x86_64-pc-linux-gnu/dom/base/UseCounterList.h)

------
TekMol
Do I understand it correctly, to cause a problem, a page needs to:

1) Add a touch handler to the document element

2) Do computations even when there is no user interaction

So when the user wants to scroll, the touch handler cannot be run because some
other javascript is already running. So the browser waits until the other
javascript is finished, executes the touch handler and only then scrolls.

Is that correct?

If so, I am not in much danger. Because I usually do not do a lot of
computation without user interaction. And I also do not put a touch handler on
the document element.

~~~
ensiferum
The problem is that the browser's compositor, which has its own display
thread (it displays pre-rasterized tiles), must post the scroll event to the
thread running the JavaScript engine and then wait for the response before it
can show the result of the scroll. With this flag the compositor can respond
to the scroll request immediately and display already-rasterized tiles
without waiting for the JS thread.

------
solarkraft
I'm thinking that maybe event listeners should be passive by default.

~~~
fiatjaf
Yes, but that would break the internet.

~~~
Fifer82
I know this is a silly question and has probably been argued over the years.

But can't, and shouldn't, this be solved by some mechanism? Like some
umbrella "native-web-1" tag, meaning that any project without that tag
renders with the 2020-and-below behaviour whilst anything with it gets the
brand new shiny implementation.

It would also help prevent the relentless "wow, I really like that 2024
feature, let's make a shit version of it today and another seven over the
years, so that everyone is stuck with a dependency which doesn't even follow
the specification" cycle (promises etc.)

~~~
fiatjaf
That's a good question. That <!DOCTYPE> thing is such a mechanism, isn't
it? (Or it was supposed to be.)

Perhaps the problem is that shipping, for example, two different JavaScript
engines would be too costly for browsers?

I want to know.

~~~
twiss
The WHATWG's philosophy for the HTML standard is never to stray too far from
what browsers currently do, mainly to make sure that the browsers can keep up:
[https://whatwg.org/faq?#living-standard](https://whatwg.org/faq?#living-standard)

The browsers seem to like that, so it's probably unlikely that we'll get a
wildly different HTML6 anytime soon, if ever.

------
ezequiel-garzon
Is WICG a new kind of WHATWG? Does anybody know the difference between them?

~~~
Ajedi32
The WICG (Web Platform Incubator Community Group) is a W3C community group.
Here's their charter:
[https://wicg.github.io/admin/charter.html](https://wicg.github.io/admin/charter.html)

------
totony
How does having `function(){ setTimeout(handler, 1) }` as the listener compare to this?

~~~
dkh99
That function won't block the rendering operation. Having a lot of
timeouts will potentially slow the JS down in general (indirectly slowing
the page down or costing battery, CPU, etc.), but it will not cause the
user-action jank that the article is talking about.
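
The asked-about pattern, as a sketch (the `deferListener` wrapper name is made up): the listener itself returns immediately and the real work runs in a later task.

```javascript
// Wrap a handler so the event listener returns right away; the real
// work runs in a later task, after the browser has already scrolled.
function deferListener(handler) {
  return function (event) {
    setTimeout(() => handler(event), 0); // yield now, work later
  };
}
```

Note that the browser still has to wait for the wrapper to return before it knows scrolling wasn't cancelled; the wrapper just returns quickly. The passive flag removes even that wait.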

------
rvanmil
And while they're at it, please remove all APIs which allow for
scrolljacking.

