
The specification for native image lazy-loading is merged into HTML standard - saranshk
https://github.com/whatwg/html/pull/3752#issuecomment-585202516
======
buro9
If you are going to use this, keep in mind the advice given... set the height
and width attributes of the image to prevent the page jumping around for
lazily loaded images.

Which means, if you accept user-generated content or images from any CMS or
other source, you really should be obtaining the image dimensions and using
them in the HTML.
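
A minimal sketch of what that looks like (the filename and dimensions here are
made up for illustration):

```html
<!-- Explicit width/height let the browser reserve the right amount of space
     before the image downloads, so the page doesn't jump around.
     loading="lazy" is the native attribute from the newly merged spec. -->
<img src="/uploads/photo.jpg" alt="User-uploaded photo"
     width="1200" height="800" loading="lazy">
```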

~~~
martin_a
How/Why would I do this with responsive/fluid layouts and dynamically changing
image sizes?

~~~
martin-adams
You would attempt to set some constraints on the image dimensions at the
different breakpoints. For example, you can use percentages if you want a
fluid amount. It's really to tell the browser how much space to allocate to
the image before it's loaded. This is an old problem, and pages that jump
around would happen on slower connections.

~~~
martin_a
Ok, my image has 100% width for screens up to 800 px wide; what height value
do I have to set? You simply don't know this information at the time the DOM
is built.

The idea of setting fixed dimensions feels like it comes from a time when we
did designs with tables and "optimized" sites for a 1024 x 768 pixel screen
size.

That was the time when you could tell that this image should be displayed 200
x 200 px in size and nothing else. And it fit. One way or another.

~~~
martin-adams
I would approach this thinking in terms of a known aspect ratio of the images
and use a technique like this:

[https://www.w3schools.com/howto/tryit.asp?filename=tryhow_css_aspect_ratio_43](https://www.w3schools.com/howto/tryit.asp?filename=tryhow_css_aspect_ratio_43)

Make no mistake, this is a silly problem to have to hack a solution for. So I
would welcome lazy-loaded images that have space allocated, with responsive
sizing, before they are rendered.
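
The gist of that technique, sketched here for a hypothetical 4:3 image (the
filename is invented):

```html
<!-- The wrapper's padding-top is height/width = 3/4 = 75%, so the box keeps
     a 4:3 aspect ratio at any fluid width; the image is then absolutely
     positioned to fill it. -->
<div style="position: relative; width: 100%; padding-top: 75%;">
  <img src="photo.jpg" alt="" loading="lazy"
       style="position: absolute; top: 0; left: 0; width: 100%; height: 100%;">
</div>
```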

------
amanzi
Lazy loading can be frustrating when the connection between the website and
your browser is slow. I come across this from time to time and the result is
that every time you scroll down, you have to wait ages for the next set of
images to load.

~~~
compumike
This is mostly solvable. I recently deployed lazy loading for schematic images
on
[https://ultimateelectronicsbook.com/](https://ultimateelectronicsbook.com/)
and configured IntersectionObserver (with polyfill) to have a rootMargin equal
to 3*window.innerHeight (see
[https://developer.mozilla.org/en-US/docs/Web/API/Intersection_Observer_API](https://developer.mozilla.org/en-US/docs/Web/API/Intersection_Observer_API)).
Under most reading and even fast-scrolling conditions, the image will be
loaded well before it is scrolled into view.
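
A rough sketch of that setup (the `img[data-src]` selector and data-src
convention are hypothetical, and the polyfill wiring is omitted):

```javascript
// Build the rootMargin string from the viewport height: a positive
// top/bottom margin enlarges the intersection root, so images start
// loading about three screen-heights before they scroll into view.
function preloadMargin(viewportHeight, multiplier = 3) {
  return `${Math.round(viewportHeight * multiplier)}px 0px`;
}

// Browser-only wiring (guarded so the helper above stays usable elsewhere).
if (typeof IntersectionObserver !== "undefined") {
  const observer = new IntersectionObserver((entries, obs) => {
    for (const entry of entries) {
      if (entry.isIntersecting) {
        const img = entry.target;
        img.src = img.dataset.src; // swap the placeholder for the real source
        obs.unobserve(img);        // each image only needs to load once
      }
    }
  }, { rootMargin: preloadMargin(window.innerHeight) });

  document.querySelectorAll("img[data-src]").forEach(img => observer.observe(img));
}
```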

But I will agree with the sentiment -- I find it incredibly annoying on pages
that do lazy loading and don't implement this enlarged intersection check to
make it appear seamless.

~~~
greggman3
Anyone know any books that go the opposite order? Start with interesting
working projects, add in the theory as needed to adjust the projects.

To learn photography I'd prefer starting with a camera. Not withe theory of
light or the properties of atoms that allow glass to be transparent or how
lenses are ground.

Similarly, if possible, to learn electronics i'd like to start at a higher
level and work down rather than bottom up.

~~~
rbinv
Interesting, as I'm exactly the opposite.

Are there names for these two types of teaching/learning approaches? Is one
"better" than the other?

~~~
quietbritishjim
"Bottom-up" vs. "top-down" are the general names for the two approaches,
although they're not specific to learning (they also apply to, e.g., project
planning).

Edit: Actually I'm not sure that's really right, even example-led approaches
are usually bottom up. For example, an example-led approach to programming
would start with "hello world" and work up from there, leaving a full-blown
example project (if there is one) to the end.

~~~
sippeangelo
Completely non-ironically, I prefer a middle-out approach. I like to dive in
past the "beginner" stage immediately, and go from there. If I feel like I
need to go back and learn the basics, I can do so on a need-to-know basis.

------
Waterluvian
Now that this is in the spec, we just need browsers to implement it, and then
I can make a "shaking your phone loads things faster" plugin. People will
notice a slight difference just often enough, in cases where eager loading
would have worked out better, that they'll waggle their phones all the time.
It's going to look hilarious.

~~~
adventured
A fitting tribute to shaking your mouse in olden times to speed up page
loading.

~~~
nojvek
Yeah, back in the day the UI was very single-threaded. Moving the mouse meant
more processing time for the UI, since it had priority, and thus a faster
experience.

~~~
buboard
[https://www.extremetech.com/computing/294907-why-moving-the-mouse-in-windows-95-made-the-os-faster](https://www.extremetech.com/computing/294907-why-moving-the-mouse-in-windows-95-made-the-os-faster)

------
Wowfunhappy
This is exciting, because the sooner developers standardize on single method
of lazy-loading, the sooner I'll be able to disable it with a userscript.

~~~
stkdump
Except expect polyfills!

~~~
naniwaduni
Lazy loading seems like something that doesn't need a polyfill--the fallback
behaviour is that it just works correctly...

(This is not to say not to _expect_ the polyfills, though)

~~~
stkdump
If lazy load isn't needed, then there is no need to apply it in the first
place.

~~~
Wowfunhappy
But a lazy load is never truly _needed_, right? It's a way to save bandwidth.

~~~
stkdump
Well, if saving bandwidth is a valid concern, then why isn't it also true for
the polyfill?

------
goblin89
I wish browser vendors, instead of native lazy loading for images, focused on
a universal mechanism to lazily render arbitrary elements, including custom
ones.

Browsers could profile how long rendering a particular type of element takes
on a given website, and optimize render triggers to provide a seamless
experience while conserving bandwidth.

Currently, lazily rendering custom elements requires a fair chunk of
IntersectionObserver boilerplate code, and beyond that, any adaptability to
the user's connection seems too complex to even consider.

~~~
andy_ppp
I think you are conflating React elements (a blob of JS) and HTML elements
(er, the DOM?) in this comment, to the point where I find it very difficult to
reason about what an answer would look like.

If you replace your use of element with "blob of javascript" how would the
browser separate out the "element" blobs that are "slow" from the other
Javascript?

~~~
goblin89
I mean purely custom elements, no React at all (which should not be of concern
to browser vendors anyway). The metrics that would matter are similar to what
we see in the profilers in browser developer tools—ones that measure the
approximate time from when script execution starts until the document is
rendered, or others, depending on how the custom element is bundled.

This is with a degree of imprecision overall, of course, but similar
approaches I imagine could be used to profile individual element rendering
times.

Edit: Tried to clarify my idea, being away from any browser developer tools at
the moment myself.

------
toastal
[https://bugzilla.mozilla.org/show_bug.cgi?id=1542784](https://bugzilla.mozilla.org/show_bug.cgi?id=1542784)

The bug is already closed in Firefox. Hopefully it'll be here soon.

~~~
clementmas
That's great news. Thanks for the link

~~~
toastal
`dom.image-lazy-loading.enabled = true` is the about:config flag

------
martin_a
While I think lazy loading is great in itself, I wonder why we haven't come up
with something better before.

There is progressive loading embedded directly into JPEG, and if browsers
prioritized loading of assets in DOM order, we wouldn't need any other
solutions.

Or am I wrong here?

------
Kaiyou
I hate lazy-loading so much, since it never keeps up with my scrolling speed.
So I first have to scroll down slowly and wait for everything to load, then go
back to the beginning and start for real. Tried some methods to stop lazy-
loading but none worked reliably.

~~~
chrismorgan
Me too, especially when images fade from blurry messes to the real thing once
they're on screen, which I find very disconcerting (more disconcerting than I
think I should find it). This is exacerbated by living in Australia, where
latency to US-hosted sites is 200–400 ms.

I’m honestly hoping that everyone quickly adopts the loading=lazy attribute,
_just so I can turn it off in one place_.

------
chiefalchemist
Lazy loading is helpful, but if the img tag's sizes attribute is mucked up
then less is gained. That is, for example, the sizes attribute leads the
browser to believe it needs an image that's 100vw wide when the actual width
is 25vw. This happens quite a bit with WordPress.

Here's a great article on sizes and srcset.

[https://ericportis.com/posts/2014/srcset-sizes/](https://ericportis.com/posts/2014/srcset-sizes/)
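
To illustrate the mismatch (the filenames and breakpoint here are invented):
if the image actually renders in a quarter-width column, sizes should say so,
or the browser will fetch a needlessly large candidate.

```html
<!-- With sizes="100vw" the browser would pick the 1600w file on a 1600px
     viewport; declaring the real 25vw slot lets it pick the 400w one. -->
<img src="thumb-400.jpg"
     srcset="thumb-400.jpg 400w, thumb-800.jpg 800w, thumb-1600.jpg 1600w"
     sizes="(min-width: 800px) 25vw, 100vw"
     alt="Thumbnail in a quarter-width column">
```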

Editorial: Personally, my fear (based on history) is this will ultimately lead
to more bloat, not less. The belief in "oh not to worry, we've got lazy load"
is not a positive overall.

LL is a good thing. But it will likely increase abuse, not mitigate it.

------
stevenwliao
The first mainstream browsers with support: Chrome and _Edge_!

[https://caniuse.com/#feat=loading-lazy-attr](https://caniuse.com/#feat=loading-lazy-attr)

~~~
dangoor
Edge is based on Chromium, so it's to be expected that they'll move in
lockstep.

~~~
cbhl
Makes me nostalgic for the days when Chrome and Safari were in lockstep (ish)
because they were both based on WebKit.

Imagine if Microsoft, Apple, and Google were all sharing the same browser
engine.

~~~
zelly
It should happen. The web browser is not a particularly interesting problem to
solve. Let's make the base layer the same and distinguish ourselves at a much
higher level. It's like the JVM--would humanity really benefit from having
multiple competing implementations of the JVM?

This argument may not make sense applied to an operating system/kernel. There
are obvious benefits to having multiple competing operating systems. The
crucial difference between a kernel and a web browser is that the kernel is a
product, whereas the web (ECMA, W3C) is an international standard. So the only
functional differences allowed to exist between implementations are at a very
high level e.g. UX or privacy. The benefits of competing implementations are
from innovation, but innovation in a way that violates the standard is not
allowed, so innovation in functionality happens in the standards space. Where
does innovation matter? In performance. Who can implement the standard with
the best performance? It makes sense to have competition only up to such a
point where a winner becomes 10x better than its competition. After that point
it becomes useless to bet on the losers (save for extreme niches like lynx).
There wouldn't be enough reward to heroically save the tied-for-last-place
losers in a winner-take-all game.

It has to happen eventually. I think it would be quite sad to have flying cars
on Mars and still have people working on rendering HTML.

~~~
thayne
> This argument may not make sense applied to an operating system/kernel.

I don't follow your reasoning. If competing implementations makes sense for
operating systems, wouldn't it also make sense for browsers, which are
basically the equivalent of an operating system for web apps? Conversely, if
there should only be one implementation of a browser, wouldn't it make even
more sense for there to be only one implementation of the operating system, so
there is only one platform for native applications to target?

> whereas the web (ECMA, W3C) is an international standard. So the only
> functional differences allowed to exist between implementations are at a
> very high level e.g. UX or privacy. The benefits of competing
> implementations are from innovation, but innovation in a way that violates
> the standard is not allowed, so innovation in functionality happens in the
> standards space. Where does innovation matter?

I think you might be a little bit confused about how web standardization
happens. Browsers are very much allowed to innovate beyond what is specified
in standards. And in fact most standards are based on features that at least
one browser has already implemented. Innovation drives standards, not the
other way around.

After Internet Explorer won the last browser wars, both the winning
implementation (IE6) and the web standards stagnated, until competition (in
the form of Firefox and more importantly Chrome) came along. I don't want that
to happen again.

Maybe it will be different with Chrome as the winner, since Google uses the
browser as a platform to deploy its own web apps, but it still means the
direction of browser development is primarily decided by Google, and will meet
Google's needs, which may or may not be the needs of the internet community as
a whole.

~~~
zelly
> Maybe it will be different with Chrome as the winner, since Google uses the
> browser as a platform to deploy its own web apps, but it still means the
> direction of browser development is primarily decided by Google, and will
> meet Google's needs, which may or may not be the needs of the internet
> community as a whole.

Yes and that company already has huge voting rights on the standards
committees and is the primary benefactor of its "competition". Chrome and the
web is already one and the same.

> I think you might be a little bit confused about how web standardization
> happens. Browsers are very much allowed to innovate beyond what is specified
> in standards. And in fact most standards are based on features that at least
> one browser has already implemented. Innovation drives standards, not the
> other way around.

It mostly comes from demand from the community. Take this lazy-loading images
proposal, for example. It's only implemented by one vendor:

[https://caniuse.com/#search=lazy](https://caniuse.com/#search=lazy)

Its demand comes from the huge number of websites that lazy-load images with
their own libraries. The browser vendors did not invent this feature, and were
not the first to implement it.

> If competing implementations makes sense for operating systems, wouldn't it
> also make sense for browsers, which are basically the equivalent of an
> operating system for web apps?

Because the web is already standardized, already a solved problem. One day the
market will bring forth an ideal operating system, and we will standardize on
that. The analogous event has already happened for web browsers. Some people
are understandably in denial, still used to the old religious-warfare way tech
ecosystems worked.

------
baybal2
I remember there was a huge protest from the ad industry over that.

This is the time we live in now: web standards are set by the ad industry.

~~~
Cthulhu_
Google is the ad industry, so your point is valid. Also, the ad industry
basically pays for the majority of stuff on the internet. It makes sense that
they get a say in the matter.

~~~
woodrowbarlow
the ad industry doesn't pay for shit. they extract value from users and give
(some of) it to creators. the users (involuntarily) pay. so if the argument is
"whoever pays for it gets a say" -- we do. ads don't.

------
ksec
Status on Safari [1];

> _Rob Buis 2020-02-13 00:04:07 PST
>
> I was waiting for the spec to land before working again on this. First step
> is to fix the tests:
> [https://github.com/web-platform-tests/wpt/pull/21773](https://github.com/web-platform-tests/wpt/pull/21773)
>
> I'll incorporate them into
> [https://bugs.webkit.org/show_bug.cgi?id=200764](https://bugs.webkit.org/show_bug.cgi?id=200764),
> test a bit and hopefully put it up for review soon._

This is exciting. We should have all major browsers supporting it within the
year. Less JavaScript required.

[1] [https://bugs.webkit.org/show_bug.cgi?id=196698](https://bugs.webkit.org/show_bug.cgi?id=196698)

------
brianzelip
I detest hard coding image sizes. I only want to add `img { width: 100% }` and
that's it.

~~~
wackget
Then you're part of the problem. Failing to add width and height attributes to
images is what makes websites jump around while loading. It's infuriating.
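
For what it's worth, the two aren't mutually exclusive. A sketch, assuming a
browser that derives the intrinsic aspect ratio from the width/height
attributes (as current ones do):

```html
<!-- The attributes give the browser the intrinsic 3:2 aspect ratio up front,
     so space is reserved; the CSS still makes the image fluid, and
     height: auto preserves the ratio at any width. -->
<img src="photo.jpg" width="1200" height="800" alt=""
     style="width: 100%; height: auto;">
```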

------
est
So this means better user-tracking capabilities are now native, even with
JavaScript disabled.

You can learn what the user has read on the page before, and deliver more
relevant ads next.

~~~
manigandham
I'm an adtech veteran and I guarantee this kind of tracking is never used.
Just because you have a signal available doesn't mean it's used or even useful
for ads.

~~~
toast0
What advertiser doesn't want to know that their ad is visible?

~~~
manigandham
That's called "viewability" and isn't related to relevance. It's already
solved natively with the _IntersectionObserver_ API which came out years ago
and supports more than just an image. [1]

Lazy loading is entirely different and offers nothing new or useful for ads.

1. [https://developer.mozilla.org/en-US/docs/Web/API/Intersection_Observer_API](https://developer.mozilla.org/en-US/docs/Web/API/Intersection_Observer_API)

~~~
Kwantuum
As the first reply in this chain remarks: this is not available without
JavaScript.

~~~
manigandham
The point is that native lazy-loading is not useful for ad tracking. If
there's JS then there are other methods to check viewability. If there's no JS
then there are no ads at all.

------
coderheed
It seems like this is bound to have security holes at least for a while,
right?

~~~
thewarpaint
Can you elaborate on what makes you think so?

~~~
coderheed
Oops, I just misunderstood.

