Speculative prerendering in the browser (web.dev)
38 points by feross on Sept 25, 2021 | 10 comments



Enabling any kind of "follow links on the page before I click on them" feature necessarily executes JS on those pages just to be able to prerender them. I don't trust these features, and I suggest to everyone I know that they turn them off in their browser config.
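(Pages that want to cooperate can at least detect the prerender and hold off on heavy work — a rough sketch, assuming the Prerendering API's document.prerendering flag and its 'prerenderingchange' event; support varies, so it falls back to running immediately:)

    // Rough sketch: defer heavy work until a prerendered page is actually shown.
    // `document.prerendering` / 'prerenderingchange' are from the Prerendering API.
    function onPageActuallyShown(callback: () => void): void {
      if ((document as any).prerendering) {
        document.addEventListener('prerenderingchange', () => callback(), { once: true });
      } else {
        callback();
      }
    }

    onPageActuallyShown(() => {
      // e.g. only fire analytics for a real view, not a speculative one
      console.log('page is visible to an actual user');
    });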


Kind of annoying how Google commandeered generic-looking domains like web.dev and html5rocks.com.


So much this. And when they say things like “your JavaScript needs to avoid this and that to perform well” just because it's a known performance issue in Chrome that no-one else has, and following their advice actually hurts performance in other browser engines…


It’s basically astroturfing… domainturfing?

But I learned Google owns the .dev TLD, so I don't really trust any .dev site now, even though third parties can register them.


better them than w3schools.com


Anyone here use prefetch or these libraries? Seems like a neat way to get top speeds.

> There are also libraries such as quicklink and guess.js, that use heuristics to determine which resources should be prefetched at runtime. With these, developers don't need to guess what should be prefetched. The libraries take the decision based on available data.
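(For context, wiring up quicklink looks roughly like this — the `el` and `ignores` options below are illustrative, going from memory of the project's README:)

    // Rough sketch of quicklink: prefetch in-viewport links while the browser is idle.
    import { listen } from 'quicklink';

    window.addEventListener('load', () => {
      listen({
        // scope prefetching to the main content area (illustrative selector)
        el: document.querySelector('main') as HTMLElement,
        // skip links we never want speculatively fetched (illustrative filter)
        ignores: [(uri: string) => uri.includes('/checkout')],
      });
    });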


I led frontend development at a medium-sized US eCommerce shop for a few years. We prefetched JS bundles for likely page hits (e.g. grabbing category bundles when users land on the homepage).
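Not the actual code, but the shape of it was roughly this — once the homepage goes idle, hint the likely-next bundles to the browser (bundle paths here are made up):

    // Sketch: prefetch JS bundles for the pages users most often navigate to next.
    const likelyBundles = [
      '/static/js/category.chunk.js',      // hypothetical bundle paths
      '/static/js/product-list.chunk.js',
    ];

    function prefetchScript(url: string): void {
      const link = document.createElement('link');
      link.rel = 'prefetch';
      link.as = 'script';
      link.href = url;
      document.head.appendChild(link);
    }

    // keep the hints off the critical path
    const schedule = 'requestIdleCallback' in window
      ? window.requestIdleCallback.bind(window)
      : (cb: () => void) => window.setTimeout(cb, 2000);

    schedule(() => likelyBundles.forEach(prefetchScript));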

It improved performance a decent bit, though the vast majority (>85%) of the site's users are on iOS, where Safari doesn't support prefetch hints, so RUM didn't show a big overall effect.* Hopefully the Safari team brings that to a stable release soon!

*also why I dislike the focus from web.dev and family on CrUX. Fully 2 of the 4 Web Vitals are unmeasurable outside of Chromium[0], and CrUX only includes reporting from Chrome users anyway... I really wish they had delayed the SEO impact of this stuff until there were usable analogs in competing browsers. The fact that this caveat isn't surfaced on Lighthouse/PSI reports doesn't make me happy either.

[0] https://github.com/GoogleChrome/web-vitals#browser-support
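For anyone curious what the field-measurement side looks like, it's roughly this with the web-vitals package (v2-style getXXX API; the reporting endpoint is a placeholder). Per the repo's support table, some of these callbacks simply never fire outside Chromium, which is exactly the gap I'm complaining about:

    // Sketch: report field Web Vitals with the web-vitals package (v2-style API).
    import { getCLS, getFID, getLCP } from 'web-vitals';

    function sendToAnalytics(metric: { name: string; value: number; id: string }): void {
      // '/rum' is a placeholder for whatever collects your RUM data
      navigator.sendBeacon('/rum', JSON.stringify(metric));
    }

    // In non-Chromium browsers some of these callbacks never fire.
    getCLS(sendToAnalytics);
    getFID(sendToAnalytics);
    getLCP(sendToAnalytics);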


Using HTML descriptors (essentially commands) is not really "speculative". I was expecting it to leverage some ML model to render pages faster by predicting, with high enough certainty, something about what to render, such as the structure or size of the contents.


Well the speculation is on whether the user will actually click on the link. You're declaring you think it's likely; the browser is free to choose what to do.
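i.e. the hint is purely advisory. A quick sketch of injecting one (the URL is illustrative) — the browser may act on it or silently ignore it:

    // Advisory hint only: the browser may prerender this URL, or ignore the hint entirely.
    const hint = document.createElement('link');
    hint.rel = 'prerender';          // 'prefetch' is the weaker, fetch-only cousin
    hint.href = '/likely-next-page'; // illustrative URL
    document.head.appendChild(hint);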


As an aside, it'd be interesting to see what kind of requests that domain gets from misconfigured dev environments; I imagine web.dev is a pretty popular example.com.



