Hacker News

I have been on the receiving end of browser bugs, developing cutting-edge JavaScript applications.

It's VERY useful to be able to use the UA string to do something like:

if (Chrome version X) then { do this crazy workaround because Chrome is broken }
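A minimal sketch of what that gate might look like. The buggy version number and the `applyCrazyWorkaround` name are made up for illustration; the only real API used is `navigator.userAgent`:

```javascript
// Hypothetical sketch: gate a workaround on the Chrome major version
// parsed out of the UA string. Version 80 is an invented example.
function parseChromeMajor(ua) {
  const m = /Chrome\/(\d+)/.exec(ua);
  return m ? Number(m[1]) : null; // null: not Chrome, or unparseable
}

function needsWorkaround(ua, buggyMajor) {
  // Pin to the exact broken version so the hack stops applying
  // automatically once users move past it.
  return parseChromeMajor(ua) === buggyMajor;
}

// In a browser you'd call:
//   if (needsWorkaround(navigator.userAgent, 80)) { applyCrazyWorkaround(); }
```

Pinning to one exact version is the conservative choice: it avoids the "workaround applied forever" failure mode discussed below, at the cost of missing users stuck on other broken builds.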

No. People don't even necessarily update to the latest version. The "Samsung Browser" was notorious for seemingly always being some old fork of Chrome or Chromium that would never update.

That approach is fragile, because it won't stop applying the crazy workaround after Chrome version Y fixes the problem. You can't know which version it's going to be, and there's a chance you won't be maintaining that code anymore then.

But if you can't sniff the browser version, you might still have to apply the crazy workaround forever, because you have no way of knowing when to stop. It cuts both ways.

The correct approach is to use feature detection. There are libraries available for it, such as Modernizr [1].

[1] https://modernizr.com/docs/#what-is-feature-detection
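Hand-rolled feature detection, in the spirit of what Modernizr automates, is just probing for the capability before relying on it. A sketch (the `env` parameter stands in for the browser's global object so the checks are easy to exercise; the feature choices are illustrative):

```javascript
// Minimal feature detection. In a browser, pass `window` as `env`.
function supportsIntersectionObserver(env) {
  return typeof env.IntersectionObserver === "function";
}

function supportsLocalStorage(env) {
  // Presence isn't enough: Safari's private mode used to throw on
  // write, so probe with a real set/remove.
  try {
    env.localStorage.setItem("__probe__", "1");
    env.localStorage.removeItem("__probe__");
    return true;
  } catch (e) {
    return false;
  }
}

// Usage sketch in a browser:
//   if (!supportsIntersectionObserver(window)) { loadPolyfill(); }
```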

You can't always feature-detect bugs. I'm not sure why this entire thread is filled with "use feature detection" when it's been historically clear that it isn't sufficient.

Part of the proposed deprecation of the useragent is a JS API to expose the information.
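For reference, the proposed JS surface is `navigator.userAgentData`: a low-entropy `brands` list (entries shaped like `{ brand, version }`), a `mobile` flag, and an async `getHighEntropyValues()` call for the rest. A sketch that takes the object as a parameter so it can be exercised outside a browser:

```javascript
// Sketch against the proposed navigator.userAgentData shape.
// The brand names in the usage comment are illustrative.
function findBrandVersion(uaData, brandName) {
  if (!uaData || !Array.isArray(uaData.brands)) return null; // API not exposed
  const entry = uaData.brands.find((b) => b.brand === brandName);
  return entry ? entry.version : null;
}

// In a browser:
//   const v = findBrandVersion(navigator.userAgentData, "Google Chrome");
//   const { model } = await navigator.userAgentData.getHighEntropyValues(["model"]);
```

Note the `null` branch: unlike the UA string, the spec lets browsers decline to expose this at all, so callers have to handle absence.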

Which is silly since the information that's needed to work around version specific bugs is ($vendor, $version) so people will just end up depending on that and it will start the cycle again.

I appreciate the idealism of "if your standards-compliant site doesn't work in Chrome 142 then it's Google's problem" but I can't just throw up my hands and let my site be broken.

> so people will just end up depending on that and it will start the cycle again.

The User Hints spec specifically says that user agents aren't required to provide them in all circumstances, and that they shouldn't be provided in some.

> User agents ought to exercise judgement before granting access to this information, and MAY impose restrictions above and beyond the secure transport and delegation requirements noted above.

The message from Google is clear: Whilst we understand you want to work around bugs, and this is the only way to do it, you really should be second-guessing whether you need to touch this API at all.

Great, now my server-side rendering code needs to inject JS into the client to feed back agent capabilities.

Or check the proposed headers that will also get sent: Sec-CH-Arch, Sec-CH-Model, Sec-CH-Platform, Sec-CH-UA, Sec-CH-Mobile.
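The server opts in via an `Accept-CH` response header, and hint values that are booleans arrive in structured-header form (`?1` / `?0`). A sketch using the draft header names from this comment (later drafts renamed several of them, e.g. `Sec-CH-Mobile` became `Sec-CH-UA-Mobile`, so treat the strings as illustrative):

```javascript
// Parse a structured-header boolean as sent by client hints.
function parseBoolHint(value) {
  if (value === "?1") return true;
  if (value === "?0") return false;
  return null; // hint absent, or the browser declined to send it
}

// In a Node HTTP handler you might write (handler shape is hypothetical):
//   res.setHeader("Accept-CH", "Sec-CH-Mobile, Sec-CH-Platform"); // opt in
//   const isMobile = parseBoolHint(req.headers["sec-ch-mobile"]);
```

The `null` case matters: hints are opt-in and the browser may withhold them, so server code needs a default path.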

Sec-CH-Mobile? Please tell me this means "user agent changes position, refresh positional data often" (checks the spec)

  user agent prefers a "mobile" user experience.
The spec is a tautology using air-quotes :/

That makes sense to me. Different websites will have different expectations of what a mobile experience needs. It may just be different style sheets, or the dev may need to account for the activity of their particular site, such as reducing background connections to preserve battery life, etc.

Surprisingly, I do know what >> "mobile" experience << means in this context and can therefore make sense of this, too. I am just making fun of it by pretending I am reading a technical document that is supposed to specify how to build a sane and reliable system ;-)

A JS API that tells you which bugs exist in the browser?

> Where feature detection fails developers, UA Client Hints are the right path forward.

UA Client Hints will provide: brand, major version, full version, platform brand and version, platform architecture, model, and whether it should be treated as a mobile platform.

If you were parsing the User-Agent to guess whether the browser was a particular version with a particular bug, now you don't have to parse a string and probably get it wrong.

> UA Client Hints will provide: brand, major version, ...

It seems you are using "Mozilla 5.0". Your browser is really out of date!

Please consider updating to rand("Googlebot/2.1", "Version/7", "XUL/97", "HTML/5").

There's no guarantee the user's browser will just hand over that information with the new spec:

> User agents ought to exercise judgement before granting access to this information, and MAY impose restrictions above and beyond the secure transport and delegation requirements noted above.

You can't always detect browser version, either.

Hard problems are hard, but in my experience, and per the approach recommended by browser implementers, feature detection works better.

The fundamental issue with this is that it tells you what's there, when what you need to know is what's broken (not even necessarily what's missing).

But this is only accessible on the client side. What about all the other backcompat code on the server side (like the SameSite issue mentioned elsewhere in the thread)? Or server-side rendering? Are we not supposed to care about progressive enhancement anymore?

I build HTML5 games and I need to use very specific hacks for certain browsers, e.g. use a different offscreen canvas implementation for Safari due to performance. I can't use feature detection since it's not about a missing feature.

You can benchmark the alternatives for a couple of seconds and choose the faster one. Call it performance detection.
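A sketch of that idea: run each candidate for a fixed time budget and keep whichever completes the most iterations. The candidate functions here are plain stand-ins; in an HTML5 game they'd be the real OffscreenCanvas path vs. the fallback path, and you'd pick a budget (the 50 ms default is arbitrary) that doesn't hurt startup:

```javascript
// "Performance detection": time each candidate briefly, keep the fastest.
function pickFaster(candidates, budgetMs = 50) {
  let best = null;
  let bestOps = -1;
  for (const [name, fn] of Object.entries(candidates)) {
    const deadline = Date.now() + budgetMs;
    let ops = 0;
    while (Date.now() < deadline) {
      fn();
      ops++;
    }
    if (ops > bestOps) {
      bestOps = ops;
      best = name;
    }
  }
  return best; // name of the fastest candidate
}

// Usage sketch (function names invented):
//   const winner = pickFaster({ offscreen: drawViaOffscreen, fallback: drawViaDom });
```

One caveat: JIT warm-up and GC pauses add noise, so a tiny budget can occasionally pick the wrong winner; a longer budget (the parent's "2 seconds") trades startup time for confidence.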

For that to work, browsers would also have to accurately advertise their own bugs/quirks, which might be a difficult ask.

Wouldn't those quirks be sufficient to fingerprint in the same way?

What if the DOM looks perfectly normal, but doesn't render correctly?

In the short-term this might be annoying, but in the longer term this is going to force browsers to adhere more closely to standards. When sites are less able to write different code for different browsers, the onus to fix inconsistencies will transfer from site-maintainers to browser-maintainers.

In 5 years you might look back and realize you no longer write browser-specific code any more.

> In 5 years you might look back and realize you no longer write browser-specific code any more.

We said the same thing when IE started losing market share a decade ago.

The problem is, everyone who makes a browser thinks they have the best ideas in the world and implements them, but their users never blame the browser.

If someone is trying to access your site and it breaks, do you really think they're going to say, "dang, I should really tell Chrome to fix their incompatibility".

No, they will always assume the error lies with the site owner.

In such a world, it is not standards that win. It is the predominant browser that wins.

Are you old enough to remember the "best viewed with Netscape" badges that were everywhere in the 90s?

Haha, brings back the old days. I had my sites plastered with "best viewed with your eyes" badges, but that never took hold :)

I have been involved with Web development in some form since the late '90s. It's a nice utopia that will never happen.

Unless in 5 years we have only Chrome left, then surely.

+1. The number of rendering invalidation bugs I had to work around in mobile Safari is crazy.

`if (affectedBrowserUA) { elem.style.transform = 'translate3d(0,0,0)'; elem.style.transform = ''; }`

Edit: I guess you can achieve the same thing through UA client hints. Just need to add a new JS file that reads the request headers and maps them into JS globals.
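That "map them into JS globals" idea could look like the server reflecting the hint request headers back as a tiny generated script. Everything here is illustrative (the global name, the endpoint, and the draft header names), not from the spec:

```javascript
// Server-side sketch: turn client-hint request headers into a script
// that sets a global the page's JS can branch on.
function hintsScript(headers) {
  const hints = {
    mobile: headers["sec-ch-mobile"] ?? null,
    platform: headers["sec-ch-platform"] ?? null,
  };
  return "window.__uaHints = " + JSON.stringify(hints) + ";";
}

// Served as e.g. <script src="/ua-hints.js"></script>, so client code can
// read window.__uaHints without parsing anything itself.
```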

There's a proposed API for JavaScript to get the same information.

Oh, yup, I see that now. That was the next point after the headers one in the spec.

I'd argue that if you're doing this you can't have tested your JavaScript on many platforms. This breaks very quickly in my experience. Feature detection[1] is a much better strategy. That can get complicated too, but often there are third-party polyfills that can do that for you.

[1] https://developer.mozilla.org/en-US/docs/Learn/Tools_and_tes...
