
Triaging issues, sure. Progressive enhancement? If you’re parsing User-Agent to implement that, you’re doing it wrong. (Feature detection is the correct approach, when necessary.)
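A minimal sketch of the difference, in TypeScript — the helper names and mock objects here are illustrative, not from any particular library:

```typescript
// Feature detection: ask the environment what it actually supports.
// In a browser you would call hasIndexedDB(window); mocks work for testing.
function hasIndexedDB(globalLike: object): boolean {
  return "indexedDB" in globalLike;
}

// UA sniffing (the wrong approach for progressive enhancement):
// a brittle allow-list that silently breaks for browsers you didn't anticipate.
function guessIndexedDBFromUA(ua: string): boolean {
  return /Chrome|Firefox|Safari/.test(ua); // incomplete by construction
}
```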

There are cases like IndexedDB bugs in certain versions of Safari that are very hard to feature detect...
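For bugs like that, the feature is present but broken, so `'indexedDB' in window` tells you nothing and UA parsing is the only signal left. A hedged sketch — the "buggy" major version here is hypothetical, not a claim about any specific Safari release:

```typescript
// Safari reports its version as "Version/X.Y ... Safari/...".
function safariMajorVersion(ua: string): number | null {
  const m = ua.match(/Version\/(\d+)[.\d]* .*Safari\//);
  return m ? parseInt(m[1], 10) : null;
}

// The bug can't be feature-detected, so fall back to a version check.
// Version 14 stands in for whatever range is known to be broken.
function shouldAvoidIndexedDB(ua: string): boolean {
  return safariMajorVersion(ua) === 14; // hypothetical known-bad version
}
```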

You can’t feature detect during SSR, so making an educated guess from the UA and falling back gracefully (when possible; often it just isn’t, as with ES6 syntax, which is a parse error in old browsers rather than something you can catch at runtime) is important to get the initial page load right by default.
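Server-side, the only signal available before the first byte is the request headers, so the guess looks something like this sketch (the `modern`/`legacy` bundle split is a made-up example, and the regex is deliberately conservative):

```typescript
// SSR sketch: pick a JS bundle from the User-Agent header as an educated guess.
// The legacy (ES5) bundle is the safe default when unsure, because ES6 syntax
// is a hard parse error in old engines and can't fail gracefully at runtime.
function pickBundle(userAgent: string | undefined): "modern" | "legacy" {
  if (!userAgent) return "legacy"; // unknown client: safest default
  // IE identifies itself via "MSIE" or the Trident engine token.
  if (/MSIE |Trident\//.test(userAgent)) return "legacy";
  return "modern"; // educated guess; the client can self-correct later
}
```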

Because so many people abused the User-Agent header for things that could have fallen back gracefully (or just drew invalid assumptions from it), it became an unreliable indicator for the times you actually want to send shimmed content on first load.

Here's an official Google page telling you to sniff UAs or their new feature will break everything: https://www.chromium.org/updates/same-site/incompatible-clie...

Edit: OK, this isn't progressive enhancement. But it's still a major problem you have to sniff the User-Agent for. I don't want to sniff UAs either but when Chrome is wont to change how fundamental parts of the web work like cookies it's sometimes necessary. I know UA hints will still allow you to do this, but requiring a second request is going to make things like redirect pages difficult to implement.
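A sketch of the kind of check that Chromium page describes: some older clients reject or mishandle cookies marked `SameSite=None`, so the server has to omit the attribute for them. These regexes are simplified illustrations of the documented cases (old Chrome, iOS 12 / macOS 10.14 Safari), not the page's exact code, and other incompatible clients (e.g. old UC Browser) are omitted:

```typescript
// Returns true if the client is known to mishandle SameSite=None,
// in which case the server should omit the SameSite attribute entirely.
function isSameSiteNoneIncompatible(ua: string): boolean {
  // Chrome/Chromium 51-66 reject cookies with SameSite=None outright.
  const chrome = ua.match(/Chrom(?:e|ium)\/(\d+)/);
  if (chrome) {
    const major = parseInt(chrome[1], 10);
    if (major >= 51 && major <= 66) return true;
  }
  // Safari on iOS 12 and macOS 10.14 treats SameSite=None as Strict.
  if (/\(iP.+; CPU .*OS 12[_\d]* like Mac OS X\)/.test(ua)) return true;
  if (/Macintosh;.*Mac OS X 10_14[_\d]*.*Version\/.*Safari/.test(ua)) return true;
  return false;
}
```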

SameSite=None isn’t progressive enhancement.

You're right, I should have replied elsewhere. By the way, if anyone knows what this _is_ called I would be interested to know. As far as I can see it's basically feature detection with no other way of detecting it besides the UA.
