
I'd still like a way to feature-detect without making a round-trip to the browser. That would let me embed webasm directly, rather than shipping js plus a branch that starts a second download if the feature is found... etc

I suggest the UA string be a bitmask of features. Then feature detection would stop being broken

Extra bits could be used for js-on/js-off, and is-bot/is-human
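A minimal sketch of the bitmask idea. The bit positions and feature names here are hypothetical, just to show how cheap both sides of the check would be:

```javascript
// Hypothetical feature-bit layout -- positions are illustrative only.
const FEATURE_BITS = {
  wasm: 1 << 0,  // WebAssembly support
  js:   1 << 1,  // js-on/js-off
  bot:  1 << 2,  // is-bot/is-human
};

// Browser side: build the mask once from the detected features.
function buildMask(features) {
  let mask = 0;
  for (const f of features) mask |= FEATURE_BITS[f];
  return mask;
}

// Server side: test a bit without any round-trip to the browser.
function hasFeature(mask, feature) {
  return (mask & FEATURE_BITS[feature]) !== 0;
}

const mask = buildMask(["wasm", "js"]);
console.log(hasFeature(mask, "wasm")); // true
console.log(hasFeature(mask, "bot"));  // false
```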

--

Ah, I see they're kind of doing the bitmask, but keeping a round-trip and making things complicated (though I realize the latest HTTP standards can probably remove those round-trips in the average case)

I'd still suggest the bitmask for non-sensitive information, and have everything else simply js-tested as it currently is

Maybe is-user-blind might be a nice bit too, since canvas-based websites could switch to the DOM, or whatever

Please could we also have a couple more privacy setting bits for i-accept-your-cookies and i-want-to-be-told-about-cookies-on-every-single-website-because-i-forget-what-they-are-and-really-want-to-click-through-to-your-privacy-settings

If we have those bits, then the user can make a set of choices once, for every site, and we get rid of cookie pop-ups

-- Websites could still ask if they want/need to do something that violates those choices


Or we can just assume, like reasonable adults, that websites are going to put cookies in your browser, and promote privacy-oriented tech to users rather than pretending that having every website ask for permission to enable basic functionality solves anything.

> promote privacy-oriented tech to users

Like what?

> every website ask for permission in order to enable basic functionality

I don't believe that purely functional cookies require GDPR permission - that's covered by "provide services to the user". It's the ones which provide functionality to third parties, not the user, that are the problem.


> I don't believe that purely functional cookies require GDPR permission - that's covered by "provide services to the user". It's the ones which provide functionality to third parties, not the user, that are the problem.

Ah, I didn't realize that. Well, that does sound much more reasonable.


Actually the ICO page itself presents a great example: if you go to https://ico.org.uk/for-organisations/guide-to-data-protectio... you get:

> Necessary cookies

> Necessary cookies enable core functionality such as security, network management, and accessibility. You may disable these by changing your browser settings, but this may affect how the website functions.

> Analytics cookies [toggle On/Off]

> We'd like to set Google Analytics cookies to help us to improve our website by collecting and reporting information on how you use it. The cookies collect information in a way that does not directly identify anyone. For more information on how these cookies work, please see our 'Cookies page'.

The implication is that a consent dialog would not be required if they weren't using Google Analytics or any other third-party.


That would be nice. I feel like the "cookie warnings" basically read as "this site doesn't actually need cookies to work, but we want to track you". We should just have some sort of "do not track" header that indicates we don't accept those terms, and then websites can badger us if they really need cookies, like for logins.

> I feel like the "cookie warnings" basically read as "this site doesn't actually need cookies to work, but we want to track you".

I typically read those warnings as reminders that I should open the site in a FF container.


> If we have those bits, then the user can make a set of choices once, for every site, and we get rid of cookie pop-ups

That one was tried with the DNT bit - of course users ended up en masse setting it to "do not track" by default. Sites won't accept that.


Sites need to be told to obey DNT with a legal sledgehammer. Still hoping...

The ePrivacy Regulation, which would take care of that, was supposed to be finished by the time the GDPR came into force. But Austria's pro-business government managed to delay it until there wasn't enough time before the last European elections.

> This would let me embed webasm rather than js+branch to begin a second download if feature found... etc

This is exactly what you aren't supposed to do

Either send the wasm optimistically & fall back to js on error, or send it reactively with js+branch
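The reactive js+branch pattern being described might look roughly like this; the module URLs are hypothetical:

```javascript
// Sketch of the js+branch pattern: feature-test WebAssembly first,
// then decide which implementation to download. URLs are illustrative.
function wasmSupported() {
  try {
    return typeof WebAssembly === "object" &&
           typeof WebAssembly.instantiate === "function";
  } catch (e) {
    return false;
  }
}

async function loadImplementation() {
  if (wasmSupported()) {
    // Second download: fetch and instantiate the wasm module.
    const resp = await fetch("/lib/core.wasm");
    const { instance } = await WebAssembly.instantiate(
      await resp.arrayBuffer());
    return instance.exports;
  }
  // Fallback: the plain-JS implementation.
  return import("/lib/core.js");
}
```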


Currently, yes, you are right. But what I'm suggesting is that the current situation could be improved

An HTTP header bitmask set by a javascript testWASM() would be equivalent to what you suggest...

But it would avoid the js, and the second trip to the server
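One way to approximate this today, as a sketch: run the feature tests once and persist the bitmask in a cookie, so every later request carries it to the server. The bit positions and cookie name here are hypothetical:

```javascript
// Run the feature tests once, persist the result so subsequent
// requests carry it as a de-facto header. Bits are illustrative.
function testWASM() {
  return typeof WebAssembly === "object" ? 1 : 0;
}

function buildFeatureMask() {
  let mask = 0;
  mask |= testWASM() << 0;  // bit 0: wasm support
  mask |= 1 << 1;           // bit 1: js is running at all
  return mask;
}

// In a browser, stash the mask in a cookie (guarded so the
// sketch also runs outside a browser environment).
if (typeof document !== "undefined") {
  document.cookie = "features=" + buildFeatureMask() + "; path=/";
}
```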


The proposed UA Client Hints spec is a more privacy-oriented way to get some diagnostics data and feature detection from browsers

https://wicg.github.io/ua-client-hints/
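Roughly, the exchange works like this (header values are illustrative): the browser volunteers only low-entropy hints by default, and the server has to opt in via Accept-CH to receive higher-entropy ones on subsequent requests - which is where the round-trip mentioned above comes from:

```
Client request (low-entropy hints sent by default):
  GET /page HTTP/1.1
  Sec-CH-UA: "Chromium";v="112", "Not:A-Brand";v="99"
  Sec-CH-UA-Mobile: ?0
  Sec-CH-UA-Platform: "Linux"

Server opts in to higher-entropy hints for future requests:
  HTTP/1.1 200 OK
  Accept-CH: Sec-CH-UA-Arch, Sec-CH-UA-Full-Version-List
```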



