
Promiscuous cookies and their impending death via the SameSite policy - tomwas54
https://www.troyhunt.com/promiscuous-cookies-and-their-impending-death-via-the-samesite-policy/
======
hirsin
Just a nit on this point -

"Quick note on Microsoft's implementation: their first shot at it was buggy
and caused the "None" policy to omit the SameSite cookie attribute altogether"

This isn't a bug, just unfortunate naming and a lack of prescient engineers.
When it was written, None was not a valid enum value (and frankly still isn't)
and the default was Lax. To remove Lax, you set it to None in dotnet, which
produced the standard behavior at the time: emit nothing.
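For illustration, the same "unset means emit nothing" behavior shows up in
other stacks too; here's a sketch using Python's `http.cookies` (the names
here are mine, not from the dotnet API):

```python
from http.cookies import SimpleCookie  # "samesite" supported since Python 3.8

cookie = SimpleCookie()
cookie["session"] = "abc123"
# No samesite attribute set: nothing is emitted, analogous to the old
# "None means omit the attribute entirely" behavior described above.
print(cookie["session"].OutputString())  # session=abc123

# Explicitly opting in adds the attribute to the Set-Cookie value.
cookie["session"]["samesite"] = "Lax"
cookie["session"]["secure"] = True
print(cookie["session"].OutputString())
```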

If every framework implemented every random extension proposed to every
standard, we'd be in a very messy world with half-baked and sometimes
contradictory standards implemented. Special casing because it's Chrome doing
the breaking change to the standard/Internet is not a precedent I want to see.

~~~
Buge
>and frankly still isn't

It looks to me like it was added on April 9, 2019:

[https://github.com/httpwg/http-extensions/commit/fa624b1358b...](https://github.com/httpwg/http-extensions/commit/fa624b1358be1ef77a4fd2560c94593ecf7f508d)

~~~
hirsin
Note that this is the draft, and 6265bis-03 expired in October. While it's
presumed that this proposal will be accepted and become the new standard, a
single commit does not make the standard.

------
robbrown451
Chrome seems to have made one change recently -- maybe related to this? --
that made it so YouTube videos embedded in other sites no longer have you
logged into YouTube.

This means that if you have YouTube Premium, where you pay $10/mo so you don't
have to see ads on YouTube videos, you still see them when the video is on
someone else's site. Oddly, it still works as it used to in Brave.

Since this is sort of cross domain cookie related (YouTube uses an iFrame
typically when embedded, whether embedded by pasting html or by using the
JavaScript API), does anyone know why this happens? If you ask YouTube Premium
support, their canned answer is "we don't guarantee no ads except on the
YouTube site." But they can't explain why it works with no ads in non-Chrome
browsers.

~~~
nkozyra
_Is_ that cross-domain cookie related? I thought that, by virtue of being
sandboxed in the iframe, it had access to the cookies associated with its own
domain(s).

It sounds like an excuse to serve you ads, honestly.

~~~
lima
It would break other features (like the "Watch Later" button) too.

According to Occam's Razor, this is very likely to be a bug rather than a
conspiracy against YouTube Premium customers.

~~~
robbrown451
I agree it is probably a bug, partly since other browsers aren't affected.

------
ReidZB
I work for a company whose endpoints end up being embedded in iframes in
external systems, and this change is currently causing no small deal of
heartburn for several folks here.

I really don't like that the Powers That Be have decided to make this change,
for many reasons:

1\. Changing the default behavior of a thing on the 'net that's been around
for so long. Like the article says, I can't wait to see the various and sundry
things that are all broken as a result.

2\. To set SameSite=None and get back to the old behavior (more on this in a
second), you need to do... user agent sniffing, because some browsers (as the
sibling comment by swang says) will choke on None and fall back to Strict.
Great, so _sometimes_ set SameSite=None, sometimes don't set it or else things
will break. _eye roll_

3\. As the article says, SameSite=None is only allowed on Secure cookies,
which means you actually _can't_ get back to the old behavior. Now, my
company has been telling people for years to stop using HTTP (and customers
have to contact support to even get HTTP support enabled). However, there are
a few enterprise-y holdouts. In several cases, we've had to go be the bearers
of bad news. Admittedly, there is an undercurrent of glee in finally forcing
them to stop being bad stewards of data (assuming they don't just
enterprise-policy it away), but still, from a business perspective it's very
frustrating.

So, in sum, it'll break stuff that's not updated (my guess: a lot of stuff),
setting SameSite=None requires a user-agent-sniffing hack, and even setting
SameSite=None is not a complete solution if you're using HTTP for some reason.

And for what, exactly? This would've been nice in 1995, but it's a bit late
now. Though, I guess maybe twenty years from now we can rip out (or stop
writing) some anti-CSRF code or something.

~~~
tptacek
You don't fix security problems with the Same Origin Policy because you're
trying to free up some anti-CSRF code to make developer lives easier. You do
it because mistakes with that anti-CSRF code result in vulnerabilities, which
harm users, who are an externality both to developers and standards authors.
And those mistakes happen all the time.

The SameSite change we're talking about decisively mitigates most CSRF
vulnerabilities. Once widely deployed, it probably kills the bug class,
turning it into another bug bounty eye-roller like ClickJacking, rather than
what it is now: a bug that is routinely exploitable on significant sites. It
is more than worth it; it's one of the smartest things the browser vendors
have done in a while.

~~~
minitech
> turning it into another bug bounty eye-roller like ClickJacking

When did clickjacking get mitigated by default by browsers? As far as I know
it’s still up to websites to prevent framing explicitly.

~~~
tptacek
I'm not saying that CJ has been mitigated by default the way CSRF is poised to
be, but rather that it's very rarely exploitable, which is soon to be the case
for CSRF as well.

------
swang
Adding annoyance to this change is that Safari has a bug where, if you send a
value with an invalid parameter in SameSite ("lax" being considered an
invalid value), Safari will default to Strict (rather than None). Thus there
is another check developers have to handle: user-agent sniff for a
Safari/WebKit browser, then explicitly send "None" or not send the SameSite
value of "Lax".

I'm pretty sure the WebKit team is aware of it, but I don't recall a
timetable for a release that addresses the issue, so as of Nov 2019 (when I
last looked this up) Safari still had this issue.

~~~
jacobparker
Hey, there are some important mistakes in this warning:

1\. The value that is invalid for older browsers (including older versions of
Chrome!) is None, not Lax. It is always (as far as anyone knows) safe to
explicitly set SameSite=Lax in all browsers, assuming your site is ready for
that.

2\. The latest Safari (v13) has changed their behaviour to match the latest
spec.

See this article for details on detecting/dealing with it:
[https://www.chromium.org/updates/same-site/incompatible-clie...](https://www.chromium.org/updates/same-site/incompatible-clients)

TL;DR: old (but not too old) Chrome responds by rejecting the cookie entirely
(which Google says was a valid interpretation of the spec, at the time of
those versions) and old (but not too old) Safari responds by interpreting the
None value as Strict (I think there is some debate on whether the spec allowed
this back then, but at this point it doesn't matter/I don't care).
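The checks on that page boil down to a handful of user-agent tests; a rough
Python sketch (version cut-offs per that page, regexes simplified and
illustrative only):

```python
import re

def same_site_none_incompatible(ua: str) -> bool:
    """Rough approximation of the chromium.org incompatible-clients
    checks; production code needs the fuller regexes from that page."""
    # Chrome/Chromium 51-66 rejected cookies bearing SameSite=None outright.
    m = re.search(r"Chrom(?:e|ium)/(\d+)\.", ua)
    if m and 51 <= int(m.group(1)) <= 66:
        return True
    # iOS 12 and macOS 10.14 WebKit treated SameSite=None as Strict.
    if re.search(r"\(iP.+; CPU .*OS 12[_\d]*.*\) AppleWebKit/", ua):
        return True
    if re.search(r"\(Macintosh;.*Mac OS X 10_14[_\d]*.*\) AppleWebKit/", ua):
        return True
    return False
```

A server would then only attach SameSite=None when this returns False, and
omit the attribute for the incompatible clients.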

~~~
thayne
The latest Safari may have changed the behavior, but the bug is tied to the
OS version, and Apple has said they won't backport the new behavior to iOS 12
or Mac OS 10.14. So people who can't or won't upgrade the OS (for example,
users of old iPhones) will not get the fix. So user-agent sniffing will
probably be necessary for years.

> See this article for details on detecting/dealing with it:
> [https://www.chromium.org/updates/same-site/incompatible-clie...](https://www.chromium.org/updates/same-site/incompatible-clients)

Yes, they recommend using a few dozen lines of user-agent sniffing code,
despite the fact that user-agent sniffing is generally considered bad
practice.

~~~
jacobparker
> So user-agent sniffing will probably be necessary for years.

Only if you set None (either to opt-out or to do a None/Strict pair). Setting
Lax doesn't require sniffing.

But yeah, that seems like a safe bet.

------
buboard
100% of the warnings I see are ad cookies from Google. We get it, Google: you
no longer need cookies to track us.

~~~
download13
This won't stop cookie-based tracking though, they can just set
"SameSite=None" with their cookies

~~~
nikbackm
Wouldn't that make it trivial to block them all?

~~~
jacobparker
You mean block all SameSite=None cookies? They have legitimate uses too.

Consider that SameSite=Strict even breaks cross-origin links (<a> tags): if a
3rd party site links to you and a user clicks that link, the GET will be sent
without cookies.

To get value out of Strict for typical sites the new pattern is to have two
cookies: one is SameSite=None and allows you to do GET/HEAD/etc. requests
("read-only operations", assuming you are following those parts of the spec)
and one that is SameSite=Strict and allows you to do POST/etc. ("write
operations").

If [https://evil.com](https://evil.com) adds a link to your site (an <a> tag)
you can allow deep linking by only checking for the None cookie; the Strict
cookie won't be sent for <a> navigations. But for POSTs/form submissions, and
any page/resource you don't want to allow deep linking for, you would check
for both cookies.

I've seen this pattern referred to as "reader and writer cookie pairs".
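A minimal sketch of that pattern (cookie names and helper functions are made
up for illustration):

```python
def session_cookies(session_id: str) -> list:
    """Build the reader/writer pair as Set-Cookie header values."""
    return [
        # Sent on all requests, including cross-site ones like <a> links.
        f"reader={session_id}; SameSite=None; Secure; HttpOnly; Path=/",
        # Withheld on cross-site requests; required for write operations.
        f"writer={session_id}; SameSite=Strict; Secure; HttpOnly; Path=/",
    ]

def is_authorized(cookies: dict, mutating: bool) -> bool:
    """Reads need only the reader cookie; writes need both to match."""
    if mutating:
        return (cookies.get("writer") is not None
                and cookies.get("writer") == cookies.get("reader"))
    return cookies.get("reader") is not None
```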

\---

This really is specifically aimed at killing CSRF attacks. It's not about
tracking either way (it's orthogonal to that).

~~~
Buge
Why None instead of Lax? The use cases you mentioned for the None cookie seem
like they would still work with a Lax cookie.

~~~
jacobparker
Ah, good point. So it depends on your site. Some sites need to do things like
serve embeddable content or be an OAuth identity provider, etc., and
SameSite=None is required in those cases. Sorry for not being more clear about
that.

------
littlestymaar
I like the idea behind the change, but it's the first time I've heard about it
(and I assume I'm not alone), and it's being deployed in less than a month?!
Seriously, Google?

~~~
ergothus
I don't mean to be dismissive, but it has long been announced, discussed, and
noisily flagged in the console.

My honest question would be how you've missed it (because I'm assuming that
if you missed it, others would also reasonably miss it), but I have no idea
how you could know the answer to that.

~~~
bartread
> and noisily in the console.

This is true but not necessarily helpful because what the console is often
noisily complaining about is cookies from Google properties and the like.

For many cases where I'm not making use of those cookies myself that's simply
irrelevant... noise: what I need to understand are changes required for my own
cookies, on which the console has remained silent. Also, not something I'm
going to be paying attention to if I'm debugging something unrelated.

(I'm not saying the console is a bad place to show these warnings - far from
it - but there are plenty of reasons people might not spot them.)

I found out about the changes a while ago through HN but even that was months
after the announcement was made. I don't closely follow announcements from
Google simply because the vast majority of them aren't relevant to me. That
being the case it's quite easy to miss things, or find out about them further
down the line via another source.

~~~
ergothus
I understand the complaint about noise, but the message is fairly explicit as
to what is changing and what you need to do and where to go for more info:

"A cookie associated with a resource at
[http://google.com/](http://google.com/) was set with `SameSite=None` but
without `Secure`. A future release of Chrome will only deliver cookies marked
`SameSite=None` if they are also marked `Secure`. You can review cookies in
developer tools under Application>Storage>Cookies and see more details at
[https://www.chromestatus.com/feature/5633521622188032](https://www.chromestatus.com/feature/5633521622188032)."

A quick google (ha!) check shows articles from plenty of development and
security blogs (i.e. not from Google directly) going back to May, though a
LOT seem to be from the last few months. Not sure if that's because chatter
picked up or because Google is giving me more recent results, and I'm too
lazy to experiment - I definitely heard about it from multiple sources before
the original impact date in Oct.

Focusing on the point - Obviously this is a change Google should give "enough"
notice for (both time-wise and breadth-wise). What would you recommend them
doing differently than they have? At the end of the day, I'm not really sure
what they can do that they didn't do - indeed, since they delayed the original
release, there's a real risk of people ceasing to pay attention if you delay
too much.

I'm asking out of curiosity, not accusation.

~~~
bartread
No stress, and I do get it, but it's bound to happen that people don't find
out.

My issue with the console warning is that, as explicit as it is, unless you
already know at least some of the background it's not immediately obvious why
it's relevant to you as the developer of mysite.com.

------
zzo38computer
They mention changing your password with a POST request, but at least the
ones I have seen require the old password to be included in the request too.

Nevertheless, this is also a problem of web apps in general. In many cases
there are better protocols and better programs anyway.

In the case of cookies, there can be user settings: if the user defines a
cookie manually they can define whether it is sent with cross-site requests
or not, and if the server sends the cookie to you then by default it won't be
sent with cross-site requests. Cookies would always be sent for <a> links
outside of frames, though, unless the user configures otherwise (such as
disallowing it if there are query strings, for example).

Another thing I thought is a "Web-Option" request header. This is similar to
cookies but cannot be set by the HTTP response nor by document scripts; the
only way to set it is for the user to set it themselves. The response can
include a "Web-Option-Schema" header, which is a link to a file specifying
what options are valid; the user can use this or can specify their own options
which might or might not conform to the schema. (This is not meant for
authentication. For doing authentication, use basic/digest auth instead.)

------
universenz
Another great article from Troy. I personally believe that Google is
intentionally flagging their ad network to ensure a large majority of the
'web' and subsequent developers are made aware of the impending changes.
However, the console does not make it clear that we're talking about a change
that is being implemented in less than a month.

------
EGreg
I wasn’t really following this whole SameSite thing but between Safari and
Chrome and various versions it looks like they made the problem worse.

The idea is great. Basically browser vendors finally realized that most
websites don’t need cookies for cross-site requests, so they switched from
opting out via CSRF-busting techniques to opting in.

Except isn’t following cross-site links basically a GET request initiated by
a different referer? So now will strict mode not have me logged in when
someone follows a link to some site that set it? Is that why the default is
Lax? And under Lax, what about HTML form posts to top-level documents? Those
should go without cookies, right?
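For what it's worth, the draft's Lax rules answer both questions: link
navigations keep you logged in, cross-site form POSTs don't. A quick summary
(my own paraphrase of the draft, not authoritative):

```python
# Whether a cross-site request carries a SameSite=Lax cookie, per the
# 6265bis draft: top-level navigations with a "safe" method (GET, HEAD)
# get the cookie; everything else does not. (My summary; see the draft.)
LAX_SENDS_COOKIE = {
    "top-level <a> link navigation (GET)": True,
    "top-level form POST": False,
    "iframe/img/script subresource": False,
    "fetch()/XHR from another origin": False,
}
```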

------
thayne
I agree that making SameSite=Lax the default is the right thing to do.
However, I think Chrome is moving too quickly. Making SameSite=Lax the
default while a substantial number of users use browsers that don't support
SameSite=None seems like a mistake.

------
zallarak
So these cookies may die but it’s a perpetual arms race. Browser
fingerprinting will (or has) replace(d) cookies for tracking purposes.

~~~
simonw
Browser fingerprinting is a hack, and exploits clear loopholes in browser
privacy models.

I wouldn't rely on it because it's committing to an ongoing arms race against
the browsers. One that I expect them to win.

~~~
nordsieck
> Browser fingerprinting is a hack, and exploits clear loopholes in browser
> privacy models.

> I wouldn't rely on it because it's committing to an ongoing arms race
> against the browsers.

It doesn't seem to me that browsers are trying to win at all. For example,
one of the greatest discriminators - the font list - has been known about for
as long as people have been talking about browser fingerprinting.

The fix would be pretty easy too: in incognito mode (or when toggled by the
user), only support two fonts: one serif and one sans-serif that ship with
the browser on all platforms.

I don't think any of the browsers want to do that.

There are a number of other longstanding fingerprinting issues that are
similarly easy to fix.

~~~
kevin_thibedeau
You'd need a standardized font rendering engine to defeat fingerprinting via
canvas.

~~~
michaelt
"Same canvas image looks the same on every browser" seems like a desirable
state of affairs to me?

~~~
Buge
I think the problem is that canvas can be GPU-accelerated, and GPUs don't have
an exact standard for how each pixel will look.

