Website operators need to do nothing to get the powerful protection of SameSite-by-default.
Website operators who depend on the existing behavior will have to make changes for their sites to work in Chrome at all. Their options:
* send nothing, and break in Chrome 80+ because of the new default
* send "SameSite=None", and break on any Safari besides the latest because of the bug
* send anything else, and break because that's what SameSite does
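The usual workaround for that trap is user-agent sniffing: only emit `SameSite=None` for clients known to handle it, and omit the attribute for the buggy Safari versions that treat an unknown value as `Strict`. A rough sketch, assuming a plain string header API; the version patterns are illustrative, not an exhaustive incompatible-client list:

```python
import re

def same_site_none_ok(user_agent: str) -> bool:
    """Heuristic: is it safe to send 'SameSite=None' to this client?

    Safari on iOS 12 and macOS 10.14 treats an unrecognized SameSite
    value (including 'None') as 'Strict', so we omit the attribute there.
    The patterns below are illustrative, not exhaustive.
    """
    # WebKit on iOS 12 (covers Safari and embedded web views)
    if re.search(r"\(iP.+; CPU .*OS 12[_\d]*.*\) AppleWebKit", user_agent):
        return False
    # Safari on macOS 10.14 (Mojave); Chrome on macOS is unaffected
    if ("Macintosh; Intel Mac OS X 10_14" in user_agent
            and "Safari" in user_agent and "Chrome" not in user_agent):
        return False
    return True

def set_cookie_header(name: str, value: str, user_agent: str) -> str:
    """Build a Set-Cookie value, adding SameSite=None only when safe."""
    attrs = [f"{name}={value}", "Path=/", "Secure", "HttpOnly"]
    if same_site_none_ok(user_agent):
        attrs.append("SameSite=None")
    return "; ".join(attrs)
```

This trades spec cleanliness for compatibility, which is exactly the kind of sniffing the comment above is complaining about having to do.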
Heckuva job, Chrome team.
The Safari bug is annoying today, but it ultimately didn't impact anyone who was spec-compliant. Chrome's change does impact people who are spec-compliant.
The notion that vendors are beholden to standards groups is a real problem. It's what got us stuff like Heartbleed.
In these projects, rather than trying to solve some particular problem (or group of problems) and using standards on the path to that solution, the project just throws whatever happened to attract somebody's interest in a standard into a big heap of cool toys, without rhyme or reason.
I think we actually just got lucky with Heartbleed: it could easily have been the case that, to make the heartbeat extension work, you needed to add 40 lines of custom code to every program, even though it would always be identical boilerplate. After all, it took them years to add a sane API for "just check the bloody hostname in the certificate matches". But that isn't how it worked out.
If you compare Python's "batteries included" philosophy, OpenSSL and a few other libraries take something closer to: "I just keep everything in this old cardboard box, try looking in there?". And sure enough there are batteries, although they seem to be covered in a sweet-smelling sticky substance, there is also a broken Gamecube, one cufflink with a brand logo you don't recognise, a chocolate bar dated 1986, a PS/2 to USB adaptor, a C60 cassette, two dried-out PostIt notes, one sock, a 40cm USB cable with a mini-B connector, and the spare fuses from a 2005 Ford Focus...
Developers determine standards, and it would be pointless to make standards any other way because the standards wouldn't ever be implemented if there were no developers who wanted to implement them.
Since the spec is still in draft, changes are expected. This isn't even the first change, we're already on version 3. And this version expires next month, so we're literally, explicitly, due for a new revision.
So the Chrome team wants to make their own users safer. Good for them. In the absurdly stupid scenario where sites are making cookie-mediated blind cross-site POST requests, they'll have to come up with a better interaction model. Once we figure out who these hypothetical idiots are and whether they exist at all, we can all shed a tear for them and the extra 7 hours of work they'll have to do in order to adjust. But in the mean time, all Chrome users become immediately immune to an entire class of vulnerabilities. As far as trade-offs go, I'd take it. Good on you, Chrome team.
Sure, site owners can't just relax yet and call the problem transparently solved; legacy browsers will continue to exist for years. But that's hardly a reason not to make progress.
The spec compliance argument is utter nonsense. You know what else broke the spec? Ad block. No-script. Popup blocking. Private browsing. Third-party cookie blocking. Pretty much everything that has made any particular browser "good" has been a deviation from the spec.
These "hypothetical idiots", in my case, are developers who used the off-the-shelf libraries and SDKs built to spec, which rely on third-party cookies for things like logging users in and logging them out.
You know the number one use case for 3rd party cookies in that scenario? Ensuring no one has injected or otherwise fraudulently initiated an auth request when the IDP posts back to your site. That forgery check is now broken. In the words of a wise gent from a couple years back, congrats, you played yourself.
Want to set SameSite in your language of choice? Enjoy setting headers manually now and praying, because even if your framework does have SameSite support, it will probably emit nothing if you choose None, because that's what the spec said to do.
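When the framework's cookie API silently drops the attribute, the fallback really is appending the raw header yourself. A minimal framework-agnostic sketch, where `response_headers` is a list of (name, value) tuples as in WSGI, and `sessionid` is an illustrative cookie name:

```python
def add_session_cookie(response_headers, token):
    """Append a Set-Cookie header by hand so 'SameSite=None' survives,
    bypassing a cookie API that would silently drop the attribute."""
    cookie = f"sessionid={token}; Path=/; Secure; HttpOnly; SameSite=None"
    response_headers.append(("Set-Cookie", cookie))
    return response_headers
```

Note that `Secure` is mandatory alongside `SameSite=None` in Chrome 80+, so the cookie only works over HTTPS.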
I wish more developers understood why this is important.
> To some people, a system with a security vulnerability isn't "working".
Yes, we have seen that argument more than a few times before, usually from someone justifying their personal agenda without regard for its impact on other people, despite the availability of less disruptive options. Ego is a funny thing.
Obviously, there are times when lack of immediate action would have dire consequences, but I think it would be disingenuous to claim that was the norm, or to claim that this Chrome change is an example.
It is indeed worth weighing the cost of making a breaking change, but given that it happens all the time in all sorts of projects, it's clear that only very few projects subscribe to your notion that breaking users is an absolute sin.
Even Linux has broken userspace programs when fixing bugs. It's only happened a few times, but it happens.
In fact, maybe sites should start recommending something like that. (Microsoft's wallet sense must be twinging.)
(I recently also switched back to Windows 10 after a decade+ on Mac OS. I never got satisfactory performance from Firefox on Mac, so part of trying it again was hoping that the perf story on Windows was better, which indeed it is.)
At this point I just don’t trust Google nor Chrome. All the incentives are there for Google to turn Chrome into adware. The evidence is pointing in that direction. And the trust is no longer there.
It’s funny to think about (and this is of course a gross caricature) but Chrome feels more and more like they took all the annoying adware/bloatware from the 2000s and disguised it in a browser—except they figured out how to do it without anybody noticing. Today, we are all Grandma.
> I never invited some third party to run their code on my computer
As far as I know, there's no extension that can entirely prevent a website from loading resources from another domain. If there were, I imagine it would cause a lot of websites to lack styling, have images/videos fail to load, etc.
My intent was more of an "Old Man Yells At Cloud" thing; I was just kvetching because the internet isn't as user-focused as it could be. Calling a bunch of resources from all over the place is a user-hostile behavior, and I'm generally unsympathetic when sites break because browsers narrow the range of user-hostile behaviors they support.
Google can and will do whatever they want, and there is no amount of complaining that will actually change their mind. Just look up how they ham-fisted their autoplay policy into the ecosystem despite breaking every website. YouTube is whitelisted so it doesn't need to follow their autoplay guidelines, though :)
No they don't. Eight out of the Top Ten websites are not owned by Google.
If Chrome became the browser that broke major websites all the time, people would switch really quick.
> Just look up how they ham-fisted their autoplay policy into the ecosystem despite breaking every website.
It didn't break "every website", because most websites never had any obnoxious autoplaying media to begin with.
Popup Blockers also broke "every website" that had the guts to open pop ups, most of which were not solicited by the user.
In either case, strong user preference was behind these changes. It didn't need some consortium of conflicting interests to standardize acceptable website behavior.
If users didn't matter and it was all about Google's interests, you wouldn't have adblockers in Chrome. You wouldn't be able to watch Youtube flawlessly without ever seeing any ads.
You will still need explicit CSRF defenses for the next several years. But sites that employ CSRF defenses today still have CSRF vulnerabilities, just as they still have XSS vulnerabilities despite vdom frameworks and default output filtering. It's better to have a failure mode where modern browsers are effectively immune to the attack than one in which everyone is vulnerable.
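For context, the classic explicit defense is the synchronizer token: the server stores a random token in the session, embeds it in forms, and rejects state-changing requests whose submitted token doesn't match. A minimal sketch, where `session` is just a dict standing in for a real framework session store:

```python
import hmac
import secrets

def issue_csrf_token(session: dict) -> str:
    """Generate a per-session token and remember it server-side.
    Embed the returned value in a hidden form field."""
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def check_csrf_token(session: dict, submitted: str) -> bool:
    """Constant-time comparison of the submitted token against the
    session's stored token; rejects missing tokens outright."""
    expected = session.get("csrf_token")
    return bool(expected) and hmac.compare_digest(expected, submitted or "")
```

SameSite-by-default makes this check redundant in modern browsers, but as the comment says, it stays necessary for the long tail of older clients.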
So, when it first came out, it was not okay to break things by turning it on by default, but now it's okay to break things by making it the default.
I'm starting to feel like we're back in the Internet Explorer 6 age (in terms of having one browser that dominates and can arbitrarily change the rules/standards of the web).
Can someone speak to those criticisms?
However, if the standard were changed to disallow cross-site requests in browser implementations, then existing sites would break, i.e. it would break the web. Sometimes that's worth it, sometimes it isn't, and it's a careful balance between how many sites would break vs. the impact of the security vulnerabilities that would be prevented.
(disclosure, I work at Google, but not on Chrome (any more) and have no particular knowledge of the CSRF cookie change)
Whether it is a good idea is going to depend on a whole bunch of architectural factors.
If it's a static JS library, say jQuery, then self-hosting seems sensible. You can host it on your own CDN if needed.
What about if it is, say, a credit card processing service? Well, now you are handling credit card data through your site, so perhaps PCI compliance might apply?
Also, you are now being the MITM to avoid an evil MITM, so you have more responsibility than ever to ensure security. Make sure your proxy is as secure as possible. Does it accept bad certificates, for example?
1. Browser loads the shopping cart page on example.com to transmit credit card details
2. User enters credit card details in to the form and hits enter.
3. The browser resolves the IPs for domain api.creditmerchant.net and adds static routes for those IPs pointing to example.com
4. The browser initiates SSL connection to api.creditmerchant.net using example.com as a static route
5. The browser verifies the authenticity of the certificate and chooses ciphers for the conversation
6. The credit card details are encrypted and transmitted to api.creditmerchant.net without being exposed to example.com
7. The purchase is complete and example.com is not in scope for PCI.
*. If example.com is set as the static route for any unknown dependencies, the traffic is null-routed
So api.creditmerchant.net could still inject all kinds of malicious or buggy scripts and example.com can't do anything about it.
The one difference I see is that api.creditmerchant.net could restrict its endpoints to only accept packets from example.com's addresses, as they should never be called directly from a browser. This sounds like some ad-hoc CSRF protection. Was that the intention?
It seems like a spoiler for chaining attacks.
I think the "strict" setting is for the proposed "two sessions cookies" design pattern: You have a basic "I'm logged in" cookie with SameSite=lax and a second "I can do things" cookie with SameSite=strict. The first cookie gives you read access to your account and tracks your session - however, to perform any actions on your account, you need both cookies.
If you e.g. clicked on a link to facebook.com and had previously logged in, the browser will send the "lax" cookie with the request, so that your session is correctly tracked and Facebook shows you the logged-in view.
However, the "strict" cookie will NOT be sent, so even if by some accident you have a state-changing endpoint that accepts GETs, an attacker could still not trick you into invoking it via a link.
I agree though that if you have state-changing GETs, you have bigger problems, so the use case for "strict" cookies seems a bit niche. I suspect the common case will be "lax" everywhere and "none" for special applications - i.e. what the default is encouraging.
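The two-cookie pattern described above can be sketched as two Set-Cookie headers on login plus a check on state-changing endpoints. The cookie names `session` and `session_strict` are illustrative:

```python
def login_headers(token: str):
    """Issue both cookies on login: a Lax cookie that tracks the session
    on top-level navigations, and a Strict cookie that gates writes."""
    return [
        ("Set-Cookie",
         f"session={token}; Path=/; Secure; HttpOnly; SameSite=Lax"),
        ("Set-Cookie",
         f"session_strict={token}; Path=/; Secure; HttpOnly; SameSite=Strict"),
    ]

def may_mutate(cookies: dict) -> bool:
    """A state-changing endpoint requires both cookies and that they
    agree; a cross-site navigation only carries the Lax one, so it
    gets read-only access."""
    lax = cookies.get("session")
    strict = cookies.get("session_strict")
    return lax is not None and lax == strict
```

A request arriving via an external link carries only `session`, so the logged-in view renders but `may_mutate` returns False.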
It’s not something you notice until your site gets hacked. Or if you do notice it, it’s in a security audit at the end of development and the ticket to fix it might get buried in a backlog below the features and bugs users actually see and want.
Having the protection on by default is the only way to solve this for good, it’s how cookies always should have been.
Making cookies SameSite by default will definitely break a lot of things, though. I implemented SameSite cookie functionality in a library at work, and we had several issues with it breaking stuff and confusing people when they updated to the new secure version of the library.
There would be few security issues surrounding web apps if devs were security-conscious; unfortunately, only a few are.
There are many mitigations for security vulnerabilities surrounding web apps, but CSRF is very much alive, and it is a common finding of mine.
A token can securely prove that it was issued by the server/service, and under what conditions, without the server/service statefully tracking the token after issuing it.
I know I'm not the first to think of this, but I'm not sure how widely used this sort of technique is in practice.
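One common shape for such a token is an HMAC signature over the claims, verifiable without any server-side state. A simplified sketch using only the standard library; real deployments would rotate keys and use an established format like JWT:

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"server-side-secret"  # in practice: loaded from a key store

def issue(claims: dict, ttl: int = 3600) -> str:
    """Sign the claims plus an expiry so the server can later verify
    that it issued this token, without storing anything."""
    body = dict(claims, exp=int(time.time()) + ttl)
    payload = base64.urlsafe_b64encode(json.dumps(body).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return f"{payload.decode()}.{sig}"

def verify(token: str):
    """Return the claims if the signature and expiry check out, else None."""
    try:
        payload, sig = token.rsplit(".", 1)
    except ValueError:
        return None
    good = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(good, sig):
        return None
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims if claims.get("exp", 0) > time.time() else None
```

The conditions (here just an expiry) ride along inside the signed payload, which is the property the comment above describes.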
In the case of XSS you've lost anyway. But with HttpOnly cookies, the attacker at least can't steal your token and use it to do everything from everywhere.
Ironically, this breaks CSRF protection in OIDC authentication systems (except Google, since they don't implement the form_post standard).
I have no objections on that front.
I wish the Chromium team was working more with other browsers to make this a more coordinated change. At the same time, I don't think that there would be significantly less breakage if it was. No matter what, this is 100% going to break websites -- there is just no way to roll out a change like this without disrupting operations. It's gonna be a mess, and it would still be a mess even if all the browsers rolled out this change together.
It feels kind of like the JS `typeof null === "object"` situation. Everyone agrees the existing behavior is wrong, but we're not sure when and how to fix it.
So I'm a little conflicted. I understand completely why the Chromium team wants this, and I also understand why some people are going to be upset about it. Blocking CSRF by default in a browser is really good. We'd also really like to avoid breaking the existing web.
The same change is in development for firefox as well: https://groups.google.com/forum/#!msg/mozilla.dev.platform/n...
Ideally we could do this by intentionally making the SameSite cookie syntax non-backwards-compatible.
Implementing common anti-XSRF mechanisms like a session token in a form field or extra http header requires modifying every place your app communicates with the server. It’s anything but easy which is why so many apps/websites not built with XSRF in mind still have vulnerabilities.
Sure it takes a bit of JS to pass the data as part of a request, but at least you're not prone to CSRF issues.
I'm not sure there are many use cases where I really need cookies if I have local storage and JS available.
James Kettle wrote a good blog post that argues that webstorage is probably a better spot for session tokens here: https://portswigger.net/blog/web-storage-the-lesser-evil-for...
CSP plus trusted scripts...you should be working hard to prevent XSS.
Can't they be easily handled with an iframe?
Several possible issues there:
- If the session is large, it eats space on the user's machine and bandwidth in requests
- The session can't be shared across devices
- Security concerns. You don't want to trust the user to tell you what their current state is - especially if it's "I have this much money in my account" and the like. Even if you encrypted the data, they could resend the same state at a later time - "oh look, I have a full wallet again!"
You're much safer if all the user sends is "here's who I am" and every bit of associated information is under your control server-side.
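That's the standard server-side session pattern: the client holds only an opaque, unguessable ID, and every piece of state lives on the server. A minimal in-memory sketch; a real deployment would back this with a shared store like Redis:

```python
import secrets

class SessionStore:
    """All state stays server-side; the client only ever sees the key."""

    def __init__(self):
        self._sessions = {}

    def create(self, data: dict) -> str:
        sid = secrets.token_urlsafe(32)  # opaque ID to put in the cookie
        self._sessions[sid] = data
        return sid

    def get(self, sid):
        """Look up state by ID; unknown/forged IDs return None."""
        return self._sessions.get(sid)

store = SessionStore()
sid = store.create({"user": "alice", "balance": 100})
# The cookie carries only `sid`; the balance can't be replayed or
# forged client-side because the client never holds it.
```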
I'm certainly not anti-JS, I like the language quite a bit and often stick up for it when it gets bashed on HN. But the ability to fall back on non-JS solutions to some problems is an important part of the web, and I wouldn't like to see it disappear. Particularly while we're in the middle of a fight over user-tracking. I think it's important to support graceful fallbacks, and cookies are a pretty good way of doing that.
Each step there will make you slightly safer, depending on what percentage of malicious code you want to block. Sometimes trackers are served as 1st-party requests.
So it's not an indefinite, "no web-code ever" position. It's "be more careful than usual, because an abnormally high number of bad actors are focused on this platform, and not everything is safe-by-default." Ignoring the debate over site-breakage, the changes here around CSRF should be a decent step in that direction.
But I would guess most people don't fall into that category, that's probably just me being weird.
As usual, it's supported almost everywhere, apart from any IE that's not the latest version running on an up-to-date Windows 10. So we can't really use it unless we want to leave lots of users insecure.
I have no idea why SameSite capability would ever depend on the host OS. Shame on Microsoft for not supporting a relatively simple-to-implement security feature on older (but still supported) operating systems.
This needs a different title.
This is NOT saying CSRF is dead.
It is saying "same-site cookies are an alternative defense against CSRF".
If a site sets its session cookie with SameSite=Strict, then clicking a link to that site from a third-party website will not include the session cookie in the request.
For example, if GitHub implemented SameSite strict for its session cookie and you clicked a link on a site that took you to GitHub, it would not send the cookie for GitHub and it will look like you're not logged in on GitHub, even though if you opened a new tab and went to GitHub you would be logged in.
Lax seems like it should have been named something else, like Partial or Moderate.