
CSRF is really dead - psanford
https://scotthelme.co.uk/csrf-is-really-dead/
======
al2o3cr

        Website operators will need to do nothing to have the powerful
        protection of SameSite by default
    

Alternative phrasing:

        Website operators who depend on the existing behavior will have to
        make changes to have their sites work in Chrome at all

IMO this is a useless default - sites will still need existing CSRF protection
techniques since not everyone runs Chrome. It will marginally increase
security on a small fraction of sites (those that DGAF about CSRF) and break a
small fraction of sites.

~~~
al2o3cr
It gets better: the linked Chrome feature "Cookies default to SameSite=Lax"
mentions a bug in Safari (fixed in a not-yet-released version) that causes
cookies with "SameSite=None" to not be sent in cross-site requests. So when
Chrome 80 comes out, your options for setting cross-site cookies will look
like:

* send nothing, and break in Chrome 80+ because of the new default

* send "SameSite=None", and break on any Safari besides the latest because of the bug

* send anything else, and break because that's what SameSite does

Heckuva job, Chrome team.
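For what it's worth, the workaround that emerged for this situation was to sniff the user agent server-side and withhold the SameSite attribute entirely for the known-buggy clients. A minimal sketch (the UA patterns and function names here are illustrative, not an exhaustive or authoritative list):

```python
import re

def samesite_none_compatible(user_agent: str) -> bool:
    """Heuristic: return False for clients known to mishandle SameSite=None.

    The patterns are illustrative rather than exhaustive; the known
    offenders are all iOS 12 browsers and Safari on macOS 10.14, which
    reject cookies marked SameSite=None instead of sending them cross-site.
    """
    if re.search(r"\(iP.+; CPU .*OS 12[_\d]*.*\) AppleWebKit/", user_agent):
        return False  # iOS 12: WebKit bug drops SameSite=None cookies
    if re.search(r"Macintosh;.*Mac OS X 10_14.*Version/.* Safari", user_agent):
        return False  # Safari on macOS Mojave has the same bug
    return True

def build_set_cookie(name: str, value: str, user_agent: str) -> str:
    """Emit SameSite=None only for clients that handle it correctly;
    buggy clients get no SameSite attribute at all."""
    header = f"{name}={value}; Secure"
    if samesite_none_compatible(user_agent):
        header += "; SameSite=None"
    return header
```

Serving different `Set-Cookie` headers per user agent is ugly, but it was effectively the only way to satisfy both Chrome 80+ and the affected Safari versions at once.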

~~~
slowmovintarget
At home I've switched back to Firefox. Seems like I need to complete that
transition at work to continue enjoying things like ad blocking and functional
sites.

In fact, maybe sites should start recommending something like that.
(Microsoft's wallet sense must be twinging.)

~~~
mikepurvis
I switched back to Firefox recently as well— the containers plugin has me
completely won over. It takes a while to get in the groove of it, but surfing
the internet and not being logged into Google and Facebook everywhere is great
and totally worth it.

(I recently also switched back to Windows 10 after a decade+ on Mac OS. I
never got satisfactory performance from Firefox on Mac, so part of trying it
again was hoping that the perf story on Windows was better, which indeed it
is.)

------
Osiris
> When SameSite first came out, nobody wanted to make it the default. It could
> break things, change expected (legacy) functionality and generally speaking
> the worry of breaking stuff stops us making progress.

So, when it first came out, it was not okay to break things to turn it on by
default. But, now it's okay to break things by making it the default.

I'm starting to feel like we're back in the Internet Explorer 6 age (in terms
of having one browser that dominates and can arbitrarily change the
rules/standards of the web).

~~~
VeryHacker
Well, they've had plenty of time to test the feature. Now time's up.

------
birdyrooster
I was mocked at Black Hat when I asked a question several years ago about why
people don't consider forward-proxying all third-party dependencies for the
user. Then we could disable cross-site requests entirely. They said that doing
so would fundamentally break the web. I think it would make it so your user
could never be MITM'd or DNS-spoofed to a malicious imposter for that
dependent service or asset.

Can someone speak to those criticisms?

~~~
quickthrower2
Interesting question.

Whether it is a good idea is going to depend on a whole bunch of architectural
factors.

If it's a static JS library, say jQuery, then self-hosting it seems sensible.
You can host it on your own CDN if needed.

What if it is, say, a credit card processing service? Well, now you are
handling credit card data through your site, so perhaps PCI compliance might
apply?

Also, you are now being the MITM to avoid an evil MITM - so you have more
responsibility than ever to ensure security. Make sure your proxy is as secure
as possible. Does it accept bad certificates, for example?

~~~
birdyrooster
In theory, we can still enable end-to-end encryption while forward proxying by
a separate service provider, but it might not be easy to implement. The
browser would need to be smart enough to route packets through the website and
verify SSL coming out of the PCI environment. This will require modifications
to the OS to dynamically add static routes for dependencies.

1. Browser loads the shopping cart page on example.com to transmit credit
card details

2. User enters credit card details into the form and hits enter.

3. The browser resolves the IPs for domain api.creditmerchant.net and adds
static routes for those IPs pointing to example.com

4. The browser initiates an SSL connection to api.creditmerchant.net using
example.com as a static route

5. The browser verifies the authenticity of the certificate and chooses
ciphers for the conversation

6. The credit card details are encrypted and transmitted to
api.creditmerchant.net without being exposed to example.com

7. The purchase is complete and example.com is not in scope for PCI.

*. If example.com is set as the static route for any unknown dependencies, the traffic is null-routed

~~~
xg15
But then, what exactly was won? It seems to me the request to
api.creditmerchant.net is still a mostly ordinary cross-site request, except
the packets are routed through example.com - but example.com can't do anything
with the packets, because they are encrypted.

So api.creditmerchant.net could still inject all kinds of malicious or buggy
scripts and example.com can't do anything about it.

The one difference I see is that api.creditmerchant.net could restrict its
endpoints to only accept packets from example.com addresses - as they should
never be called directly from a browser. This sounds like some ad-hoc CSRF
protection. Was that the intention?

~~~
birdyrooster
Yes, I think that was the intention. You reduce the attack surface for
injection attacks because the injected code must route all requests through
example.com. Example.com will implicitly trust api.creditmerchant.net as it
does today, but once loaded, that dependency categorically won’t be able to
make requests to untrusted resources, because it must route through
example.com.

It seems like a spoiler for chaining attacks.

------
javagram
This is good news I think. Devs often just don’t know or pay attention to
stuff like XSRF in my experience as a professional developer.

It’s not something you notice until your site gets hacked. Or if you do notice
it, it’s in a security audit at the end of development and the ticket to fix
it might get buried in a backlog below the features and bugs users actually
see and want.

Having the protection on by default is the only way to solve this for good,
it’s how cookies always should have been.

Turning cookies to same site by default will definitely break a lot of things
though. I implemented SameSite cookie functionality in a library at work and
we had several issues with it breaking stuff and confusing people when they
updated to the new secure version of the library.

------
V3ritas1337
Calling CSRF dead is the wrong way to interpret it.

There would be few security issues surrounding web apps if devs were security
conscious; unfortunately, only a few are.

There are many mitigations for security vulnerabilities surrounding web apps,
CSRF is very much alive, and it is a common finding of mine.

~~~
noncoml
I think he means CSRF prevention hacks are dead, because, despite being the
norm, adding a CSRF token in each POST request is indeed an ugly hack.
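The hack in question - the synchronizer token pattern - can be sketched minimally like this (the names are illustrative, and `session` stands in for whatever server-side session store a framework provides):

```python
import secrets

def issue_csrf_token(session: dict) -> str:
    """Store a fresh random token in the session and return it,
    to be embedded as a hidden form field:
    <input type="hidden" name="csrf_token" value="...">"""
    token = secrets.token_urlsafe(32)
    session["csrf_token"] = token
    return token

def verify_csrf_token(session: dict, submitted: str) -> bool:
    """On POST, the submitted form field must match the stored token.

    An attacker's cross-site form can't know the token, so the
    forged request fails even though the cookies were sent along.
    """
    expected = session.get("csrf_token")
    return expected is not None and secrets.compare_digest(expected, submitted)
```

The ugliness is less in the mechanism itself than in having to thread that hidden field (or header) through every form and AJAX call in the application.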

~~~
edoceo
Not ugly, it's just passing a state-key around - a common practice in loads of
other communication channels.

~~~
AmericanChopper
That doesn’t really fit with modern web interfaces though. REST and GraphQL
are both stateless architectures. Tracking and mutating state on every
request doesn’t fit very well with that.

~~~
MaxBarraclough
Depending on exactly what we're doing, we can work around this with crypto,
right?

A token can securely prove that it was issued by the server/service, and under
what conditions, without the server/service statefully tracking the token
after issuing it.

I know I'm not the first to think of this, but I'm not sure how widely used
this sort of technique is in practice.

~~~
AmericanChopper
That’s exactly how authentication JWTs work. They work differently from CSRF
tokens, though, because a CSRF token is generated for each request, not each
‘session’ (so you need much more complex server-side state management). That
said, CSRF isn’t particularly relevant if you’re using a JWT in the
Authorization header. CSRF exploits the fact that browsers will use cookies to
automatically authenticate requests. If you’re not using cookies for
authentication, and adding a JWT to the headers instead, then that automatic
authentication doesn’t occur. Correct me if I’m wrong, but I’m not aware of
any CSRF attack that targets Authorization headers; I believe any attack that
does is really an XSS.

~~~
exelib
Well, where do you store your JWT on the client? localStorage? Then you
probably have a much broader attack surface.

In the case of XSS you've lost anyway. But with HttpOnly cookies the attacker
can't steal your token and do everything from everywhere with it.

~~~
AmericanChopper
XSS is a problem you have to solve no matter where you store your
authentication material. If you have an XSS, then an attacker can do anything
they want on your web pages. I’m not sure you’re gaining anything by saying
“anything but steal the auth token”. If you store it as a cookie, now you have
to solve XSS and CSRF, so I’d say that makes your attack surface broader.
Especially considering front end frameworks have become very good at
preventing XSS, and can’t do anything about CSRF.

------
hirsin
Has there ever been a breaking change approved by the IETF? I'm not familiar
enough with their history to know of a precedent. It seems like the
spec/standard should be approved at least before we go about asking developers
to incur hundreds of millions of dollars in updates so their sites continue
working in Chrome.

Ironically, this breaks CSRF protection in OIDC authentication systems (except
Google, since they don't implement the form_post standard).

------
danShumway
I agree with the change in theory (although as a side-note I wish users had
more control over cookie/request headers, and it wasn't just site-operators
that could set policies). It's a reasonably obvious oversight that cookies
aren't same-site by default, and the thoughts behind this seem pretty solid.
There's even a good point that I've seen made that secure-by-default might
give browsers/blockers a useful metric for identifying likely tracking
cookies.

I have no objections on that front.

I wish the Chromium team was working more with other browsers to make this a
more coordinated change. At the same time, I don't think that there would be
significantly less breakage if it was. No matter what, this is 100% going to
break websites -- there is just no way to roll out a change like this without
disrupting operations. It's gonna be a mess, and it would still be a mess even
if all the browsers rolled out this change together.

It feels kind of like the JS ``typeof null === "object"`` stuff. Everyone
agrees the existing behavior is wrong, but we're not sure when and how to fix
it.

So I'm a little conflicted. I understand completely why the Chromium team
wants this, and I also understand why some people are going to be upset about
it. Blocking CSRF by default in a browser is _really_ good. We'd also really
like to avoid breaking the existing web.

It's complicated.

~~~
thayne
> I wish the Chromium team was working more with other browsers to make this a
> more coordinated change.

The same change is in development for firefox as well:
[https://groups.google.com/forum/#!msg/mozilla.dev.platform/n...](https://groups.google.com/forum/#!msg/mozilla.dev.platform/nx2uP0CzA9k/BNVPWDHsAQAJ)

------
jasonhansel
What we really want is a failsafe way of setting cookies: i.e. a way of
setting cookies that won't work on browsers without SameSite support.
Otherwise this solution will still leave a security hole for the many users
not on (the latest version of) Chrome.

Ideally we could do this by intentionally making the SameSite cookie syntax
non-backwards-compatible.

~~~
feanaro
So... Break the web for anyone not on the latest version of Chrome?

~~~
jasonhansel
No, just give devs the ability to easily catch the error & use an alternative,
secure solution.

~~~
javagram
If there were an easy, alternative secure solution then developers would
already be using it.

Implementing common anti-XSRF mechanisms like a session token in a form field
or an extra HTTP header requires modifying every place your app communicates
with the server. It’s anything but easy, which is why so many apps/websites
not built with XSRF in mind still have vulnerabilities.

------
tptacek
Previously:
[https://news.ycombinator.com/item?id=19853090](https://news.ycombinator.com/item?id=19853090)

------
osrec
Could we just avoid cookies altogether, and store session info in local
storage?

Sure it takes a bit of JS to pass the data as part of a request, but at least
you're not prone to CSRF issues.

I'm not sure there are many use cases where I really _need_ cookies if I have
local storage and JS available.

~~~
jusob
But any XSS would give access to your authentication token, which is why you
should never store it in local storage. Cookies have the HttpOnly flag, which
prevents JavaScript from accessing the cookie in case of XSS.

~~~
paulddraper
XSS is usually really, really bad anyway.

CSP plus trusted scripts...you should be working hard to prevent XSS.

~~~
earthboundkid
Ideally, yes. In the current advertising market? No.

~~~
paulddraper
I don't know much about advertising.

Can't they be easily handled with an iframe?

------
badrabbit
Forgive my ignorance, but from what I read, making SameSite=Lax the default is
a Chrome-only thing at this time, right? If so, how is CSRF dead? Are all
major browsers following suit?

~~~
r00fus
Because Google’s position on the web today is like Microsoft 15 years ago -
overwhelmingly dominant.

~~~
badrabbit
Yeah, but even 1% share is a lot given the sample size. I guess people are
accepting monopolies as standard bearers.

------
thayne
I wouldn't say CSRF is dead yet. Chrome is just one browser, until all major
browsers have implemented this (and IE11 is still kicking), you still need to
implement CSRF mitigations as well.

~~~
fulafel
Re IE11:
[https://caniuse.com/#feat=same-site-cookie-attribute](https://caniuse.com/#feat=same-site-cookie-attribute)
says that IE11 supports it on Windows 10. That still leaves Win7+IE11, but
that's a much smaller (and shrinking) set of users.

------
viraptor
[https://caniuse.com/#feat=same-site-cookie-attribute](https://caniuse.com/#feat=same-site-cookie-attribute)

As usual, supported almost everywhere apart from IE: only IE11 running on an
updated Windows 10 supports it. So we can't really use it unless we want to
leave lots of users insecure.

~~~
jdnenej
"Lots of users"? Unless you are working with enterprise software, I wouldn't
call IE lots of users. Just display a banner for IE users informing them that
using the site with IE is insecure, and suggest alternatives.

~~~
viraptor
In enterprise, IE may be the majority, but even for ordinary users IE is still
often the default. Basically, if you don't target the tech crowd, IE users are
your users. A website I manage (targeted at a single location) has 20% IE
traffic on desktop.

~~~
jdnenej
That is unusual because globally IE makes up almost no users.

------
treggle
Clickbait.

This needs a different title.

This is NOT saying CSRF is dead.

It is saying “same-site cookies are an alternative protection against CSRF”.

------
foxyv
Except when your client uses IE8. _pained crying_

------
zeroimpl
Why is the default lax instead of strict?

~~~
rosybox
Because strict would break how people expect the internet to work.

If a site sets its session cookie with SameSite=Strict, then clicking a link
to that site from a third-party website will not include the session cookie.

For example, if GitHub set SameSite=Strict on its session cookie and you
clicked a link on another site that took you to GitHub, the cookie would not
be sent and it would look like you're not logged in on GitHub, even though if
you opened a new tab and went to GitHub directly you would be logged in.

[https://www.owasp.org/index.php/SameSite](https://www.owasp.org/index.php/SameSite)
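The distinction can be summarized as a rough decision function. This is a simplified sketch, not the browser's actual algorithm (the real rules have more cases: redirects, what exactly counts as "same site", and so on):

```python
# When does a browser attach a cookie, per SameSite policy?
def cookie_sent(samesite: str, cross_site: bool,
                method: str, top_level_navigation: bool) -> bool:
    if not cross_site:
        return True  # same-site requests always include the cookie
    if samesite == "None":
        return True  # sent cross-site (must also be marked Secure)
    if samesite == "Lax":
        # Lax still allows top-level navigations with safe methods,
        # i.e. the "clicking a link to GitHub" case described above
        return top_level_navigation and method == "GET"
    return False     # Strict: never sent on cross-site requests
```

Under this sketch, a cross-site POST (the classic CSRF vector) gets no cookie under either Lax or Strict, which is why Lax is considered strong enough to be the default.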

~~~
zeroimpl
Ah, so Strict is super strict, whereas Lax is still stricter than the current
default.

Lax seems like it should have been named something else, like Partial or
Moderate.

