
Cross-Site Request Forgery is dead - edward
https://scotthelme.co.uk/csrf-is-dead/
======
dfabulich
The "SameSite" cookie attribute is only supported in Chrome, but you can vote
for it in other browsers' issue trackers.

Firefox has an open bug
[https://bugzilla.mozilla.org/show_bug.cgi?id=795346](https://bugzilla.mozilla.org/show_bug.cgi?id=795346)

Microsoft does, too [https://wpdev.uservoice.com/forums/257854-microsoft-edge-
dev...](https://wpdev.uservoice.com/forums/257854-microsoft-edge-
developer/suggestions/17140412-support-samesite-cookie-option)

And so does WebKit
[https://bugs.webkit.org/show_bug.cgi?id=159464](https://bugs.webkit.org/show_bug.cgi?id=159464)

WebKit allows voting, too, by filing duplicate issues in Apple's private
"Radar" issue tracker.

Which is to say, the WebKit bug is already filed in Radar as
rdar://problem/27196358

Apple has said publicly that if you want to "vote" for a given Radar issue,
you should file duplicates for that Radar. (I find that weird, but that's the
way they do it.) To do that, go here:
[https://bugreport.apple.com/](https://bugreport.apple.com/)

You can copy and paste the data from OpenRadar, a community tool where people
share Radar issues that they want people to be able to search for and/or
duplicate.
[https://openradar.appspot.com/radar?id=4963174633701376](https://openradar.appspot.com/radar?id=4963174633701376)

Be sure to mention in the bug description that you're filing a duplicate of
rdar://problem/27196358.
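For reference, the attribute all of these tickets ask for is a single extra directive on the Set-Cookie header. A minimal sketch in Node (the helper and the cookie name/value are illustrative, not from the article):

```javascript
// Build a Set-Cookie header value carrying the SameSite attribute.
// 'Strict' and 'Lax' are the two values the draft defines.
function sameSiteCookie(name, value, mode) {
  if (mode !== 'Strict' && mode !== 'Lax') {
    throw new Error('unsupported SameSite mode: ' + mode);
  }
  return name + '=' + value + '; Secure; HttpOnly; SameSite=' + mode;
}
```

Browsers without support simply ignore the unknown attribute, which is why the feature degrades to ordinary cookie behavior everywhere else.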

EDIT: And while you're in there voting for browser security features, consider
voting for Subresource Integrity on Apple WebKit and Microsoft Edge.

[https://openradar.appspot.com/radar?id=4980317458792448](https://openradar.appspot.com/radar?id=4980317458792448)

[https://wpdev.uservoice.com/forums/257854-microsoft-edge-
dev...](https://wpdev.uservoice.com/forums/257854-microsoft-edge-
developer/suggestions/6263699-subresource-integrity)

~~~
derefr
> Apple has said publicly that if you want to "vote" for a given Radar issue,
> you should file duplicates for that Radar. (I find that weird, but that's
> the way they do it.)

Given that Radar is a private store with a public write-only channel (bug
report submissions), the only way it could work for non-Apple-employees to
vote for something is to request that they describe it again themselves and
then merge all the duplicates on the Apple-private side.

Not saying that Radar being private is not _itself_ kind of weird, but the
submission policy necessarily follows from that.

~~~
mitchty
Radar sucks, yes, but that's why Open Radar got created: to help coordinate
exactly this kind of thing.

[https://openradar.appspot.com/page/1](https://openradar.appspot.com/page/1)

------
akamaka
Not supported on most browsers, as opposed to what the headline suggests:
[http://caniuse.com/#feat=same-site-cookie-
attribute](http://caniuse.com/#feat=same-site-cookie-attribute)

~~~
gruez
Depends what he means by "most". Depending on what source you look at, Chrome
has the majority market share, so even if it's the only one to implement it,
it still counts as "most".

~~~
molyss
most : the majority of, nearly all of.

"most browsers" would become "the majority of browsers". If there were 5
browsers, "the majority of 5 browsers" would be "3 browsers or more". It
doesn't make any difference which ones are most used, the sentence (apparently
now changed or removed) wasn't "most of the requests will be protected", or
even "most users will be protected"

~~~
CocaKoala
At the same time, though, you can pretty easily make the argument that the
majority of browsers is Chrome. After all, if there were five internet
browsers and four of them had one user each, it would be a bit silly to say
"Well, most browsers haven't implemented this feature".

~~~
ptx
But if "the majority of browsers" means "Chrome" (following your reasoning) it
would be more than a little silly to ever use that comparatively convoluted
phrase. You would just say "Chrome".

------
homakov
Clickjacking is dead, because we have X-Frame-Options! (said no one ever).

An opt-in solution is not a solution. But it's still useful.

~~~
tptacek
XFO is a pretty solid response to "clickjacking", which is nowhere near as
prevalent as CSRF as a result. If you can trivially mitigate a vulnerability
from a single location in your code, that vulnerability doesn't have much life
left in it.

~~~
homakov
Hm, okay, except 90% of my clients still have it :D

~~~
tptacek
Because they don't enable XFO? Do they think they need arbitrary frames?

~~~
homakov
Nope, because they, their frameworks, or something in between have no XFO.
Which shows it's been ineffective, even after so many years.

~~~
tptacek
What framework support are we talking about? Isn't the beauty of XFO as a
countermeasure that it doesn't require framework support? Unlike CSRF
protection, you could add XFO in an nginx conf if you really wanted.

~~~
homakov
Last time I checked, Express.

It is trivial to add, _as long as you remember to do it_.

Actually, it's silly that nginx and Apache don't send those headers by default.

------
skywhopper
Sounds like a great tool. But saying "CSRF is dead" is a sure sign you aren't
taking security problems seriously enough. The post itself describes how the
feature has built-in self-weakening features. So CSRF is dead... so long as
you use this feature on all appropriate cookies, work around it only
sparingly, and meanwhile keep in mind the very common use cases where it badly
breaks expected behavior in a way that will encourage workarounds that
reinstate the CSRF risk. But totally dead.

------
tprynn
Other comments have mentioned the obvious issue that you can't use this
feature across multiple browsers. So you still need CSRF tokens, and the title
is just wrong.

He also mentions checking the Origin/Referer header. I would strongly
recommend against this strategy; as he says, it doesn't work everywhere.
Specifically, regular form submissions will not include the Origin header in
most browsers, and the Referer header is simply not reliable.

More importantly, using multiple strategies for CSRF protection is bad. You
need to fall back on tokens anyway, so the "check origin first" method is
basically just an extra bypass for attackers to abuse. Two checks in this case
are significantly worse than one, because if either is broken you are
insecure.

------
edgartaor
If I understand correctly: if the user is not using a browser that respects
this new policy, an attack can occur. Even if 99% of my users have a browser
that supports SameSite cookies, 1% of users are still vulnerable. A bank app
cannot afford that kind of risk.

So CSRF is not dead after all.

edit: Typo

~~~
naasking
I don't know; at some point, not supporting this feature would be equivalent
to running a browser with a flawed TLS implementation. The burden is on the
user to keep their software up to date.

~~~
edgartaor
Yep.

> at some point

In the meantime, I think it's better to mitigate this kind of danger at the
server side.

------
jondubois
If your app is a Single Page App and it doesn't rely on HTTP/REST (e.g. it
uses WebSockets instead), then you can easily use localStorage instead of
cookies for holding session IDs/JWTs on the client side (adding your own
logic to pass them to the server). That's another way to sidestep the issue,
and it works in all modern browsers.

It makes a case for companies to use WebSocket-based APIs.

Also, localStorage works better on mobile when using frameworks like Cordova,
React Native, PhoneGap and others. That is because your .html files are
usually sitting on the mobile device itself, so the local domain for the file
won't match the one where your REST API/backend is hosted; cookies won't be
sent and don't work.
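A sketch of that pattern (the storage object is injected so the logic can be shown outside a browser; in a real page it would be `window.localStorage`, and the key and message shape are illustrative):

```javascript
// Attach the stored session token to every outgoing WebSocket message
// explicitly, instead of relying on cookies being sent automatically.
function authenticatedMessage(storage, type, payload) {
  const token = storage.getItem('sessionToken');
  if (!token) {
    throw new Error('no session token; log in first');
  }
  return JSON.stringify({ token, type, payload });
}
```

Because nothing is attached automatically, a cross-site page cannot make the browser include the token, which is the CSRF-sidestepping property described above.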

~~~
chncdcksn
The big issue with using localStorage for authentication token storage is that
values are accessible to any JS from the same origin as the JS that set the
value. Example: I bundle my JS and its dependencies into a single file
(bundle.js) using webpack. One of the dependencies in the bundle has malicious
JS that sends values in localStorage to a remote server, or uses the
authentication token to make requests impersonating the user.

~~~
admax88q
If you're shipping malicious JS on your origin, you're fucked anyways.

~~~
jondubois
Exactly, they can still make requests to your back end by hijacking your
session with your cookie; even if you use the httpOnly flag.

The only advantage of the cookie (with httpOnly) in this scenario is that the
malicious code can't access your session ID and use it later (but it can
still hijack the session in-place without knowing what your session ID is)...

Since sessions expire anyway, there is a sense of urgency; because of this, an
effective XSS attack would typically be carried out in-place on the page
(while the session is active). So in practice, there is very little added
security value in the cookie approach.

In my opinion, XSS mitigation is the last line of defence.

------
AnimalMuppet
There's a big difference between _mostly dead_ and _all dead_.

~~~
overcast
With all dead, well, with all dead there's usually only one thing you can do.
Go through his clothes and look for loose change.

------
marco1
There's no news here, though, right?

The initial RFC draft had been submitted in April 2015 [1]. It was updated
multiple times [2] but eventually expired in December 2016 [3], unfortunately.

Luckily, that draft has been revived this month and updated again today [4],
so that's probably the only news.

For people who need immediate support in PHP, I published a small library last
June [5].

Still, the major problem today is the lack of browser support. Chrome (desktop
and Android), Opera (desktop and Android) and the Android WebView have long
supported this attribute, but Mozilla, Apple and Microsoft did not ship this
(yet).

[1] [https://tools.ietf.org/html/draft-west-first-party-
cookies-0...](https://tools.ietf.org/html/draft-west-first-party-cookies-00)

[2] [https://tools.ietf.org/html/draft-west-first-party-
cookies-0...](https://tools.ietf.org/html/draft-west-first-party-cookies-07)

[3] [https://tools.ietf.org/html/draft-ietf-httpbis-cookie-
same-s...](https://tools.ietf.org/html/draft-ietf-httpbis-cookie-same-site-00)

[4] [http://httpwg.org/http-extensions/draft-ietf-httpbis-
cookie-...](http://httpwg.org/http-extensions/draft-ietf-httpbis-cookie-same-
site.html)

[5] [https://github.com/delight-im/PHP-Cookie](https://github.com/delight-
im/PHP-Cookie)

------
jwatte
SameSite works if I use the same domain for my API as for my CDN and don't
integrate third-party APIs.

It doesn't seem to actually solve the real security problem with the web,
which is that people don't know how it works, and there is no working security
model that matches the physical security people do understand.

------
raesene6
This is a handy defence-in-depth mechanism, somewhat like the other cookie
flags (httpOnly and secure).

Once browser support gets a bit better, it would seem like a good idea to
start making use of it on that basis, but I don't think I'd ever rely on it as
the only form of CSRF protection on a site...

------
pmontra
This guy explains how to add that to a Rails application
[https://gist.github.com/will/05cb64dc343296dec4d58b1abbab7aa...](https://gist.github.com/will/05cb64dc343296dec4d58b1abbab7aaf)

------
LukaAl
Help me understand: Strict means I break the basic way in which the web
works, links? That seems like a useful feature to me (ironic!).

I get that in certain cases it is a good idea to be strict (e.g. banking,
e-commerce), but that is a relatively small percentage of the web. And the Lax
policy doesn't solve the issue because, well, someone could always screw up
and forget to make sure a critical form only accepts submissions via POST.

Don't get me wrong, it is a useful and powerful tool. What I'm simply saying
is that CSRF is paraphrasing Mark Twain right now: "The reports of my death
are greatly exaggerated".

P.S.: also, the other comments are right...

~~~
Cogito
Strict doesn't break links; it breaks sites that set "expected" cookies to
Strict. Those cookies won't be available in some situations where they might
have been expected, such as navigating to the site directly.

So if news.ycombinator.com cookies were set to strict, they will not be sent
when I open the site, or indeed at all unless I was navigating from within the
site (thus stopping cross origin attacks, but also stopping them being sent on
first page load). If these cookies were used to identify a logged in user, the
user will not be logged in as the cookie would not be sent. The link still
works, but the behaviour is perhaps unexpected.

One solution is to have a trusted, Strict cookie that is required for any
actions that originate from the origin site - upvoting, new posts, comments,
etc. You then have a second, untrusted, non-Strict cookie that is used to
identify the user. As long as this second cookie is not used for any trusted
operations, you have restricted the potential attack surface a lot.

None of this breaks links, it only breaks user experience expectations if
utilised naively (like hacker news using strict cookies for their persistent
login tokens).
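That two-cookie scheme can be sketched as a pair of Set-Cookie values (the cookie names and the helper are hypothetical):

```javascript
// The trusted cookie (Strict) gates state-changing actions; the display
// cookie (Lax) only identifies the user on first navigation.
function twoCookieSession(trustedId, displayId) {
  return [
    'trusted_session=' + trustedId + '; Secure; HttpOnly; SameSite=Strict',
    'display_session=' + displayId + '; Secure; HttpOnly; SameSite=Lax',
  ];
}
```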

~~~
LukaAl
Well, from a user perspective it breaks links meaning that they work in an
unexpected way. For sure it is a much-needed improvement, but it is not a fix
it all feature.

~~~
Cogito
The feature itself doesn't break links in that way at all.

If, and only if,

(1) a website sets a cookie to Strict

(2) the website assumes that cookie (1) is always available for 'valid
requests'

(3) the website assumes that 'valid requests' (2) include top-level navigation
events (such as typing the website's url into the browser bar)

then any behaviour that relies on the existence of that cookie will break in
some situations.

That is, only a naive use of this feature will cause breakage. Worth knowing
about for developers, but in an ideal world it would never cause issues for
users.

In an ideal world there would be a solution without this shortfall, but it
seems like an almost necessary feature due to how some cross origin requests
are made (by spawning a new window etc).

------
aarreedd
I do not see same-site cookies as a great solution. If you use a framework
like Laravel, CSRF tokens are set up out of the box. I cannot imagine it being
any simpler.

The two-cookie solution needed to fix problems with `SameSite=Strict` is more
complicated than just using CSRF tokens.

And the `SameSite=Lax` solution creates a new way for developers to screw up.
The Lax setting gives you no CSRF protection on GET requests, and it is too
easy to accidentally accept GET requests on a critical form that should be
POST-only.
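One hedge against that mistake is a guard that refuses safe methods on state-changing handlers outright. A sketch (the helper is hypothetical):

```javascript
// With SameSite=Lax, top-level GET navigations still carry the cookie,
// so any state change reachable via GET remains forgeable. Fail loudly
// rather than accept it.
function assertUnsafeMethod(method) {
  const safeMethods = ['GET', 'HEAD', 'OPTIONS'];
  if (safeMethods.includes(method.toUpperCase())) {
    throw new Error('state-changing handler called with safe method ' + method);
  }
}
```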

~~~
Cogito
I agree that it's still too easy to screw this up using Lax same-site cookies.

I do think that using them provides some additional defence in depth, and
specifically provides uses that CSRF tokens can't. These are listed under
'additional uses' in the post, and essentially boil down to the fact that the
cookies are not sent at all.

In the wild, this would help today with any timing attacks looking to expose
info based on whether a cookie is included in the request.

------
andersonmvd
Not supported in most browsers, as said elsewhere in this thread, but there
are many other mitigation strategies that do work. Here's a complete writeup:
[https://dadario.com.br/what-is-csrf/](https://dadario.com.br/what-is-csrf/)

------
deweller
This seems so obvious. Is that just due to hindsight? Why wasn't this
implemented 3 years ago?

~~~
bitexploder
As I mused elsewhere in this thread: why aren't Secure, HttpOnly and SameSite
the defaults for cookies, so that you have to take action to disable them?
Gravity and backwards compatibility, mostly.

------
Cozumel
Wouldn't an attacker just use an older browser that doesn't support 'SameSite'
to launch a CSRF attack?

I feel like I'm missing something but relying on the browser to protect your
site is leaving yourself wide open.

~~~
Aaron1011
> Wouldn't an attacker just use an older browser that doesn't support
> 'SameSite' to launch a CSRF attack?

With a CSRF attack, it's the victim's browser performing the request - which
the attacker doesn't have control over.

> I feel like I'm missing something but relying on the browser to protect your
> site is leaving yourself wide open.

Sites can add the 'SameSite' attribute in addition to whatever CSRF
mitigation measures they already use.

------
stevecalifornia
It's too bad SameSite wasn't built many, many years ago. It's good that we are
moving there now. However, it'll be many years before browser adoption will
allow us to make use of this.

------
Eridrus
Sane web frameworks already solve this in some way, as long as you don't
mutate any sensitive state on non-POST requests, so this is really a solution
in search of a problem.

~~~
pg314
Many frameworks have had, and probably still have, bugs in their CSRF
handling. E.g. a quick scan of the Django security issues [1] shows 5 issues
involving CSRF, the latest one in September 2016.

[1]
[https://docs.djangoproject.com/en/1.10/releases/security/](https://docs.djangoproject.com/en/1.10/releases/security/)

~~~
Eridrus
Sure, the fact that browser vendors have better security sense than framework
developers is a plus, but it's not going to be an improvement on the rest.

Most people aren't getting hacked via an obscure Google Analytics + Django
CSRF interaction. Most people aren't getting hacked via client-side webapp
vulns of any sort anyway.

------
niftich
This is an old idea that is finally coming to fruition.

See a Microsoft paper from 2011 that prominently features cookie isolation
[1]. Or a 2012 proposal in Mozilla's bug tracker [2], which resulted in some
early proof-of-concept code and a 2013 writeup hosted on GitHub [3]; it was
blogged about independently and contemporaneously by others [4], which hit HN
[5].

I've always taken a dim view of cross-domain requests in general [6] and of
the sprawling set of specifications (like most security headers) we developers
have to learn and implement properly to stay one step ahead [7]. I'm not
particularly enthused that this is opt-in instead of a heavy-handed mandate
like some other recently-introduced features, and the default opt-in is the
more secure but essentially session-destroying version, meaning it's
guaranteed to encourage a long and impassioned debate about whether Strict or
Lax is the preferred balance.

It's fascinating to go way back to ~2006-2008 and read about when CSRF was
first starting to be recognized by mainstream evangelists, commentators,
developers and decision-makers as a problem instead of a feature of just how
the web works.

This article on DarkReading from 2006 [8] was soon after cited by the OWASP
wiki [9]; Jeff Atwood first wrote about it in 2008 [10] and admitted that its
subtlety and seriousness took him by surprise. Yet it's amusing that
going back to 2003 you can find references to CSRF by that name and
instructions on how to protect against it [11] -- the author of the 2003
article, Chris Shiflett, is credited in the announcement about the 2008 Felten
& Zeller paper [12]: _" On the industry side, I'd like to especially thank
Chris Shiflett and Jeremiah Grossman for tirelessly working to educate
developers about CSRF attacks."_

[1] [https://www.microsoft.com/en-
us/research/publication/atlanti...](https://www.microsoft.com/en-
us/research/publication/atlantis-robust-extensible-execution-environments-for-
web-applications/) [2]
[https://bugzilla.mozilla.org/show_bug.cgi?id=795346](https://bugzilla.mozilla.org/show_bug.cgi?id=795346)
[3] [https://github.com/mozmark/SameDomain-
cookies/blob/master/sa...](https://github.com/mozmark/SameDomain-
cookies/blob/master/samedomain.txt) [4]
[http://homakov.blogspot.com/2013/02/rethinking-cookies-
origi...](http://homakov.blogspot.com/2013/02/rethinking-cookies-
originonly.html) [5]
[https://news.ycombinator.com/item?id=5183460](https://news.ycombinator.com/item?id=5183460)
[6]
[https://hn.algolia.com/?query=niftich%20crossdomain&type=com...](https://hn.algolia.com/?query=niftich%20crossdomain&type=comment)
[7]
[https://hn.algolia.com/?query=niftich%20another%20damn%20hea...](https://hn.algolia.com/?query=niftich%20another%20damn%20header&type=comment)
[8] [http://www.darkreading.com/risk/csrf-vulnerability-a-
sleepin...](http://www.darkreading.com/risk/csrf-vulnerability-a-sleeping-
giant/d/d-id/1128371) [9] [https://www.owasp.org/index.php?title=Cross-
Site_Request_For...](https://www.owasp.org/index.php?title=Cross-
Site_Request_Forgery_\(CSRF\)&direction=next&oldid=10835) [10]
[https://blog.codinghorror.com/cross-site-request-
forgeries-a...](https://blog.codinghorror.com/cross-site-request-forgeries-
and-you/) [11] [http://shiflett.org/articles/foiling-cross-site-
attacks](http://shiflett.org/articles/foiling-cross-site-attacks) [12]
[http://freedom-to-tinker.com/2008/09/29/popular-websites-
vul...](http://freedom-to-tinker.com/2008/09/29/popular-websites-vulnerable-
cross-site-request-forgery-attacks/)

------
SeriousM
Ha, that's a broad statement. I wouldn't trust the browser's handling of
cookies. And I wouldn't trust the browser's "private" mode either... Ever
heard of evercookie?

~~~
mankyd
evercookie isn't relevant here.

The feature is for site owners to put additional security/trust restrictions
on their cookies.

