
Why don't browsers strip cookies when they are doing cross-domain JavaScript fetches?



Lack of focus, despite many years of research, literature, and attempts; interference with problematic techniques that became really popular when the alternatives were lacking, like JSONP before CORS was ready and before CSP was even thought of; and worry about touching parts of the platform that have been essentially unchanged since the beginning, as opposed to the parts that are fairly new and have in turn evolved more quickly.

On the subject of the new SameSite cookie attribute, I wrote a post that summarizes my views [1]; it doesn't make for good quoting, but I briefly recount the history of CSRF and how mainstream awareness of it arrived around 2006-2008, some five years after the first sources that mention mitigating it -- yet a 2008 academic paper credits "(...) Chris Shiflett and Jeremiah Grossman for tirelessly working to educate developers about CSRF attacks (...)" -- Shiflett being the same person who first wrote about this in 2003, and Grossman the one who discovered the flaw in Gmail in 2006.

[1] https://news.ycombinator.com/item?id=13691022


There is a newish cookie attribute called SameSite that does exactly this. Chrome is the only browser that supports it, though.
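
For anyone who hasn't seen it, setting it looks roughly like this (a minimal Node sketch; the cookie name and value are made up):

  const http = require('http');

  // The SameSite attribute asks the browser not to attach this cookie to
  // cross-site requests at all (Strict), or only to attach it on top-level
  // navigations (Lax). The rest is a normal session cookie.
  http.createServer((req, res) => {
    res.setHeader('Set-Cookie',
      'sessionid=abc123; SameSite=Strict; Secure; HttpOnly; Path=/');
    res.end('cookie set');
  }).listen(8000);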


I read about this recently. It's hard to believe these cookies didn't exist until 2016.

The biggest problem solved by cookies has always been sessions, and SameSite is sufficient for most of them. It seems like it should have been the default from the beginning.


Because that's the way the internet works, and changing it means breaking a lot of websites. Web security wasn't thought through carefully when the web was built; it's just a bunch of dirty hacks around the most obvious vulnerabilities.


It would be easy to make sending credentials opt-in in a new HTTP or HTML version. The way it's done now is backwards IMHO.

Define httpsb:// to be like https://, but any site may make AJAX and similar requests to it (without credentials). Then add some kind of exception (like CSRF protection), or fall back to legacy https, for the cases where you do need to send cookies.


But wouldn't an attacker simply use <script src="https://..""> instead of <script src="httpsb://..">?


Only if that is supported by the site being attacked. If the site only accepts httpsb connections, then the attacker would not have a way in.


If the site can accept httpsb, it might as well support the Origin header [0], and the problem is solved.

[0]: https://wiki.mozilla.org/Security/Origin
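
Roughly the kind of server-side check that proposal describes, as a sketch (the allowlist is made up for illustration):

  const http = require('http');

  // Reject state-changing requests whose Origin header is missing or not
  // on the allowlist; the browser sets Origin itself, so a third-party
  // page can't forge it.
  const ALLOWED_ORIGINS = new Set(['https://example.com']);

  http.createServer((req, res) => {
    const origin = req.headers.origin;
    if (req.method !== 'GET' && (!origin || !ALLOWED_ORIGINS.has(origin))) {
      res.writeHead(403);
      return res.end('cross-origin request rejected');
    }
    res.end('ok');
  }).listen(8000);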


The whole point is to allow any site to access any other site, just like plain TCP sockets, without stealing your cookies.

If the site wants to access google.com with its own cookies, fine, why not?


Could you elaborate on the "stealing your cookies" part?

Cookies are sent only to the origin that set them and (except in XSS attacks) are not revealed to anyone else. So who exactly is stealing them?


Well, currently, nobody. But currently, the web is completely broken.

If you want web applications to be powerful and open, you also need to be able to have any web application access any URL.

Why should only mail.google.com be able to access my emails, and not also my-little-opensource-webmail.com?

To facilitate that, without also adding cookie stealing back in, you need to allow any website to open standard TCP sockets.


I proposed a header instead of a protocol, btw:

https://medium.com/@homakov/request-for-a-new-header-state-o...


Sounds good, but I suspect it will meet the same fate as XHTML 2: designed to be clean and perfect, but in reality it would take too much effort to implement and maintain.

From your professional experience you can probably tell that people would rather have a slightly insecure site that works and makes a profit than one that breaks because SOTA started including some new feature you didn't know about...

People would rather enable these individual headers one by one and see their effect. In HTTP/2, headers are compressed, so it's not a big deal (apart from looking ugly).


> SOTA started including some new feature you didn't know about

If you sign up for version 2, changes in version 3 would not break you. And the point is that MANY things right now could safely be turned on for 99.99% of sites, e.g. XFO. So, not much effort.
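
For reference, turning a few of those on individually today looks something like this (a Node sketch, not the proposed versioned header):

  const http = require('http');

  // The kind of headers that are safe for nearly every site, set one by
  // one -- this is what a versioned "state of the art" header would bundle.
  http.createServer((req, res) => {
    res.setHeader('X-Frame-Options', 'SAMEORIGIN');      // XFO: no framing by other sites
    res.setHeader('X-Content-Type-Options', 'nosniff');  // no MIME sniffing
    res.setHeader('X-XSS-Protection', '1; mode=block');  // legacy XSS filter
    res.end('hello');
  }).listen(8000);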


I guess this is what you get if you let an advertising company define the web.


Because then they end up with a bug in how they do it, and oops.

When developing web applications, you must approach this from the perspective of "what is the oldest, least-secure, most bug-riddled pile of C++ and plugins someone could try to hit this with".

If you want an example of why this has to be the approach, well... six years ago the Django security team got an email from the Rails security team. Turned out something we'd both done in our CSRF protection systems didn't actually work. Protecting against CSRF while allowing XMLHttpRequest (remember this is 2011!) is kind of tricky, and the standard approach was one adopted by a lot of JavaScript toolkits: they'd set a consistent custom header (X-Requested-With) on the request. And since browsers only allowed that to be done on requests which obeyed the same-origin sandbox, it was reliable: you knew if you saw that header, it was an XMLHttpRequest that a browser had vetted for same-origin safety (or that it was someone faking a request outside of a browser, but that's not a CSRF vector).
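
Roughly, that server-side check looked something like this (a sketch, not Django's or Rails' actual code):

  const http = require('http');

  // Pre-2011 convention: trust the toolkit-set X-Requested-With header as
  // proof the request came from a same-origin XMLHttpRequest, since
  // browsers wouldn't let a cross-origin page set custom headers... until
  // the Flash + redirect bug described below broke that assumption.
  http.createServer((req, res) => {
    const safeMethod = ['GET', 'HEAD', 'OPTIONS'].includes(req.method);
    const viaXhr = req.headers['x-requested-with'] === 'XMLHttpRequest';
    if (!safeMethod && !viaXhr /* and no CSRF token either */) {
      res.writeHead(403);
      return res.end('CSRF check failed');
    }
    res.end('ok');
  }).listen(8000);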

And then it turned out that thanks to a bug in Flash plus the way browsers handled a certain obscure HTTP status code, you could actually set that header on a request to any domain. Oops, that's a complete CSRF bypass in Rails, Django and I don't even remember how many other things.

That's how we learned that particular lesson about trusting browsers to do the right thing, and I don't see myself ever trusting browser security like that again.


I'd say it's mostly because of advertising, but also a lot of similar tech (which is usually ad-supported), like Disqus.

It's interesting that today cross-domain sandboxing applies to almost everything except JavaScript itself. If I load an image cross-domain and draw it into a canvas, the contents of that canvas are sandboxed, but I can cheerfully mix and match code across domains.
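
A quick illustration of that asymmetry (the image URL is a placeholder):

  // Drawing a cross-origin image works, but the canvas becomes "tainted"
  // and refuses to give its pixels back -- while a cross-origin <script>
  // would run with full access to the page.
  const img = new Image();
  img.src = 'https://other-site.example/photo.png';
  img.onload = () => {
    const canvas = document.createElement('canvas');
    canvas.width = img.width;
    canvas.height = img.height;
    canvas.getContext('2d').drawImage(img, 0, 0);
    try {
      canvas.toDataURL(); // throws SecurityError: the canvas is tainted
    } catch (e) {
      console.log('read blocked:', e.name);
    }
  };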

Seems like it would be a good thing to do but it would break a ton of stuff.


Having advertisers not track you seems like a benefit, not a con.


I agree, but of the four major browsers, two are directly underwritten by advertising (Chrome and Mozilla) and Microsoft is moving that way.

Only Apple has backed off advertising as a revenue source, so it basically comes down to Apple being willing to cause massive breakage (the way it did with Flash) in pursuit of a principle. The fact that they enabled ad blockers in mobile Safari says they are at least sympathetic to the idea.


Did you mean Mozilla is an ad-driven company?


A very large percentage of Mozilla's revenue comes from search engines (recently Yahoo, previously Google) who pay Mozilla to make themselves the default search engine on Firefox. If Firefox users saw no ads and were untrackable, Yahoo would have no reason to pay anymore.

Of course Mozilla doesn't try to force everyone to stick to the defaults, so you're free to change the default search engine and install a bunch of ad-blocking, anti-tracking add-ons.


A benefit for us; a con for those developing or sponsoring the browsers we use.


Isn't that what Safari does with the "Allow from current website only" setting? It defaults to "Allow from websites I visit", which means that only embedded content from sites you've visited before gets its cookies, not random new embeds.


Interesting. Does that mean that trackers like DoubleClick don't work on Safari with the default settings?


Wasn't there news a while ago that advertisers were using a hack to bypass this protection in Safari, and that it caused a bit of an uproar?


It's not the same thing, but don't httpOnly cookies kind of serve the same purpose? JS can't read these cookies at all, right?


JS can't (that protects against stealing the token), but the server still receives the cookie even when the request originates from a foreign domain. That's the gist of CSRF [0].

[0]: https://en.wikipedia.org/wiki/Cross-site_request_forgery
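
The classic illustration, with made-up site and endpoint names: a page you don't control makes your browser send the request, and the browser helpfully attaches the cookies, HttpOnly or not.

  // Running on attacker.example: it can't read bank.example's cookie,
  // but it can still make the victim's browser send it.
  const form = document.createElement('form');
  form.method = 'POST';
  form.action = 'https://bank.example/transfer';
  const amount = document.createElement('input');
  amount.name = 'amount';
  amount.value = '1000';
  form.appendChild(amount);
  document.body.appendChild(form);
  form.submit(); // cookies for bank.example go along for the ride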


Because not all websites can federate their users


Can I turn this off?



