
Demystifying CORS - andRyanMiller
https://frontendian.co/cors
======
paol
This post never actually explains why any of this was needed. The closest it
comes is this bit:

"If [...] the same-origin policy for XMLHttpRequests relaxed, said services
could now receive a deluge of DELETE, PUT, etc… requests from any origin"

But why on earth is this a problem? And in any case those services _always_
could receive requests from any origin, browsers aren't the only HTTP clients
in the world. In the old days you could simply proxy through a server if you
needed to get around the same-origin policy, and I'm sure lots of people did.

I'd love it if someone could explain why the whole CORS rigmarole was considered
necessary.

~~~
ThePhysicist
CORS was designed to mitigate e.g. the following attack:

* You are logged in to service X, which uses a cookie to store your authentication token.

* Service X offers an endpoint to change your password, as well as an endpoint to retrieve your user account.

* You visit malicious website Y, which uses JS to send requests to the "change password" and "view profile" endpoints of service X (which without CORS would be accepted as the browser sends your authentication cookie with them), getting your profile information and changing your password, thereby taking over your account.

With CORS, for requests from third-party domains the browser first sends an
"OPTIONS" pre-flight request to service X, which then responds with a set
of CORS headers from which the browser can determine whether the domain
you're on (Y) is allowed to send requests to the service. If not, the actual
request (e.g. GET, POST, PUT, DELETE) is never sent. CORS therefore protects the
user from malicious websites while still allowing requests from specific
third-party domains (as there are legitimate use cases for sending API
requests to a third-party website).

Note that CORS does not protect you from user-triggered requests (e.g.
actively submitting an HTML form), for which you need CSRF tokens if you use
cookies for authentication (which are sent automatically with each request).
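
A minimal server-side sketch of that pre-flight check (function name, origins, and the exact header set are hypothetical): given the request's `Origin` header and an allowlist, decide which CORS headers, if any, to send back.

```javascript
// Decide how to answer a pre-flight OPTIONS request: return the CORS
// headers to send, or null to deny (the browser then blocks the real request).
function corsHeadersFor(origin, allowedOrigins) {
  if (!allowedOrigins.includes(origin)) return null;
  return {
    'Access-Control-Allow-Origin': origin, // echo the specific origin back
    'Access-Control-Allow-Methods': 'GET, POST, PUT, DELETE',
    'Access-Control-Allow-Credentials': 'true',
  };
}
```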

~~~
jjnoakes
If you need CSRF tokens for forms, etc, then what does CORS give you above
CSRF tokens?

~~~
ThePhysicist
CSRF tokens are usually only used for state-altering requests (POST, GET
etc.), though one could use them for GET requests as well. The reason people
don't use CSRF for GET is that there's usually no risk involved when
calling a GET endpoint from your browser, as it's not supposed to change the
state of a resource in any way. And since an HTML form submission or included
link will take the user directly to your API, there's no way for the attacker
to extract the information afterwards (which is, of course, possible if the
attacker can make the request asynchronously via JavaScript).

You can also use the "Referer" and "Origin" headers to defend your API against
form submission from third-party websites without using any CSRF tokens (as
browsers will include the URL/domain of the site from which the form was
submitted), though there were numerous cases where browsers or e-mail clients
didn't set these headers correctly, so if you really want to be sure the only
way is to use a CSRF token. You can put that token e.g. in a JS-readable
cookie, which will not be accessible on third-party websites but which you can
read out via JS on your domain and then include in the POST/PUT/... request.
If you want to run your code from different domains as well you will have to
provide an endpoint from which users can get the CSRF token though (as Cookies
from your API domain will not be readable there). You will then need to
restrict that endpoint using CORS, as otherwise an attacker will be able to
get a valid token and e.g. inject that into an HTML form which he/she can then
submit.

So CORS and CSRF do different things, but for API-based apps that should be
able to run on multiple (non-sibling) domains you need both mechanisms to
ensure security (and CSRF needs CORS in that case to function).
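
A rough sketch of the token comparison at the heart of the double-submit pattern described above (function name hypothetical; real implementations should use a constant-time comparison):

```javascript
// The server compares the CSRF token from the JS-readable cookie with the
// copy the page echoed back in a request header. A third-party page can't
// read the cookie, so it can't produce a matching header value.
function csrfTokenValid(cookieToken, headerToken) {
  return Boolean(cookieToken) && cookieToken === headerToken;
}
```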

~~~
fjsolwmv
You meant "POST, PUT, etc"

~~~
ThePhysicist
Yes, GET doesn't make any sense there.

------
valgaze
Alex Hopmann talks about XMLHttpRequest starting as an initiative to port
Exchange/Outlook email to a browser:
[http://www.alexhopmann.com/xmlhttp.htm](http://www.alexhopmann.com/xmlhttp.htm)

XMLHTTP actually began its life out of the Exchange 2000 team. I had joined
Microsoft in November 1996 and moved to Redmond in the spring of 1997 working
initially on some Internet Standards stuff as related to the future of Outlook
[…] I don’t recall exactly when we started working on Outlook Web Access in
Exchange 2000. I think it was about a year after I joined the team, probably
sometime in late 1998 […] we were already a milestone or two into the Exchange
2000 (or “Platinum”) project and had been carefully ignoring the issue of OWA
[Outlook Web Access] mostly because the old version was such a hack. […] The
basic premise of Outlook Web Access was that you could walk up to any computer
that had the browser on it and just get to your email […] The [XMLHttpRequest]
beta shipped and the OWA team was able to start running forward using the beta
IE5,

~~~
teh_klev
Looks like his site is a bit clobbered, there's an archive.is copy here:

[http://archive.is/7i5l](http://archive.is/7i5l)

You'll likely need to do an "inspect element" over the annoying AdChoices bar
on the left and delete its containing div.

Back in 2000-2002 I was lucky to work with a team that leveraged
XMLHttpRequest (it was an internal corporate thing for a customer so we could
get away with only supporting IE) to build single page apps before much of
this stuff had a name.

------
Ajedi32
I've never understood why so many people seem to find this confusing. CORS is
really simple: if you want third-party sites to be able to access the contents
of a given page, you need to send `Access-Control-Allow-Origin: sitename.com`
in your response. If you don't send that header, the browser will default to
the more secure behavior and just deny access. (Yes, it can get more
complicated when you need to allow credentials or access to other HTTP methods
but the concept itself is dead-simple.)

Maybe people are just confused about the same-origin policy in general? (Which
is not the same thing as CORS, but related.) That's more understandable, but
still really easy to explain; if you're signed into a site or have access to a
non-public (corporate LAN) site, you _don't_ want every third-party site you
visit to have unrestricted access to all your personal data on that site.
Therefore browsers, by default, don't allow third-party access to a given
origin; simple as that.
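
A minimal illustration of that exchange (host, path, and origin hypothetical): the browser tags the cross-origin request with an `Origin` header, and the server opts in by echoing a matching `Access-Control-Allow-Origin`:

```http
GET /api/data HTTP/1.1
Host: api.example.com
Origin: https://sitename.com

HTTP/1.1 200 OK
Access-Control-Allow-Origin: https://sitename.com
Content-Type: application/json
```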

~~~
3pt14159
People get confused because it's two requests: first the OPTIONS one, then the
normal one. Their web frameworks are largely built on single-connection
handling, but because CORS isn't used by many people it's often a tacked-on
afterthought. For example, Rails briefly made it impossible to manually handle
OPTIONS requests (something that I fixed).

Then there is the core weirdness of subdomains and security headers in
general. Most developers don't really care about security. They'll do what
they're told and like to do a good job overall, but deep down they don't enjoy
spending time thinking about how to pwn the app they're building. They just
want someone to use what they've built and love it.

~~~
Ajedi32
There aren't two requests; at least not normally.

Preflight requests are only needed if you want to send unusual headers in your
request or use HTTP methods other than GET or POST. See
[https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#Simpl...](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS#Simple_requests)

But you make a good point about developers not caring about security. If I
look at it from that perspective it totally makes sense. If you don't have any
reason to care, CORS headers may just seem like an unnecessary annoyance that
you don't want to bother learning. "Not allowed access? Why? I don't care
about your darned security headers, I just want to make an API request."

~~~
3pt14159
You're right.

I forgot about the simple requests angle because 100% of the requests I make
are non-simple. I need custom headers and JSON Content-Types. Yet again, an
example of why this area is so annoying.
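
The "simple request" rules in question can be sketched roughly like this (simplified from the spec; the function name and example headers are illustrative):

```javascript
// Per the CORS spec, only these methods, headers, and Content-Type values
// avoid a pre-flight. (Simplified: the real rules have a few more cases.)
const SIMPLE_METHODS = ['GET', 'HEAD', 'POST'];
const SIMPLE_HEADERS = ['accept', 'accept-language', 'content-language', 'content-type'];
const SIMPLE_CONTENT_TYPES = ['application/x-www-form-urlencoded', 'multipart/form-data', 'text/plain'];

function needsPreflight(method, headers) {
  if (!SIMPLE_METHODS.includes(method.toUpperCase())) return true;
  for (const [name, value] of Object.entries(headers)) {
    const n = name.toLowerCase();
    if (!SIMPLE_HEADERS.includes(n)) return true;               // custom header
    if (n === 'content-type' && !SIMPLE_CONTENT_TYPES.includes(value)) return true; // e.g. JSON
  }
  return false;
}
```

This is why "I need custom headers and JSON Content-Types" guarantees a pre-flight on every request.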

------
raesene9
Interesting article, although unfortunately it mentions but doesn't currently
cover one of the more misunderstood parts of CORS, which is the
Access-Control-Allow-Credentials part.

The fact that `Access-Control-Allow-Origin: *` doesn't work with
Access-Control-Allow-Credentials, for example, is something I've seen sites get
wrong quite a lot.

There's a good post which covers it: [https://mortoray.com/2014/04/09/allowing-unlimited-access-wi...](https://mortoray.com/2014/04/09/allowing-unlimited-access-with-cors/)

------
matharmin
Where CORS got complicated for me was adding custom request and/or response
headers to a cross-origin GET request. Different browsers handle this
differently. For example, Safari would make a pre-flight request if certain
request headers are sent, while Chrome would just send it. And Chrome would
not make a pre-flight request, and won't populate the "Origin" request header,
but will then drop "unknown" response headers (and doesn't give you any way to
force a pre-flight request or Origin header). I eventually figured out you can
just always respond with `Access-Control-Expose-Headers`, even though it
doesn't look like a CORS request, and then Chrome will expose the headers to
the JS.
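
For what it's worth, the workaround described might look like this on the server (the custom header name is hypothetical): without `Access-Control-Expose-Headers`, browsers only expose a short safelist of response headers to JS.

```http
HTTP/1.1 200 OK
Access-Control-Allow-Origin: *
Access-Control-Expose-Headers: X-Request-Id
X-Request-Id: abc123
Content-Type: application/json
```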

~~~
bzbarsky
That's really odd. There are shared tests for all this stuff, and browsers
should really be doing it identically.

Did you happen to report bugs on this to browsers? If not, do you happen to
have a link to a page that shows the behavior difference?

~~~
matharmin
Nope, never got around to debugging it in depth. Will do that when I have some
time again.

~~~
bzbarsky
Thank you! Even just a page that shows the behavior differences, without much
debugging, would be really helpful; this sort of thing is something browsers
very much aim to have working identically.

------
alexisread
Nice article, though it doesn't cover CORS with NTLM/Kerberos auth (i.e. .NET
on Windows).

To my mind this is the most baffling aspect here, as pre-flight requests
shouldn't carry an auth token (according to the spec), yet IIS auth is
all-or-nothing and will thus reject the pre-flight request unless it is
specifically handled.

Additionally, there are different ways to handle preflighting for owin and
aspnetcore vs .NET framework (IIS hosting), and these methods also change
depending on the technology - MVC, WebAPI, OData.

[edit] Hit return early:

MVC (pure) requests can be handled in the web.config to set custom headers.
WebAPI uses annotations on the controllers. See
[https://stackoverflow.com/questions/29970793/enabling-cors-t...](https://stackoverflow.com/questions/29970793/enabling-cors-through-web-config-vs-webapiconfig-and-controller-attributes)

OData follows a different path, though, so the above won't work. You would
need to modify Application_BeginRequest(). See
[https://stackoverflow.com/questions/31459416/how-to-enable-c...](https://stackoverflow.com/questions/31459416/how-to-enable-cors-via-attribute-instead-of-via-application-beginrequest-when-pr)

The above is all for .NET framework (IIS host). For OWIN you need to modify
HttpListener as per
[https://stackoverflow.com/questions/42104716/owin-preflight-...](https://stackoverflow.com/questions/42104716/owin-preflight-requests-are-not-processed)

For aspnetcore the pipeline again changes:
[https://weblog.west-wind.com/posts/2016/Sep/26/ASPNET-Core-a...](https://weblog.west-wind.com/posts/2016/Sep/26/ASPNET-Core-and-CORS-Gotchas)

Lastly, all of the above should be using xhr.withCredentials = true; on the
client (javascript) side.

Note that this is not present for breeze-odata4 (I'm patching when I have
time) but is present on Jaydata (though it's a shame Jaydata doesn't play
nicely with webpack, only browserify).

------
zackmorris
My biggest frustration with CORS is that none of it is technically needed.

We could have skipped the whole thing if subresource integrity (SRI) had been
implemented much earlier, possibly in HTTP 1.0, which makes its absence one of
the great blunders of the web:

[https://en.wikipedia.org/wiki/Subresource_Integrity](https://en.wikipedia.org/wiki/Subresource_Integrity)
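
For reference, a hypothetical SRI usage sketch (the URL and digest are placeholders): the browser fetches the script and refuses to execute it unless its digest matches the `integrity` attribute.

```html
<!-- "integrity" holds a base64 digest of the expected file contents;
     the browser verifies the fetched file against it before running it. -->
<script src="https://cdn.example.com/jquery-3.6.0.min.js"
        integrity="sha384-PLACEHOLDER_BASE64_DIGEST"
        crossorigin="anonymous"></script>
```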

On top of that, SRI would have allowed us to use standardized versions of
libraries like jQuery and Angular from their home URLs, which would have
reduced the initial load size of single page applications by megabytes via
caching (because those libraries would have already been loaded long ago from
other sites).

Which naturally would have led to content-addressable data, Merkle trees,
etc., which would have given us performance more in line with BitTorrent (at
least for commonly used files/scripts).

On top of that, I'm not completely convinced that CORS avoids IP address
spoofing, DNS poisoning etc. It probably needs to piggyback on HTTPS to be
sure, which is a common way to avoid that can of worms so I can't criticize it
too badly for that.

I should add that SRI has one critical flaw in that it's still vulnerable to
user-posted content (like comments) trying to embed a <script> tag containing
a fake hash in the body. I haven't been deep enough in SRI to know if it's
possible to specify hashes off page, maybe someone knows? Otherwise we may
still need something like CORS, or better yet, something that blocks further
includes altogether rather than conflating the issue with focusing on where
they came from.

------
Alex3917
We really need Authorization added to simple headers. Needing to preflight
every single SPA request either causes a huge performance hit or else forces
super hacky workarounds that introduce their own security issues. Not good for
the web.

------
tootie
I feel like that part at the end is very misleading. GET and POST with simple
headers are still subject to CORS. The distinction between a form submit and
an ajax request is that you are reading the response from JavaScript. Same-origin
policy only kicks in when you attempt to read a response from
JavaScript. It's why the Fetch API has a 'no-cors' mode where you can make cross-
origin requests willy-nilly so long as you don't attempt to read the response
values. The idea is to prevent "secret" background requests being sent by
malicious sites. Form submits are handled by the browser and will incur a page
refresh.

------
jancsika
So if I manually pull up an about:blank page and start screwing around in the
devTools console, what are my options for pulling in junk from, say,
Wikipedia?

Is it only jsonp?

~~~
true_religion
You could disable CORS for your browser.

------
stealthmodeclan
If you set CORS headers, you'll see an OPTIONS request before every request.
You can set a cache header (Access-Control-Max-Age) on the OPTIONS response to
prevent a pre-flight on each request, and thus a potential denial of service
or slow frontend. It's shocking how few developers know this.
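
A sketch of the pre-flight response this describes (origin, methods, and headers hypothetical): `Access-Control-Max-Age` lets the browser cache the pre-flight result, here for 24 hours.

```http
HTTP/1.1 204 No Content
Access-Control-Allow-Origin: https://app.example.com
Access-Control-Allow-Methods: GET, POST, PUT, DELETE
Access-Control-Allow-Headers: Authorization, Content-Type
Access-Control-Max-Age: 86400
```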

------
sonnyblarney
What I find perplexing is that for this and every other web technology, there
doesn't seem to be a central comprehensive resource that explains it all in
great detail.

It's like we have to pick things up as we go along, from tidbits and articles
here and there, none of it complete or comprehensive, and often there'll be
little mistakes or misleading bits.

~~~
zeusly
I think MDN is doing a great job of becoming the go-to resource for
everything: [https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS](https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS)

~~~
sonnyblarney
It's the best, surely, and that article is decent, but there are so many areas
wherein information is lacking or incomplete. One-line examples for entire
modules, or lack of explanation for specific parameters, etc.

It's amazing that Google and Apple rely so much on browser use but they can't
publish comprehensive information on them.

