

Nginx JSON hacks - Kenan
http://www.gabrielweinberg.com/blog/2011/07/nginx-json-hacks.html

======
tlrobinson
I'm not sure I understand how some of these are being used, specifically
whether they're being used safely.

API keys usually exist for preventing abuse. By exposing an authenticated API
proxy you're allowing anyone to abuse someone else's API using your key,
likely leading to them banning you. At a minimum you should implement your own
rate limiting.
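Something like nginx's limit_req module would do; a minimal sketch (the zone
name, rate, and location here are placeholders, not anything from the article):

    # http{} block: shared-memory zone keyed on client IP, 5 requests/second
    limit_req_zone $binary_remote_addr zone=apiproxy:10m rate=5r/s;

    # in the proxied location: queue brief bursts, reject the rest with 503
    location /ext_api/ {
      limit_req zone=apiproxy burst=10;
      proxy_pass http://api.external.com/;
    }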

Likewise, with a JSONP proxy you're allowing other sites to circumvent the
browsers' same origin policy to access that API, which could also lead to
abuse. At a minimum you should restrict requests to ones with a recognized
HTTP Referer header.
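For example, nginx's valid_referers directive can do this (the domain names
and location are placeholders):

    location /jsonp_proxy/ {
      # only answer requests whose Referer matches your own site(s);
      # requests carrying no Referer at all are rejected too
      valid_referers example.com *.example.com;
      if ($invalid_referer) {
        return 403;
      }
      proxy_pass http://api.external.com/;
    }

(The Referer header is trivial to forge outside a browser, but it does stop
other sites from simply embedding your endpoint in a page.)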

Of course with JSONP you also need to trust that the API isn't doing anything
malicious, like injecting cookie-stealing or other data-stealing JavaScript
instead of valid JSON. It would be a good idea to validate that the response
is indeed JSON before passing it back to the clients (actually a properly
restricted, sanitizing JSONP proxy would be a good idea even if the API
already provides JSONP).

So use these techniques cautiously...

~~~
gnubardt
_API keys usually exist for preventing abuse. By exposing an authenticated API
proxy you're allowing anyone to abuse someone else's API using your key,
likely leading to them banning you. At a minimum you should implement your own
rate limiting._

The article describes how to set up caching in nginx, so not every request to
nginx goes to the external service.
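Roughly like this (zone name, path, and times are placeholders, not the
article's exact config):

    # http{} block: where cached responses live and how large the cache may grow
    proxy_cache_path /var/cache/nginx/api levels=1:2 keys_zone=api_cache:10m
                     max_size=100m inactive=60m;

    location /ext_api/ {
      proxy_cache api_cache;
      proxy_cache_valid 200 10m;   # reuse successful responses for 10 minutes
      proxy_pass http://api.external.com/;
    }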

~~~
tdfx
The potential for abuse remains.

~~~
RyanKearney
Then don't use it. As someone who has never had much experience dealing with
Nginx, I found this article to be very informative. Most people are smart
enough not to make their third-party APIs public; those who aren't will
quickly learn about the problems this can cause. However, there are some cases
where this could be extremely useful, such as an intranet.

~~~
tdfx
I agree the article is very informative and taught me a number of things I
didn't know about nginx. Unfortunately, letting people learn the hard way
tends not to work out well when it comes to security.

------
grourk
If you need to proxy through nginx on your own server to pad some external
API's JSON (to enable JSONP), wouldn't you be able to put that proxy on your
own domain, obviating the need for cross-domain JSONP entirely?
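I.e. something as simple as this (the path and upstream are placeholders):

    # same-origin path on your own domain: plain XHR works, no JSONP padding needed
    location /external_api/ {
      proxy_pass http://api.external.com/;
    }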

------
cdcarter
Great article on how to get Nginx to do JSONP for you, but I question whether
setting up a proxy to a paid API with your client key embedded is a good
idea... Seems easy to abuse.

~~~
njharman
I often have to create a service/api to serve (and cache) 3rd party content
cause of rate limiting. a few servers much < than 10,000's of clients. This
makes a few days of work/testing into a few 10's of minutes.

------
riffraff
Couldn't the last trick be improved? Since you have access to params, you
could possibly use those, as in

    
    
        location ^~ /ext_api3/ {
          # assuming the echo module interpolates $arg_callback here
          echo_before_body "$arg_callback(";
          proxy_pass http://api.external.com/;
          echo_after_body ');';
        }

~~~
tlrobinson
Indeed, the example code isn't actually JSONP, since it hardcodes the
callback.

~~~
jordansjones
Which I would guess works for his purposes.

------
olliej
JSONP has numerous problems. First, it's not JSON: the object literal is
created by executing JS, so you're back to square one with validating your
input. The second and more significant problem is that if you ever use JSONP
to transmit private data, you've essentially lost the cross-origin protection
of that data in the browser.

Now all an attacker needs to do is get one of your users with an active
session to load a page that does <script
src="usersprivatedata.jsonp"></script> and they can gather that data.

~~~
jhaglund
I was doing something similar to this article (feeding JSON directly to JS
functions) with PHP. I think this is better for the browser; server side, it's
heavier:

    
    
        <?php
        // Inject JSON: this puts each JSON string in a hidden div,
        // then the JS reads the div's innerText and parses it as JSON,
        // so there's a global JS object with all the JSON data,
        // named after the $jsonInject key.
        if (isset($jsonInject) && count($jsonInject) > 0) {
          foreach ($jsonInject as $i => $json) {
            echo "<div style='display:none;' id='jsonInject_{$i}'>{$json}</div>";
            echo "<script>var {$i} = JSON.parse( $('#jsonInject_{$i}')[0].innerText );</script>";
          }
        }
        ?>
    

Fill the associative $jsonInject array with JSON strings in your controller;
this snippet then goes in your templater. It makes a hidden div and puts the
JSON string in there; jQuery then pulls the innerText value, which gets parsed
as JSON and stored in a JS var. Depending on use, I'll sometimes just pass the
result of JSON.parse to an initialize method.

I like the proxy caching. I think Apache's mod_proxy and mod_cache could do
this, but with more config (and more resources per request).

------
bartoszpietrzak
If anybody is concerned about API rate limiting, here's a nifty solution.
http://codetunes.com/2011/07/26/outbound-api-rate-limits-the-nginx-way

------
dreamdu5t
Use node.js instead!

~~~
pjscott
It's true that everything in this article could have been done with node.js,
but the nginx configuration is pretty trivial, and nginx is _good_ software.
It's stable, it's plenty fast, and you don't have to worry about a new release
breaking everything.

(I have nothing against node, mind you. I've used it for something similar to
Google Wave, and it was very straightforward and capable.)

