
Is It Too Late To Change JSON? - johns
http://haacked.com/archive/2009/06/26/too-late-to-change-json.aspx
======
ajross
I'm not sure I understand the analysis. Surely this vulnerability (is there an
exploit?) is a problem with the _security model_, not the fact that JSON data
is executable JavaScript code.

As I understand it, this exploits the fact that some browsers (which?) allow
one site to overload a handler that then sees data from other sites. How is
that not a huge XSS hole right there? Why drag JSON into this?

------
defunkt
I feel as though adding new syntax to JSON is antithetical to the whole idea.

    { json: [ my stuff... ]}

Isn't that sufficient? Use that when returning sensitive data, just normal
arrays when not dealing with sensitive info.
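
The wrapping defunkt suggests can be sketched as follows (the field name and data are illustrative, not from the post). The point is that a bare array is a valid standalone JavaScript expression, while an object literal at the top level of a script parses as a block and fails, so it can't be slurped in via `<script src=...>`:

```javascript
// A bare array response: valid as a standalone expression, so in old
// browsers with an overridable Array constructor it could leak when
// included cross-site via <script src=...>.
const bareResponse = '[{"user":"jane","ssn":"123-45-6789"}]';

// The wrapped form: the same data inside an object. As a raw <script>
// body, the leading "{" starts a block statement, so "json:" followed
// by an array is a syntax error and nothing executes.
const wrappedResponse = '{"json":[{"user":"jane","ssn":"123-45-6789"}]}';

// A legitimate same-site client still gets the data trivially:
const data = JSON.parse(wrappedResponse).json;
```
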

------
blasdel
There's only one reason JSON was remotely feasible as a "standard" -- there
were no decisions to be made, no arguments over syntax to be had.

Every blog post advocating for petty changes to JSON completely misses the
point.

------
CalmQuiet
I'm not sure why his earlier post on JSON vulnerability wasn't picked up here,
but as a web dev who's on the verge of having to get JSON-savvy, I'd really
appreciate expert assessment from some of you on this and on haacked.com's
day-before post: <http://haacked.com/archive/2009/06/25/json-hijacking.aspx>
Thanks.

Edit: Or maybe respond to earlier post, now linked at HN:
<http://news.ycombinator.com/item?id=675678>

~~~
uriel
As others have pointed out, this has nothing to do with JSON; it's a browser
XSS vulnerability.

------
psygnisfive
The only reason JSON script injection is a problem is because nitwits don't
use a library that parses the JSON code first to verify that it's 100% pure
JSON. Crockford's JSON library fully verifies pure JSON-ness before eval-ing
the string (which technically makes the eval an unnecessary step).
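
The validate-then-eval idea can be sketched roughly like this (a simplified stand-in for Crockford's actual check, not his exact regexes; modern code should just use the built-in `JSON.parse`):

```javascript
// Minimal sketch: strip out everything that is legal in pure JSON
// (string literals, numbers, true/false/null), and if anything other
// than JSON punctuation survives, the text could be executable code.
function parseJsonSafely(text) {
  const cleaned = text
    .replace(/"(\\.|[^"\\])*"/g, '')               // remove string literals
    .replace(/-?\d+(\.\d+)?([eE][+-]?\d+)?/g, '')  // remove numbers
    .replace(/\b(true|false|null)\b/g, '');        // remove keywords
  if (/[^\s\[\]{}:,]/.test(cleaned)) {
    throw new SyntaxError('Not pure JSON');
  }
  // Only now is eval safe-ish; the parentheses force expression context.
  return eval('(' + text + ')');
}
```

Anything like `alert(1)` or a method call survives the stripping and is rejected before `eval` ever runs.
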

Furthermore, why should we worry about changing JSON? It's not amazingly
difficult to construct a parser for some custom document format like this,
after all. I use a subset of Ruby as a data interchange format because of the
ability to use arbitrary values as hash keys, and the ability to have symbols
that are distinct from strings. The parser for it is maybe 80 lines at most
and doesn't rely on Ruby internals. Because Ruby also has no Date/Time
literal notation, I have a way of denoting that (datetime(...)) which the
parser also handles.

There's no reason people should be talking about this, it's a non-issue. It's
never too late to change JSON, or to even do something completely different
that better suits your needs. Just write the parser for it!

And if the issue is "OMG interoperability! :(", it's not terribly difficult
to provide your developers with the JavaScript lib to handle the parsing.

------
rgrove
There's no need to change JSON. This is a non-issue if you adhere to a few
basic security practices:

1. Use crumbs to protect against XSRF.

2. Responses that should not be cached by proxies should have a "Cache-
Control: private" header.

The "what if there's a broken proxy" argument in the post is specious, since a
proxy that's broken in this way will cause much more serious security problems
than are being discussed here (e.g. John visits google.com and gets Jane's
cached Google homepage from the proxy).

3. Responses that should not be cached by browsers _or_ proxies should have a
"Cache-Control: no-cache" or "Cache-Control: no-store" header.

4. If you're still concerned, wrap your JSON responses in objects yourself;
there's no need to modify the format to do it for you.

