
Why is JSON so popular? Developers want out of the syntax business - fogus
http://stereolambda.wordpress.com/2010/03/19/why-is-json-so-popular-developers-want-out-of-the-syntax-business/
======
frognibble
JSON is popular because there is a straightforward mapping from JSON to native
types in many programming languages.
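
A minimal sketch of that mapping in JavaScript (the same holds for dicts/lists in Python, hashes/arrays in Ruby, and so on):

```javascript
// JSON text maps directly onto native objects, arrays, strings,
// numbers, booleans, and null -- no schema, no custom tree-walking.
const text = '{"name": "Smith", "tags": ["a", "b"], "count": 2}';
const data = JSON.parse(text);

console.log(data.name);        // a native string
console.log(data.tags.length); // a native array
console.log(data.count + 1);   // a native number
```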

~~~
stcredzero
Language designers take note: there is _tremendous_ utility in making entities
in your language isomorphic to entities expressed in previous languages.

I am actually using such isomorphisms in my current porting project. I'm
literally getting a couple of orders of magnitude more productivity out of
this over porting by hand.

Also note: less syntax makes this easier.

~~~
sketerpot
I recently wrote a program to do Threefish encryption which I deliberately
wrote to look as much like the mathematical notation in the spec as possible.
That made things _so_ much easier that it was almost ridiculous, even if it
made the resulting program very peculiar-looking.

~~~
blasdel
You'll love what Alan Kay has been working on:
<http://vpri.org/pdf/tr2007008_steps.pdf>

You'll be especially enamored of their 200-line TCP/IP implementation,
introduced on p17 and reproduced with documentation starting on p44. _It's
implemented as a grammar in their meta-language that parses the ASCII-art
diagrams in the RFCs and executes them._

------
DanielBMarkham
Observing the industry over the last couple of decades, I'm left with the
feeling that sometime in the last ten years or so we all went XML crazy. The
popularity of JSON is just the pendulum starting to swing the other way.

So count me in. I'll use JSON over XML for server-webclient data exchange
whenever I can. It's just much easier.

~~~
andrewvc
The funny thing is, we all knew we were going XML crazy while it was
happening. I can remember anti-XML bloat articles going way back. The thing
with XML was, it DID solve some problems, it just tried too hard.

The good thing about XML was it was a standard interchange format that every
language had at least a couple parsers for. That said, I'm not sure if it was
worth the annoyance and irritation of using stuff like XML Schema, DTDs, SOAP,
or XSLT.

------
anamax
As one of the earlier commenters pointed out, lisp welcomes you to the 1960s.

~~~
patio11
Sorry we were late to the party, guys, we just got so wrapped up in writing
software that people actually use.

~~~
pilif
... while the lispers were still counting the closing parens needed to
finally get their code running

SCNR

~~~
drunkpotato
This thread has been yet another fruitful contribution to the language wars.

There seems to be something inherent in us that feels the need to turn our
tools into our religion. It's not terribly productive.

~~~
pilif
no. but fun.

SCNR is an old Usenet acronym and stands for "sorry, could not resist". I was
a) being sarcastic and b) just making fun of the lisp crowd. I totally agree
that if the parens issue is a non-issue for you and you feel comfortable and
productive writing lisp code, then go ahead.

In fact, I have the utmost respect for people able to wrap their heads around
that syntax.

~~~
stcredzero
If you really think "wrapping your head around" the almost non-existent syntax
of Lisp is a reason to _respect_ someone, then I suggest you delve a little
further into your CS/programming studies and find many more wonderful and
interesting things.

For the true rockstar coder "wrapping your head around" any reasonable syntax
should be a triviality.

~~~
ewjordan
_For the true rockstar coder "wrapping your head around" any reasonable syntax
should be a triviality._

Never had the dubious pleasure of trying to decipher another person's APL
code, eh?

Then again, you did say "reasonable"...

~~~
stcredzero
You don't have to tell me about APL syntax. For our senior year CS project, we
_implemented_ APL with unlimited precision arithmetic.

Wrapping your head around APL wins my respect. Lisp? You get a pat on the head
for that, maybe.

------
charlesju
On a similar note, I'm so happy REST won over WSDL.

~~~
Confusion
In what industry is that? It sure isn't in those of my clients :/

------
tptacek
The "XML requires you to build your own parse tree" argument isn't valid; XML
libraries are fully capable of handing you a DOM-style tree, and of allowing
you to pull things out of the tree without writing your own traversal code.

JSON just assumes messages are going to trivially fit into naive data
structures, and so provides fewer options.

~~~
jerf
I'm not sure if you are complaining about this aspect, but I would observe
that "fewer options" is actually the feature here, not the bug.

A generic XML DOM is still complicated to deal with. Even if you do the "right
thing" and use XPath, you still have to deal with XPath because you can't get
around the fact that you have an underlying representation that has at least
two dimensions (attributes vs. CDATA). That is, just as the article says, you
have more degrees of freedom in how you represent your data, and what is a
"degree of freedom" but a near synonym of "dimensionality"? You can't abstract
around dimensionality very effectively without losing fundamental capabilities
in the underlying component (in fact a staggering number of abstraction
failures in general can be shown to come from exactly this problem if you
really learn to think this way), and the complexity comes poking out in the
XPath. It's still better than groveling over the DOM yourself, but it's
probably also the absolute peak of concision that is obtainable; there will be
nothing better.

JSON is indeed simpler in that you don't really have 3 or 4 feasible choices
per attribute; {"first-name": "John", "last-name": "Smith"} is pretty much
your choice, full stop. That leaves the underlying library fundamentally, not
accidentally, simpler. This can get you into some trouble in some cases, for
instance XML is a better choice for HTML-type tagged text as the JSON for
tagged-text is just hideous (and, interestingly, reopens the dimensionality
problem as there is no one obvious solution), but many things are
fundamentally simpler than tagged-text.
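
The extra "dimension" described above is easy to see in a contrived example: both of these XML fragments carry the same record, and a consumer has to be prepared for whichever shape the producer picked.

```xml
<!-- choice 1: data in attributes -->
<person first-name="John" last-name="Smith"/>

<!-- choice 2: data in child elements -->
<person>
  <first-name>John</first-name>
  <last-name>Smith</last-name>
</person>
```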

If you want to pick up a defined serialization format, my gut would be to say
to default to JSON and back to XML if you _really_ need it for something...
but be aware that you may, and it's no better to try to jam JSON on top of a
fundamentally XML problem. (Besides, your JSON can carry bits of XML in it
without much pain, so "best of both worlds" is perfectly feasible.)

~~~
tptacek
Whoah wait hold on a sec. I'm not sticking up for XML. I'm just saying that
one argument isn't valid. I'd use JSON.

(Although I like it when my target web apps use XML; better tools support for
attacking them.)

~~~
jerf
Righty-o, like I said I wasn't sure. :) But I figured it was still worth
posting. I don't see much level-headed analysis of the issues. Too many devs
got burned by XML then can't help but get a little fanboy-ish over JSON, which
has distorted the dialog a bit, I think. And I like getting the idea of API
dimensionality out there.

------
TomasSedovic
He nailed it:

XML should be used for markup, JSON or YAML for structured data.

~~~
Qz
Isn't YAML a subset of JSON? Or vice versa?

~~~
blasdel
YAML has a fuckton of sugar, where JSON has as little as possible, but the
biggest differentiator is that YAML has references — you can express cyclic
graphs. That's why it's a natural fit for fixtures and seed data in Rails: it
natively handles relational data.
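
A small sketch of the reference feature mentioned here: YAML anchors (`&`) and aliases (`*`) let two places in a document point at the same node, which plain JSON cannot express. (The field names below are made up for illustration.)

```yaml
# the anchor &acme names this mapping...
company: &acme
  name: Acme

# ...and both employees alias the very same node
employees:
  - name: Alice
    employer: *acme
  - name: Bob
    employer: *acme
```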

------
lenni
I can't really understand all the XML-bashing. Of course it isn't the right
tool for any job, but I like it for its schema and validation features.

~~~
rgoddard
Mostly it's a reaction to the overuse of XML, and to how, in many situations,
using something other than XML is beneficial. If a large swath of people start
using JSON without thinking about it, you will probably start seeing a similar
reaction extolling the virtues of another format over JSON.

------
ntownsend
The argument that XML can have various different structures for storing a
person's name, while JSON provides one simple solution, doesn't fly. You could
run into something like { "Person": { "property": { "type": "first-name",
"value": "John" }, "property": { "type": "last-name", "value": "Smith" } } }

This begs the question, "Why would you do something that convoluted?" Well,
you can ask the same of the XML examples, and the answer probably boils down
to requirements (or incompetence?).

~~~
JeremyStein
Please don't write "begs the question" when you mean "raises the question".
<http://begthequestion.info/>

~~~
stanleydrew
But if we know what he meant why does this matter? Language exists to convey
meaning, and I think we all understood what he meant, so I don't see the
problem here.

~~~
loup-vaillant
The problem lies in dilution: if people recognize a new meaning in an old
phrase, it becomes more difficult to convey the old meaning. (Because you
can't use that phrase any more.)

There is also the case where you thought you conveyed the new meaning, while
it hasn't caught on yet (meaning, you made a mistake). So, better stay safe
and stick to the old meaning while we can.

~~~
vidarh
The dilution has already happened, and this cause is pretty much lost.

English is my second language. I've known the "new" meaning since I was a kid.

To date I've maybe seen the "old" meaning used a handful of times, other than
in examples given by people trying to correct someone using the new meaning.

Outside of academia I'd be surprised to see it at all. I suspect it would be
confusing to more people than would recognize it.

~~~
Confusion
_The dilution has already happened, and this cause is pretty much lost._

Don't give up too easily.

 _Outside of academia I'd be surprised to see it at all._ That's because it's
originally an academic term for a logical fallacy. This is like the abuse of
'eigenvalues' by all kinds of crackpots, and we should never stop fighting that
kind of language abuse. We can't keep inventing new terms just because others
have hijacked the previous one.

------
tomkinstinch
I prefer JSON over XML for most applications, but one advantage of XML is that
its strict structure aids parsers. With XML it's easy to see if you've closed
all of your tags, etc.; it's all about the parser.

Conversions between XML and JSON can be a challenge (defining namespaces,
etc.). The Google Data API handles this very well. Check out their nice
side-by-side example: <http://code.google.com/apis/gdata/docs/json.html>

------
stanleydrew
Well JSON also lets you get around browsers' same-domain policy without having
to set up a proxy, which I think might be more important.

~~~
pilif
if you are talking about JSONp: JSON is by no means required to do that. You
could in theory easily pass a string containing XML to the callback function.
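
A hedged sketch of the mechanism described here (the endpoint URL and callback name are made up): the remote server wraps its payload, whatever the format, in a call to a function the page defines, and the browser executes it via a cross-domain `<script>` tag.

```javascript
// The page defines the callback ahead of time...
function handleData(payload) {
  return payload.user;
}

// ...then a cross-domain tag like
//   <script src="https://example.com/api?callback=handleData"></script>
// makes the server answer with executable JS. We can simulate that
// response here; note it need not carry JSON at all -- the server
// could just as well send: handleData("<person>Smith</person>")
const simulatedResponse = 'handleData({"user": "smith"})';
console.log(eval(simulatedResponse)); // the callback runs with the payload
```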

~~~
stanleydrew
Yes, good point.

------
euroclydon
There are so many cool things you can do with JSON. I just created a boolean
logic statement builder on a web page, using free-form open and close
parentheses and and/or radio buttons. A little regex to change the
parentheses into square brackets, then an eval(), and I can walk the whole
statement recursively.
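
As a sketch of this trick (with a made-up input; the actual builder also carries and/or operators): typed parentheses become brackets, and the result parses into a nested structure you can walk recursively. Note that for pure data, JSON.parse gets the same nesting as eval() without executing arbitrary code.

```javascript
// "((1),(2,3))" -> "[[1],[2,3]]" -> nested arrays
const typed = '((1),(2,3))';
const bracketed = typed.replace(/\(/g, '[').replace(/\)/g, ']');
const tree = JSON.parse(bracketed); // safer than eval() for pure data

// walk the whole statement recursively
function walk(node, depth) {
  if (!Array.isArray(node)) {
    console.log(' '.repeat(depth) + node);
    return;
  }
  node.forEach(child => walk(child, depth + 2));
}
walk(tree, 0);
```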

~~~
stcredzero
eval() strikes me as a dangerous thing, security-wise.

~~~
euroclydon
How? I'm evaluating strings that are built with my javascript code or from my
server code, not arbitrary user input from other users. Yes, the user is able
to enter parentheses into a textbox, and those become part of the evaluated
string, but I regex-replace out everything but the actual parentheses.

~~~
tptacek
Every security assessor's favorite answer to a threat: "but I regex out
everything unsafe".

~~~
euroclydon
How is building an intermediate portion of a nested logical statement using
eval() dangerous?

Here is the regex for what I allow (only '[' & ']'):

    value.replace(/[^(]/g, '').replace(/\(/g, '[')
    value.replace(/[^)]/g, '').replace(/\)/g, '],')

I guess the point you all are trying to make is that some javascript text
could have been maliciously inserted into the page somehow, and accidentally
get eval'd simply because eval is in use. But the page below says that one of
the only times eval should be used is to build up complex mathematical
expressions. Is there a safe way to build up such expressions? Send it to the
server and invoke a JS engine there? The reason I made my original comment
was that I found the eval function so helpful in this scenario, because I
didn't need to write any syntax parsing.

[http://blogs.msdn.com/ericlippert/archive/2003/11/01/53329.a...](http://blogs.msdn.com/ericlippert/archive/2003/11/01/53329.aspx)

~~~
tptacek
If you are confident about charsets and you _whitelist_ down to known-good
characters ([A-Za-z0-9_ \t]) I have nothing snarky to say about the design.
Otherwise, try reading this very short thread:

[http://www.webappsec.org/lists/websecurity/archive/2010-03/m...](http://www.webappsec.org/lists/websecurity/archive/2010-03/msg00044.html)

(It's not exactly your problem but you'll get the flavor.)
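
The whitelist described above, as a minimal sketch: validate the whole string against known-good characters before it goes anywhere near eval().

```javascript
// Accept only characters on the whitelist; reject everything else.
const SAFE = /^[A-Za-z0-9_ \t]*$/;

function isSafeInput(s) {
  return SAFE.test(s);
}

console.log(isSafeInput('John Smith_42')); // true
console.log(isSafeInput('alert(1)'));      // false: parens not whitelisted
```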

------
Luyt
Isn't it dangerous to eval()? Suppose someone puts some malicious code in the
JSON data?
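
A minimal illustration of the difference: a strict parser rejects anything that isn't a JSON literal, while eval() would happily execute it.

```javascript
// Well-formed JSON data parses fine...
console.log(JSON.parse('{"a": 1}').a); // 1

// ...but code smuggled into the payload is a syntax error to a
// strict parser, where eval() would simply have executed it.
let threw = false;
try {
  JSON.parse('alert("pwned")');
} catch (e) {
  threw = true;
}
console.log(threw); // true
```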

~~~
wlievens
jQuery includes a proper parser

~~~
Hexstream
I think it's even being incorporated into browsers natively for maximum
performance. It might be available already.

~~~
enomar
<http://ejohn.org/blog/the-state-of-json/>

------
njharman
It's easy, it works.

------
eplanit
The argument is always the same, and the JSON crowd always asserts
"superiority" derived from sheer simplicity. XML is too hard for them, and
they're part of some bizarre quasi-political movement that trashes object-
oriented principles. Thus, 'simple' trumps validation via types, version
control (and hence interoperability over time without breakage and
maintenance), and robustness.

The weakness in the Javascript "eco-system" is its quasi-support of objects
(the 'quasi' qualifier applies frequently in this eco-system). XSD/XML
is powerful when used in an object paradigm, and object paradigms have proven
(not just via community claims) to be very effective.

The question is, when will the JS community step up to the plate (i.e.
mature)? So much energy is wasted now on making JSON work -- just in order to
make Javascript easy. Wrong priorities.

