
The Rise and Rise of JSON (2017) - mariuz
https://twobithistory.org/2017/09/21/the-rise-and-rise-of-json.html
======
enricozb
I wish JSON had 1) trailing commas 2) comments (I'm not sure if this is a good
idea or not but every once in a while I impulsively write comments)
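A quick Python sketch of how strict parsers treat both wishes today: the stdlib `json` module (like most spec-compliant parsers) rejects trailing commas and comments outright.

```python
import json

# Strict JSON rejects both niceties: trailing commas and comments.
for payload in ('{"a": 1,}', '{"a": 1 /* why */}', '{"a": 1} // why'):
    try:
        json.loads(payload)
    except json.JSONDecodeError as err:
        print(payload, "->", err.msg)
```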

~~~
abraxas
3) set types, 4) optional schemas 5) multiline strings 6) consistent support
for numeric types 7) streaming data

The litany is not short and I'm sure I've overlooked a couple.
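To illustrate the first item on the list, Python's stdlib encoder has no native story for sets: you either crash or pick a lossy convention yourself (here, `default=sorted` turns any set into a sorted list on the way out).

```python
import json

doc = {"tags": {"a", "b"}}

try:
    json.dumps(doc)              # sets are not JSON-serializable
except TypeError as err:
    print(err)

# A common (lossy) workaround: encode unknown types as sorted lists.
encoded = json.dumps(doc, default=sorted)
print(encoded)                   # {"tags": ["a", "b"]}
```

The round trip is lossy: the receiver gets a list back and has to know to rebuild the set.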

~~~
aloisdg
You should try XML. You may like it.

~~~
KiwiJohnno
That reminds me of my favourite XML quote: "XML is a lot like violence. If it
isn't solving your problem, then you aren't using enough of it"

~~~
bfuclusion
JAXB is at the same time awesome, and completely terrifying.

------
falcolas
JSON won, IMO, because it's human readable and writable. It came at a time
when its main competitor in this space - XML - had gotten too complex.

However, JSON is hitting the same limitations and problems that XML faced, and
is following in their shoes (namespaces, schemas, x/jpath, implementation
drift across libraries, etc).

~~~
kerkeslager
Ehhhh, is it hitting those limitations? I've yet to run into a problem that
would be solved by adding Enterprise™ silliness.

And is it following in XML's shoes? I've not had to work on or integrate with
any systems that did this Enterprise™ silliness either.

Now granted, I'm fairly separated from the world of IBM and SAP type
companies, and I'm sure there's all sorts of unholy abuse of JSON going on in
a dark server room somewhere, but the answer in most cases is just to choose
technology stacks that don't do that to you. These problems are generally
self-imposed rather than imposed by the technology.

~~~
kstrauser
> Ehhhh, is it hitting those limitations? I've yet to run into a problem in
> which would be solved by adding Enterprise™ silliness.

I think that's _why_ JSON won: it was first and foremost an engineer-driven
standard, as opposed to one that incubated inside a bunch of enterprises
before being dropped on an unsuspecting world.

~~~
falcolas
SGML, XML's predecessor, was definitely engineer-built (the wiki page is
interesting). It just paved the same path that JSON is now traversing. Given
that SGML is roughly 34 years old (or GML, SGML's predecessor, which is about
50 years old), give JSON just a bit more time to get to the same level of
'blech'.

------
zeveb
It's a real shame that both XML and JSON eclipsed the _real_ data format for
the ages: S-expressions. I hope it's only temporary, though: there really is
no good reason to prefer JSON other than everyone else using it, and if one
gets into a novel enough domain then there isn't an everyone else doing
anything to worry about yet.

S-expressions have the advantage of not containing the redundant object type
(there's no need for an object, map or dictionary when one has alists or
plists), and the even greater advantage of elegantly representing code.

Also they are easier to write parsers for.
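On the "easier to write parsers" point: a minimal S-expression reader really does fit in a dozen lines. This Python sketch keeps atoms as plain strings and deliberately skips quoted strings and escapes.

```python
def parse_sexp(text):
    """Parse one S-expression into nested Python lists.

    A minimal sketch: atoms come back as plain strings; quoted
    strings and escapes are deliberately not handled.
    """
    tokens = text.replace("(", " ( ").replace(")", " ) ").split()

    def read(pos):
        if tokens[pos] == "(":
            node, pos = [], pos + 1
            while tokens[pos] != ")":
                child, pos = read(pos)
                node.append(child)
            return node, pos + 1
        return tokens[pos], pos + 1

    return read(0)[0]

print(parse_sexp("(name (first John) (last Doe))"))
# ['name', ['first', 'John'], ['last', 'Doe']]
```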

~~~
teddyh
There was this:
[http://people.csail.mit.edu/rivest/Sexp.txt](http://people.csail.mit.edu/rivest/Sexp.txt)
but it was never even accepted as an RFC, it seems.

~~~
blackrock
Very interesting. Did anyone else look at this, and think it was super
complicated, at first glance? You’d have to be an expert in the data exchange
format itself.

I can see why Lisp style syntax is bad for human comprehension. It’s because
of the lack of an explicit delimiter, like a comma. I don’t think human eyes
are very good at using a single space character, as a delimiter.

And with this format, you’d have to agree upon the data exchange structure.
There would have to be a master key somewhere, maybe as the header. That said,
this format is far more efficient than XML.

But, XML probably won out because it was explicit, and allowed multiple levels
of nesting, but at the expense of overly wordy delimiter tags.

And JSON made it simpler, by forcing it to be a lightweight key-value pair.

It’s too bad something like this didn’t take root and become more popular.
This might be a very useful advanced data exchange format for some scenarios.
Although the headaches and problems associated with trying to understand, and
work with it, might far outweigh its efficiency benefits.

------
amelius
My only problem with json files is that you can't add comments to them in a
persistent way (especially if changes in the data are written by a program).

Unless of course you'd put the comments in the data itself, but that's kind of
ugly.

~~~
crooked-v
You might like JSON5 ([https://json5.org](https://json5.org)), which allows
comments and has direct support in various tooling.
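For reference, a small sample of what JSON5 accepts that strict JSON rejects (unquoted keys, comments, trailing commas, hex numbers); the field names here are invented for illustration:

```json5
{
  // comments are allowed
  unquoted: "keys need no quotes",
  trailing: [1, 2, 3,],  // trailing commas are fine
  hex: 0xFF
}
```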

~~~
ehsankia
There are many json alts, but at the end of the day unless it's supported
everywhere, it kinda defeats the point, because what makes JSON great is that
it's supported almost everywhere.

~~~
dmurray
90% of the time I use JSON, I control both sides of the interaction. E.g. a
web app where I have no expectation of third party clients, or a config file
that will only reasonably be read by one program.

JSON is normally "good enough" as a general wire interchange format, and it's
human-readable and whoever comes after me will already be familiar with it.
But if there's a time when it's not good enough and its failures are in
expressiveness, I'd totally consider JSON5 or some other alternative instead.

~~~
baddox
Even for end-to-end internal projects, as soon as you stop using a built-in
format and need to use thirdPartyEncode and thirdPartyDecode on both ends, you
might as well just choose the absolute best third party library rather than
choose something that slightly enhances the built-in.

------
redleggedfrog
It'd be nice to know what the graph would look like now, three plus years
later. I imagine it kept along the same lines. JSON is everywhere.

I suspect that a lot of the popularity is piggy-backing on the popularity of
JavaScript.

But what the heck is .CSV doing on an incline?!

~~~
sradman
> what the heck is .CSV doing on an incline?!

It is still the most common interchange format for tabular data. Tabular data
is too verbose/unreadable when represented in standard JSON. Databases and
Data Science are also on the rise; .csv continues to ride this wave.

~~~
hombre_fatal
Also, "Why use something else when we can use data.join(',') -- that should be
enough!" must be a contributing factor to why CSV will never die.

~~~
globular-toast
Except that doesn't work. CSV requires quite complicated quoting rules. Plus
the records are separated by CRLF, not a "newline" as many seem to think.

~~~
hombre_fatal
That was the point of my failed joke.

In other words, people thinking they can just join(",") keeps them from
reaching for an actual serializer (JSON, XML, etc), and if they realized they
already have to bring in a CSV library anyways, they might consider using
another format. Comedy gold, huh? Though I'm only half-joking, I've done it
before and we've all consumed "CSV" from people who had the same presumption.
:)
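The half-joke is easy to make concrete in Python: the naive join corrupts any field containing a comma, while the stdlib `csv` module applies the quoting rules (and CRLF record terminator) that RFC 4180 expects.

```python
import csv
import io

row = ["plain", "has,comma", 'has "quotes"']

# Naive join: the embedded comma silently splits one field into two.
naive = ",".join(row)
print(naive.split(","))      # 4 fields out of a 3-field row

# The csv module quotes and escapes correctly, and round-trips.
buf = io.StringIO()
csv.writer(buf).writerow(row)
print(repr(buf.getvalue()))  # note the CRLF record terminator
assert next(csv.reader(io.StringIO(buf.getvalue()))) == row
```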

------
teddyh
> _no new version of the JSON specification is ever expected to be written._

T̵h̵e̵ ̵l̵a̵t̵e̵s̵t̵ ̵v̵e̵r̵s̵i̵o̵n̵ ̵o̵f̵ ̵J̵S̵O̵N̵ ̵w̵a̵s̵
̵a̵p̵p̵a̵r̵e̵n̵t̵l̵y̵ ̵r̵e̵l̵e̵a̵s̵e̵d̵ ̵l̵e̵s̵s̵ ̵t̵h̵a̵n̵ ̵a̵ ̵y̵e̵a̵r̵
̵a̵g̵o̵.̵ [EDIT: I was mistaken: the latest standard was RFC 8259. However,
this was still published on 2017-12-13, _after_ the article was written.]
There have been three different RFCs alone, all defining JSON. (And let’s not
even get into the thing called JSON5.)

~~~
kozak
I've tried to find out what the change less than a year ago was, but I
couldn't find that information. So, what was the change?

~~~
teddyh
I seem to have misread my reference: the change a year ago was to _Javascript_
, to make _it_ conform to JSON, not the other way around, which was how I
originally interpreted it. The latest version of JSON itself seems instead to
be the latest RFC from 2017-12-13. Note: this was still _after_ the article
was written.

------
66fm472tjy7
One reason in favor of using JSON over XML for web services I don't see
mentioned often is that many XML schema based deserializers will by default
fail if they get an unexpected element whereas JSON deserializers will ignore
it, making it easier to remain backwards-compatible. Example: If you have the
following defined in your .xsd

    
    
      <complexType name="MyMethodResponse">
       <sequence>
        <element name="A" type="string" />
        <element name="B" type="string" />
       </sequence>
      </complexType>
    

and you later add a "C", existing clients will fail, so you will have to
create a new method alongside the old one to remain compatible.
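A sketch of the contrast on the JSON side: a typical hand-rolled decoder reads only the keys it knows, so a server quietly adding "C" does not break old clients (field names follow the .xsd example above).

```python
import json

# Server v1 returns two fields; v2 quietly adds a third.
v1 = '{"A": "x", "B": "y"}'
v2 = '{"A": "x", "B": "y", "C": "z"}'

def client_decode(payload):
    doc = json.loads(payload)
    return doc["A"], doc["B"]   # unknown keys are simply ignored

assert client_decode(v1) == client_decode(v2)   # old clients keep working
print(client_decode(v2))
```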

~~~
elcritch
Following the proper XML spec always led to more problems than it was meant to
solve. Sometimes I use XML, but just the syntax, with a more HTML5 style to
it.

Similar to defaulting to erroring on extra tags, I never got the point of
making everything an element and adding extra redundant attributes. That's
really just the result of XML that's automatically generated.

Like the `<element... type="string">` in your example of classic XML, why not
just use `<a>3</a>` or `<a int="3"/>`? XML proper style still suggests the
classic long/autogenerated form. Writing XML in an HTML5 style like `<my-a
data-value="3" />` makes it much friendlier. Or, as I tend to use it for
various internal protocols, to wrap multiple CSV data in a file:

    
    
        <some-data type="csv" columns="A, B, C">
          1.3, 3.4, 5.6
          1.3, 3.4, 5.6
        </some-data>
    

Technically that's really just HTML5 in a file, I guess. :-) But HTML parsers
also tend not to complain about extra tags.

------
reggieband
I remember working with SOAP and WSDL. One advantage I kind of miss from that
era was the static generators that would create your service classes and the
strongly typed DTOs based on the WSDL. I understand some effort has been made
to replicate this (e.g. json-schema) but it isn't nearly as widely used as
WSDL seemed to be. To be honest, I only kind of miss it since XML was such a
major pain to deal with (any one ever had to deal with XPath or XSLT? What a
nightmare those were).

I recently used GraphQL on a project and it had some nice advantages. I love
the idea of protocol buffers but have never had the chance to use them in
anger. But if I'm honest, the boring option is JSON and it is what I would use
for just about any API I had to expose nowadays.

------
jsnell
(2017)

Previous discussion:
[https://news.ycombinator.com/item?id=17832936](https://news.ycombinator.com/item?id=17832936)

------
k__
I remember 2010, when I built my first API. People laughed at me for using
JSON.

I had to add XML and CSV, because "nobody would integrate with a JSON API"

~~~
chc
I think whatever company you were working at was a few years behind the curve,
because JSON was already pretty much the standard for SPAs by 2010. For
example, that's the year AngularJS came out, with JSON/JSONP as the only
built-in serialization format for communicating with backend APIs.

~~~
k__
Pretty much, yes.

In 2011 we moved from a PHP app to an SPA where JSON came in handy.

------
thelazydogsback
For small, heterogeneous structs like messages and config it makes sense --
for repeated/homogeneous collections (arrays of similar structs, a common
case) both JSON and XML are wasteful -- a reflective/inline schema combined
with a table would be more readable and space efficient. Also, meta-data needs
to be either embedded or provided by special keys, so general meta-data
tagging support would be great, where a non-meta-data-aware reader would only
get the target data, but any JSON datum could be tagged with another JSON
datum as meta-data. (The meta-data tags could implement the inline schema for
the table as well.)
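One hypothetical way to approximate this with plain JSON today: reserve a key prefix (say `@`) for meta-data, which meta-data-unaware readers simply skip. The `@meta` name and fields below are invented for illustration.

```python
import json

doc = {
    "@meta": {"columns": ["A", "B", "C"], "units": "mm"},  # inline schema
    "rows": [[1.3, 3.4, 5.6], [1.3, 3.4, 5.6]],
}

# A meta-data-unaware reader just drops the reserved keys.
plain = {k: v for k, v in doc.items() if not k.startswith("@")}
print(json.dumps(plain))
# {"rows": [[1.3, 3.4, 5.6], [1.3, 3.4, 5.6]]}
```

The obvious downside is that the convention is not enforced: nothing stops real data from using an `@`-prefixed key.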

------
Zamicol
JSON is one of the best things to ever happen to software development.

~~~
abraxas
It really isn't. I'm almost tempted to say it's the opposite though that would
be overstating the case.

It's a format that still bears excessive decoration (what's the purpose of
quotes around field names? what are all those commas for?) yet it's limited in
the types of data structures that it's able to express (natively). I'm not
particularly fond of Clojure specifically but a format like EDN would have
been superior in just about every way.

~~~
hackcasual
I'd hardly call a few bytes per key and field excessive. Especially compared
to something like XML.

The datastructure complexity being limited is also a pretty significant key to
its success. More complex datatypes mean greater chances for JSON handling
libraries to lack compatibility.

The only substantial shortcomings of JSON I see are shortcomings associated
with any textual serialization format. Optimizing for human readability in a
use case that's 99.99% of the time not read by a human.

~~~
cbsmith
JSON looks good when compared to XML.

That's literally how low you have to go.

"The only substantial shortcomings of JSON I see are shortcomings associated
with any textual serialization format. Optimizing for human readability in a
use case that's 99.99% of the time not read by a human."

There are a few other shortcomings that are reasonably
substantial/significant, but yeah, that's the gist of the problem.

~~~
msla
> JSON looks good when compared to XML.

JSON looks good compared to what we'd be using instead of JSON, which is
nothing so nice and structured as XML. The competition to JSON is something
infinitely more ad-hoc, probably without a distinct parser, such that a
generic library to generate or consume it is impossible, and getting usable
error messages is equally impossible.

> "The only substantial shortcomings of JSON I see are shortcomings associated
> with any textual serialization format. Optimizing for human readability in a
> use case that's 99.99% of the time not read by a human."

I agree with this and disagree at the same time: Optimizing for human
readability means optimizing for the weird case, the 0.01% (but it seems to be
more often than that) of the time you need to go beyond the tools you have to
fix something. Saying that's rare is true but inapt: Seatbelts are only used
in rare cases, too.

~~~
nitrogen
If we didn't have JSON we'd have settled on something like MsgPack.

------
akubera
I still think it's a shame that Amazon's Ion format never got widespread
adoption.

[https://amzn.github.io/ion-docs/](https://amzn.github.io/ion-docs/)

Superset of JSON with many extra features that people in the comments here
desire from their data-language, such as support for comments, timestamps and
s-expressions.
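For a flavor of the format, a small Ion sample sketched from memory of the text syntax (field names invented); it is plain JSON plus the features mentioned above:

```
// Ion is a superset of JSON: comments, a real timestamp type,
// and s-expression values on top of ordinary JSON syntax.
{
  published: 2017-09-21T,
  rule: (and (gt views 100) featured)
}
```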

------
kanobo
Great article. It's strange for me to be old enough to have experienced JSON's
entire history as a website maker - I didn't learn anything new from the
article and never gave it much thought - but reading it all compiled in one
place makes me appreciate that we are experiencing tomorrow's history right
now.

------
olejorgenb
An annoying problem with JSON is the lack of full floating point number
support. Transferring NaN and +Inf/-Inf is a pain.
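Python's stdlib illustrates the pain: by default it emits JavaScript-style literals that are not actually valid JSON, and the spec-compliant mode can only refuse.

```python
import json

# Default output is NOT interoperable JSON -- NaN/Infinity are
# JavaScript literals, and many strict parsers reject them.
print(json.dumps({"x": float("nan"), "y": float("inf")}))
# {"x": NaN, "y": Infinity}

# Spec-compliant mode just gives up:
try:
    json.dumps({"x": float("nan")}, allow_nan=False)
except ValueError as err:
    print(err)
```

The usual workarounds are encoding these values as strings or as null, both of which need out-of-band agreement between sender and receiver.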

~~~
hajile
Crockford believes the One True Number should be decimal instead of floating
point. He also believes that comments will be abused and turned into ad-hoc
parser directives.

So everyone gets to suffer for his beliefs.

------
enriquto
In 10 years we will see JSON with the same mix of mockery and regret as we see
XML today.

~~~
wvenable
I remember using XML-RPC back in the day to communicate between a desktop
client and a web service. It was fantastic. And then the designers went off
with a committee somewhere and produced SOAP. The regret with XML is that it
continued to evolve into a monstrosity.

Thankfully JSON came along and got back to the simplicity of early XML and
XML-RPC and stayed there.

~~~
user5994461
I remember killing some old XML-RPC applications.

Deserialization and remote code execution vulnerabilities all over the place.
That was brutal.

Who thought it was a good idea to pass arbitrary function names and arguments
for the remote servers to resolve and execute blindly? The regular
vulnerabilities in the XML parsing libraries themselves were the nice cherry
on top.

~~~
wvenable
> for the remote servers to resolve and execute blindly

I'm not sure who would do that, but I certainly didn't. Ultimately XML-RPC is
no different from REST/JSON except it's in a different format. What you did
with that format is a totally different issue.

~~~
user5994461
Just take the first example from wikipedia:

    
    
        <?xml version="1.0"?>
        <methodCall>
          <methodName>examples.getStateName</methodName>
          <params>
            <param>
              <value><i4>40</i4></value>
            </param>
          </params>
        </methodCall>
    

The thing is meant to call arbitrary functions with arbitrary arguments. It
doesn't take long until there is a straight-up exec function exposed or some
accidental command injection.

It's strange to look at it 20 years later. The adoption of JSON really got
developers to stop shipping RCE vulnerabilities every other week. Yet nobody
must have thought of that when deciding what to use.

~~~
wvenable
What do you think happens with REST and JSON? Or SOAP? This is exactly
identical to:

    
    
        {
            "methodName": "examples.getStateName",
            "params": [40]
        }
    

Although you'd probably instead have a REST endpoint contain the method name
and the entire JSON body as the parameters. But the difference is minor.
There's no reason this allows arbitrary execution any more than anything else
does.

Methods directly exposed to the web is how 99% of all MVC frameworks work.

------
atticusCr
Although it is easy to see the benefits of using JSON as an interchange format
because it is lightweight, I still believe that XML is more elegant, if more
verbose, than JSON. One of the complaints about JSON has been the lack of
schemas, although there are some ways around that in projects such as Apache
Avro ([https://avro.apache.org/](https://avro.apache.org/)), schema registries
and all that jazz.

------
doctoboggan
This is a great bit of internet history, if for nothing else than this Douglas
Crockford quote in response to a flamebait-y argument about XML being superior
to JSON:

“The good thing about reinventing the wheel is that you can get a round one.”

------
velox_io
I'm quite a big fan of json.org: it outlines the JSON specification in a
handful of diagrams and a small amount of text. It's so simple and elegant.

------
The_rationalist
This is such a painful contingent historical accident that we have json as the
standard instead of hson

------
raverbashing
Amen

XML is a mess and a chore to work with (at least in any language that isn't
Java I guess, but even then).

Yes, it has some rough edges. Yes it could be better.

But overall it's good. Not too complicated and not too hard. Works fine for
most stuff.

