
JSONx is an IBM standard format to represent JSON as XML - rpeden
http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/index.jsp?topic=/xs40/convertingbetweenjsonandjsonx05.htm
======
buddydvd
"In an XML Firewall service, JSONx can be used like other XML input."

[http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/inde...](http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/index.jsp?topic=/xs40/convertingbetweenjsonandjsonx05.htm)

"An XML firewall is a specialized device used to protect applications exposed
through XML based interfaces like WSDL and REST and scan XML traffic coming in
and out of an organization. [snipped] XML Firewall is often used to validate
XML traffic, control access to XML based resources, filter XML content and
rate limit requests to back-end applications exposed through XML based
interfaces."

<http://en.wikipedia.org/wiki/XML_firewall>

~~~
seiji
Also, <http://en.wikipedia.org/wiki/XML_appliance> and
<http://www.layer7tech.com/products/xml-accelerator>

Astounding, is it not?

~~~
jerf
It's less silly than it sounds. Many XML attacks have JSON equivalents. Do you
know how _your_ JSON parser will deal with 512KB of open square brackets in a
row?
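A quick way to check, using Python's standard-library parser as an example (other parsers may crash, hang, or exhaust memory instead):

```python
import json

# 512KB of open square brackets, as described above
attack = "[" * (512 * 1024)
try:
    json.loads(attack)
    outcome = "parsed"  # would mean a half-million-deep nested list
except (RecursionError, json.JSONDecodeError) as err:
    outcome = type(err).__name__
print(outcome)
```

CPython's parser bails out with an error long before consuming the whole input; a naively recursive parser without such a guard would blow the stack instead.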

------
famousactress
Wow. Just wow.

Probably controversial, but in a dozen or so years of programming I think XML
represents one of the most stupid and costly ideas to have gained widespread
adoption.

I honestly can't think of a single other relatively recent technology that's
responsible for nearly as many wasted lines of code, unbelievably inefficient
systems, and legions of developers who are terrified of bytes that don't
render nicely in notepad.

I _hate_ it. Just saying. (Downvotes expected).

~~~
haberman
Couldn't agree more.

The thing that frustrates me the most is that people jump to defend it because
it's what they know. It's like they're plugged into the matrix and they will
fight for it until the end.

People love to write regexes to process their bastardized XML subset because
it's "easy," without realizing that their regex is not even remotely an XML
parser and would fall over at the first sight of CDATA.
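As a sketch of that failure mode (tag and content invented for the example), here's a non-greedy regex versus a real parser on a document containing CDATA:

```python
import re
import xml.etree.ElementTree as ET

doc = '<msg><![CDATA[<msg>not markup</msg>]]></msg>'

# the "easy" regex approach grabs raw text between the first tag pair
naive = re.search(r'<msg>(.*?)</msg>', doc).group(1)

# a real parser merges the CDATA section into the element's text
parsed = ET.fromstring(doc).text

print(naive)   # <![CDATA[<msg>not markup
print(parsed)  # <msg>not markup</msg>
```

The regex stops at the first `</msg>` it sees, even though that one is literal text inside the CDATA section, so it returns mangled garbage.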

~~~
famousactress
Totally. On two completely independent occasions I've come into work to have a
coworker tell me they implemented an XML parser the evening before. Um, right.
Great.

The thing that totally boggles my mind is that it provably fails to deliver on
literally every promise, benefit, and design goal that I hear folks claim it
embodies.

~~~
kamaal
I know parsing XML with regular expressions is a sin, but sometimes you need
to get some data out of a 20 GB XML file in the next 10 minutes. It turns out
it's better to check the layout of the XML and use combinations of grep, sed,
awk and perl with pipes, just to serve the need of the moment.

That beats spending the next 5 hours writing a Java program to do it. It's not
a permanent solution, but either way the Java program and the shell hack are
going to do the very same thing.

I would rather use the shell and get it done than spend the next 5 hours with
Java and Eclipse.

~~~
haberman
The fact that those are your two options is just a testament to how absurdly
unsuitable XML is for this purpose.

Your "combinations of grep, sed, awk, perl" will _corrupt your data_ if your
file happens to contain any CDATA, entities, unexpected formatting, processing
instructions, etc.
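A small sketch of the entity problem (tag and content invented for the example): a pipeline-style substring grab hands you still-encoded bytes, which are silently wrong unless some stage in the pipe remembers to unescape them:

```python
from xml.sax.saxutils import unescape

# a record as it appears on disk
record = '<company>Johnson &amp; Johnson</company>'

# what a grep/sed-style substring grab hands you
grabbed = record[len('<company>'):-len('</company>')]
print(grabbed)            # Johnson &amp; Johnson (still entity-encoded)

# the actual data needs entity decoding on top
print(unescape(grabbed))  # Johnson & Johnson
```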

Don't get me wrong, sometimes a quick hack is worth the risk. But it's
certainly no reason to defend XML, which is what put you in the position where
a quick hack was your best option.

~~~
kamaal
I agree with you. Actually, I'm not much in favor of XML either. Somebody high
up in the hierarchy attends a few conferences here and there and reads the IBM
journal, and all along the word 'XML' gets repeated some 1000 times.

So the impression he gets is that XML is going to add value to everything it
touches. Most developers like me have no say in that at all; we just have to
use it.

I prefer the hack that gets it done. But I agree with you: none of that is a
permanent solution.

------
js2
Reminds me of the Property List evolution:

1\. "Under NeXTSTEP, property lists were designed to be human-readable and
edited by hand, serialized to ASCII in a syntax somewhat like a programming
language."

2\. "In Mac OS X 10.0, the NeXTSTEP format was deprecated, and a new XML
format was introduced."

3\. "Since XML files, however, are not the most space-efficient means of
storage, Mac OS X 10.2 introduced a new format where property list files are
stored as binary files."

~~~
haberman
Everything converges to the equivalent of protocol buffers.

Property lists -> XML -> Binary Property Lists

JSON -> JSON Schema -> Binary JSON

XML -> XML Schema -> XML Namespaces -> Binary XML

Or you can skip all that faffing around and just jump straight to protocol
buffers. :)

ps. Protocol Buffers have a text format too that looks very similar to JSON.
And you can encode protocol buffers as JSON, though I'm not sure there's good
software support for it yet. I'm working on it.
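To make the comparison concrete, here is the same toy record in JSON and in protobuf's text format (the field names and schema are invented for the example):

    // JSON
    {"name": "John Smith", "age": 32}

    // protobuf text format, given a message with string name and int32 age
    name: "John Smith"
    age: 32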

~~~
spullara
Similarly, you could use Doug Cutting's Avro format, which does have a JSON
representation. It is also evolvable, like Protocol Buffers, and works well in
untyped languages. I highly recommend it.

~~~
haberman
Avro looks like an interesting format, thanks for the pointer.

------
dscape
The objective of JSONx is: 1) a lossless representation of JSON in XML so it
can be reused with XML tools. In a sense it's the reverse of jsonml.org.

JSONx is not (nor is it meant to be): \- a replacement for JSON \- a format
that enables you to query JSON in an XML database \- a standard (I would
remember if I were part of a standards committee, although I do recall playing
bullshit bingo in some of those meetings)

I think the use case for JSONx is extremely small: maybe exclusively related
to transmitting JSON over the wire in hardware that is specialized for XML
processing. Having worked on it, I argued many times that it is not even a
queryable format, which makes it less usable even in the "XML world". (For an
example of a queryable format you can check isubiker/mljson@github.)

If you are not trying to achieve 1), you can and should simply disregard
JSONx. That's certainly what I do: it's completely irrelevant to me, and even
though I worked on it I never used it in any of my projects. Why should I? You
can't query that crap for the heck of it.

As for talking crap about IBM: that's just plain stupid. The fact that you
can't understand something simply means you can't understand it, not that it
is of no value. "All I know is that I know nothing." - Socrates, supposedly a
smart guy

~~~
kamaal
This attitude of not speaking up about something that's wrong ultimately leads
to the right stuff being lost. Enterprise vendors like IBM are always on the
hunt to give the industry something that forces volumes of people to use it,
and then to design tools around it that make people addicted to it.

Just put an auto-completion feature into a couple of famous IDEs to enable
this, and watch how many people use it without caring about the what, why and
how of it.

Then one fine day you're in a situation where most of the programmers in the
market are trained to use it, and if you have to run a project you have to
hire them and let them do what they want. And so the technology's adoption
grows.

It's not OK to just remain silent. Proper criticism is often needed; it helps
things evolve in the right direction.

------
kamaal
Not only is XML shoehorned into purposes where it doesn't belong, but the
people who know only XML tend to use it so badly that it brings a bad name to
the technologies associated with it.

I work on a project that involves a lot of technologies: Java, Perl, C and
C++. The problem is that there is somehow an assumption that it's difficult
and costly to hire good C/C++ programmers. So what path is taken? They tend to
replace everything with Java. Now comes the actual issue: it's easy to hire
programmers who know Java. They come in volumes. Just post a hiring ad and
thousands will land at your doorstep the next day. The real problem is how you
separate the good ones from the bad ones.

Once you hire such programmers, you have to deal with technology addiction.
Things like Eclipse and XML: the whole project begins to revolve around them.
In my project nearly everything is XML: config files, DB, persistence, you
name it. If it's in a file, it's XML. This JSON-in-XML that surprises most of
you already exists in a lot of enterprise code bases. Project managers seem to
have a narrow vision; debates on issues like these terminate with 'It works,
so we just don't care'.

I'm not against IDEs. But when your programmers can't do a simple deployment
on test boxes and can't figure out even simple commands, all under the excuse
of 'I only use Eclipse', it starts to reflect on the language community. Java
in itself is not bad. But by reducing the barrier to entry to such low levels,
you open the door for all sorts of toxins to come in.

I see Python as the next Java in the making. Masses are flocking to it, and as
it usually turns out, most of them are horrible. They corrupt the whole
community, spread the disease, and once they are done destroying it they move
on to something new.

After all, you need to do something for a job.

------
nddrylliog
Is it April 1st again already?

------
rpeden
This comes from the docs for the DataPower XML Security Appliance. Naturally,
I was curious about what this Appliance does.

If you go up the tree a bit, click Development, then XML Firewall, then
Introduction, you'll find this tidbit:

"These appliances offer an innovative, pragmatic approach to harness the power
of SOA while simultaneously enabling you to leverage the value of your
existing application, security, and networking infrastructure investments."

Seems like a great attempt to use as many words as possible to say nothing at
all. :)

~~~
buddydvd
This point seems relevant:

"Bridges to Web 2.0 technologies with JSON filtering and validation, support
for REST verbs, and converting/bridging of REST and Web services."

<http://www-01.ibm.com/software/integration/datapower/xs40/>

------
geuis
There's a service we subscribe to that provides video, which sadly I can't
share. However, when I reverse engineered how our video player translates a
uid into a link, I was horrified.

Essentially, we perform a request that returns an xml document wrapped in a
json object, wrapped in a jsonp callback, that is then parsed by a php script
on our end back into a pure json object.

This made me die a bit.

------
drawkbox
Why oh why... If it helps JSON invade the enterprise then ok, but why make a
good thing bad.

~~~
gte910h
[http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/inde...](http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/index.jsp?topic=/xs40/convertingbetweenjsonandjsonx05.htm)

To allow you to use XSLT tools apparently.

~~~
ericmoritz
XSLT and XPath are about the only reason I can see to use this. But, you know,
how hard is:

       x.person.addresses[0].city?

I guess XPath is easier when you'd otherwise have to do something like this:

       JSONObject json = (JSONObject) JSONSerializer.toJSON(jsonTxt);
       String city = json.getJSONObject("person").getJSONArray("addresses").getJSONObject(0).getString("city");

Or however you'd do it in Java.

~~~
wulczer
That's not hard; what's hard is

       //person[@alive="true"]/pets//dog[@hair="long"]/*/flea[@name]
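
For contrast, here is roughly what that query takes by hand over JSON, assuming a toy JsonML-style element encoding (dicts with "tag"/"attrs"/"children" keys, invented for the example):

```python
def walk(el):
    """Yield el and every descendant element (the // axis)."""
    yield el
    for child in el.get("children", []):
        yield from walk(child)

def matches(el, tag, attr=None, value=None):
    """Tag test plus an optional [@attr] or [@attr="value"] predicate."""
    if el.get("tag") != tag:
        return False
    if attr is None:
        return True
    got = el.get("attrs", {}).get(attr)
    return got is not None if value is None else got == value

def find_fleas(root):
    """//person[@alive="true"]/pets//dog[@hair="long"]/*/flea[@name]"""
    results = []
    for person in walk(root):
        if not matches(person, "person", "alive", "true"):
            continue
        for pets in person.get("children", []):            # /pets
            if pets.get("tag") != "pets":
                continue
            for dog in walk(pets):                         # //dog
                if not matches(dog, "dog", "hair", "long"):
                    continue
                for child in dog.get("children", []):      # /*
                    for flea in child.get("children", []):
                        if matches(flea, "flea", "name"):  # /flea[@name]
                            results.append(flea)
    return results
```

Every axis and predicate in the one-line XPath becomes its own loop or test here, which is the point of the comparison.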

~~~
catshirt
still, i think an xpath evaluator for json would have been the way to go here
instead

------
joeld42
wow, the brevity and parsability of xml combined with the specificity and
structure of json.

------
wulczer
Not sure what all the fuss is about. It's actually neat to have a common way
of changing JSON to XML. Think of a web service that offers an API using our
beloved JSON. But wait, we also want to provide an XML API! Let's design our
own ad-hoc, crappy mapping or... use the thing that everyone else is using.

XML is out there, has lots of tools and libraries, and is actually useful for
some things. Not to mention the array of technologies around it, like XPath,
XQuery etc., that JSON lacks (probably because it doesn't need them, because
these formats address slightly different requirements. Gasp!).

~~~
Stormbringer
"the fuss" is because a lot of people jumped on the JSON bandwagon in order to
_get away from_ XML.

~~~
wulczer
That's fine for them. My question stands: what's wrong with someone proposing
a generic way of transforming JSON to XML?

------
Hominem
Not to go against the grain, but this could be useful. Say you already have
some sort of datastore that uses xml. Now somebody hands you JSON for some
reason, maybe a list of companies. Convert it to XML, stick it in your
datastore and use XSL to do whatever you want with it.

~~~
PaulHoule
XSLT is a scary language. It took several years for computer scientists to
prove that it's Turing complete.

------
pnathan
IBM. I know you have smart people. Why don't you let them be smart?

~~~
Stormbringer
IBM has been divesting themselves of good programmers for a while now,
probably at least a decade.

Their interns are steered away from programming, towards career paths that are
perceived as adding value: e.g. consulting.

I suppose in theory this makes sense: have the smart locals figure out what is
needed, write up some specs, and then get the guys in Bangalore to bang (sic)
it out for you.

In practice, it is an absolute dog's breakfast.

(1) The local IBMers have forgotten how to do anything in the real world. No,
seriously. They can't even install a server to use for their own use as a test
machine in less than 4-6 weeks.

(2) The Bangalore IBMers cannot code for shit. They are the second worst
programmers I ever worked with (and the worst was a consultant Thoughtworker
who was _actively sabotaging_ the project). The ones who did the least got
promoted. _twitch_

(3) They can't even maintain the software they wrote 5-10 years ago, they just
don't make any effort to retain that knowledge.

(4) It might be different for US IBMers, but outside of the US the programmers
got the shaft big time when the GFC hit. Stupid little things like (even if
your business unit was profitable) you had to bring your own coffee and milk
in to work because they stopped supplying them free.

Other than the above, IBM is just like any other large organization.

~~~
derleth
> They can't even maintain the software they wrote 5-10 years ago, they just
> don't make any effort to retain that knowledge.

IBM is the company that keeps a nearly-five-decade-old architecture and ISA
around for its business customers, correct? (zSeries (or whatever its name is
this week) is derived from s390, is derived from s370, is derived from s360,
was created in the early 1960s.) z/OS and z/VM are modern versions of MVS and
CP-40, which date to 1974 and 1967, respectively.

It seems odd that the company that does that would have problems with decade-
old software in any of its business units.

~~~
rbanffy
It seems Stormbringer was looking at the IT consulting division. I have to
agree with him/her on that - I hear horrible things about them and I'd never,
ever, under no circumstances, hire them.

That said, I have enormous respect for their mainframe division and operating
system programmers. Their server hardware (z, i, and x), z/OS, z/VM, AIX and
their contributions to the Linux kernel are all very good.

IBM is certainly big enough and old enough that different divisions can behave
like completely different companies. It's a bit unfair, in fact, to see one
division tarnishing the image of a company so full of groundbreaking
accomplishments and with such a rich history as IBM. The consulting division
has to compete for huge contracts that are managed by pointy-haired lifeforms
who can't see beyond hourly rates, against equally incompetent organizations
like EDS/HP, Accenture and Tata.

That puts evolutionary pressure on the market, driving it to provide low
prices. The market responds by optimizing for price and price alone.

When you have to compete in price with incompetent-but-cheap organizations,
you either become incompetent-but-cheap or get out of business.

Tough choice.

------
shawndumas
<https://twitter.com/zedshaw/status/64050059276337152>

------
sc68cal
Yo dawg, I heard you like serialization formats, so we put a serialization
format inside your serialization format.

------
davanum
JSONx has a very limited use case. A quick check will show that the
documentation points to its use in devices on the edge for
transforming/bridging/proxying: <http://bit.ly/ky0q53> <http://bit.ly/iDF2Y5>

------
bsg75
Could this present any utility as a communication interface between an RDBMS
with XML support (DB2, MSSQL) and a document DB (MongoDB, CouchDB) ?

------
listrophy
I know I'm supposed to be insightful in my comments on HN, but this is... I
just can't.

Pretend I put the facepalm ASCII-art here.

------
spitfire
I've up-voted this solely for the irony. Who says IBM doesn't have a sense of
humour?

------
Todd
Why don't we yet have XMLj?

~~~
Sitwon
Because JSON supports only a subset of the features of XML.

~~~
aidenn0
Yeah, you know, the subset that people actually use.

XSLT and XPath are tools that feel like they should be useful, but for small
things a streaming parser in <favorite scripting language> is faster and more
readable, and for large things, deserializing it all into <favorite database>
and querying it that way is better.
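For what it's worth, the streaming approach is only a few lines in Python (document and tag names invented for the example):

```python
import io
import xml.etree.ElementTree as ET

# a stand-in for a huge file on disk
xml_bytes = b"<people><person>Ada</person><person>Bob</person></people>"

names = []
# iterparse yields elements as they are completed, without building
# the whole tree in memory first
for event, elem in ET.iterparse(io.BytesIO(xml_bytes), events=("end",)):
    if elem.tag == "person":
        names.append(elem.text)
        elem.clear()  # release the element's children to keep memory flat
print(names)  # ['Ada', 'Bob']
```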

------
ericmoritz
I liked this better when it was called WDDX.

------
nicetryguy
JSONx: In Space

------
nerd_in_rage
I just threw up.

------
albertogh
Using the example from
[http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/inde...](http://publib.boulder.ibm.com/infocenter/wsdatap/v3r8m1/index.jsp?topic=/xs40/convertingbetweenjsonandjsonx05.htm)

    muk:~ fiam$ echo '{ "name":"John Smith"... }' | wc -c
    303
    muk:~ fiam$ echo '<?xml version="1.0" encoding="UTF-8"?> <json:object> … </json:object>' | wc -c
    904
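
The blow-up is easy to reproduce with a rough JSONx-style serializer; this sketch follows the element names in the IBM examples (json:object, json:string, json:number, json:null) but omits XML escaping, so it is only a size estimate:

```python
import json

def to_jsonx(value, name=None):
    """Serialize a JSON-compatible value as JSONx-style XML (no escaping)."""
    attr = f' name="{name}"' if name is not None else ""
    if isinstance(value, dict):
        inner = "".join(to_jsonx(v, k) for k, v in value.items())
        return f"<json:object{attr}>{inner}</json:object>"
    if isinstance(value, list):
        inner = "".join(to_jsonx(v) for v in value)
        return f"<json:array{attr}>{inner}</json:array>"
    if isinstance(value, bool):  # check bool before int
        return f"<json:boolean{attr}>{str(value).lower()}</json:boolean>"
    if value is None:
        return f"<json:null{attr} />"
    if isinstance(value, (int, float)):
        return f"<json:number{attr}>{value}</json:number>"
    return f"<json:string{attr}>{value}</json:string>"

doc = {"name": "John Smith", "age": 32, "spouse": None}
as_json = json.dumps(doc)
as_jsonx = to_jsonx(doc)
print(len(as_json), len(as_jsonx))  # the JSONx form is several times larger
```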

I don't really have anything else to say.

Edit: formatting

~~~
andrewcooke
you've got to love the title of the linked page: "help"

there's someone in there, trapped...

------
eddanger
Silly, but interesting. It just shows the power and simplicity of JSON's
syntax for defining universal data structures.

