
I'm not going to argue that XML hasn't been used badly and excessively in a lot of places, it really has, and using every part of it religiously will tie you in knots, fast.

But I can't help noticing that Json is gaining more and more XML-like functionality through things like schemas and JsonPath, as people slowly realise why XML had those functions they're now having to replace. I'm a long way from convinced that all the engineering effort to switch was actually beneficial.




And schemas and paths have much the same issues - they are being used as tools in things like network-exchanged messages when the underlying specs and the implementations out there were not designed with that idea in mind.

You are going to have a bad time if your schema validation tries to resolve schema URLs by default.
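
A library-agnostic sketch of the safer behaviour (the map and resolver below are made up for illustration, not any particular validator's API): resolve $ref only against schemas you registered locally, and refuse to fetch anything over the network.

    // Hypothetical sketch: resolve $ref only against a local allow-list,
    // never by fetching the schema URL over the network.
    const localSchemas = new Map<string, object>([
      ["https://example.com/schemas/order.json", { type: "object" }],
    ]);

    function resolveRef(uri: string): object {
      const schema = localSchemas.get(uri);
      if (!schema) {
        // Refusing to fetch unknown refs avoids SSRF and surprise network I/O.
        throw new Error(`unknown schema $ref: ${uri}`);
      }
      return schema;
    }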

You are going to have a bad time if your JSONPath implementation supports the older "eval" mechanisms, or has unbounded memory/processing time growth from top-down traversal of the JSON.
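
And for the traversal side, a rough sketch (purely illustrative, not from any real JSONPath library) of a "$.." style recursive descent with an explicit node budget, so a hostile document can't cause unbounded memory or CPU use:

    // Recursive descent over every node, aborting once a fixed budget is spent.
    function descend(value: unknown, budget = { nodes: 10_000 }, out: unknown[] = []): unknown[] {
      if (--budget.nodes < 0) throw new Error("traversal budget exceeded");
      out.push(value);
      if (value !== null && typeof value === "object") {
        for (const child of Object.values(value)) descend(child, budget, out);
      }
      return out;
    }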

The issue in the article was purposely avoided in JSON by virtue of JWS not having canonicalization, transforms, or partial signatures. You sign a chunk of binary data, and that binary data might be parsable as JSON.
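
To illustrate: a minimal HS256 signing sketch in the style of JWS compact serialization (Node crypto, purely illustrative). The signature covers the exact serialized bytes; there is no canonicalization or transform step anywhere.

    import { createHmac } from "node:crypto";

    const b64url = (buf: Buffer) => buf.toString("base64url");

    // Sign the payload bytes exactly as serialized; the verifier checks the
    // signature over the same opaque bytes before ever parsing them as JSON.
    function signCompact(payload: object, secret: string): string {
      const header = b64url(Buffer.from(JSON.stringify({ alg: "HS256", typ: "JWT" })));
      const body = b64url(Buffer.from(JSON.stringify(payload)));
      const input = `${header}.${body}`;
      const sig = b64url(createHmac("sha256", secret).update(input).digest());
      return `${input}.${sig}`;
    }

The verifier recomputes the MAC over the same bytes it received and only then parses the payload, so there is nothing to canonicalize and nothing for a transform step to rewrite.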


> But I can't help noticing that Json is gaining more and more XML-like functionality through things like schemas and JsonPath, as people slowly realise why XML had those functions they're now having to replace.

I think there's an analogy here to static typing and gradual typing. XML is a massive pain in the ass to implement and JSON is often good enough. Only having to implement the features you plan on using is quite nice.


For who though?

If you're a user of whatever-data-format designing your new application, you could always use the subset you actually cared about. No-one forced you to use all the complex bits in XML.

If you're a library author - well, yes, you could implement a Json parser at first that was eval(input), then something more complex because that's a security hole, then something else again because that one's too slow, then a new library like JsonPath to get queryability, and... all your work is still less functional than the system you were trying to replace. So yes, you can possibly implement Json libraries in less code than implementing XML libraries. But unless you had a reason to implement a new XML library from scratch anyway, that isn't actually a win.
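
For the historically curious, the eval approach looked roughly like this (illustrative sketch; the hostile input is made up):

    // Early "parsers" were roughly this one-liner:
    function naiveParse(input: string): unknown {
      return eval("(" + input + ")"); // executes arbitrary JS, not just data
    }

    // Valid JavaScript, but not valid JSON: eval runs the attacker's code,
    // while JSON.parse(hostile) throws a SyntaxError instead.
    const hostile = 'console.log("attacker code ran"), {"ok": true}';
    naiveParse(hostile);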


> No-one forced you to use all the complex bits in XML.

Just parsing XML alone is hugely painful, let alone implementing the rest of the stuff like XSLT, namespaces, validation, XPath, etc. Plus, once you've done all that, you still need a natural way to map it into domain types, or you need to force people into a visitor pattern or some other awkward deserialization technique. JSON just requires a single JSONValue sum type.
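
Roughly, in TypeScript (the type name is just illustrative):

    // One recursive sum type covers every possible JSON document.
    type JSONValue =
      | null
      | boolean
      | number
      | string
      | JSONValue[]
      | { [key: string]: JSONValue };

    const doc: JSONValue = { title: "feed", items: [{ id: 1, tags: ["a", "b"] }] };

Compare that with the object model you need to faithfully represent an XML document: elements, attributes, namespaces, processing instructions, comments, and mixed text/element content.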

XML has its place, but it'd have to be a pretty extreme case of needing rigor or a tree where I need to be able to peg arbitrary attributes onto the nodes in order to see it as attractive. Most APIs won't benefit from all of XML's features.

For instance, I maintain a podcasting/RSS feed library, and XML (and, more importantly, the way people publish invalid XML) makes me really wish they had gone with a different format back in the day, one that was harder to fuck up.


Devs just love reinventing the wheel every few years. Maybe we should switch to CSV next just for fun.



