I don’t follow. Your examples are of data formats standardized in the 80s, while Unix was developed in the 60s and 70s. JSON existed well before 2013, but the fact that it became a standard in addition to being popular is the point I was making.
I've spent a lot of time reverse engineering long-dead hardware (and forgotten protocols), which means I've also spent a lot of time reading ancient papers generated by IBM, Bell, ARPA (so many long-gone DoD programs), etc. So I sometimes have a hard time distinguishing between what is common knowledge and what is something that only I and maybe a handful of others care about...
That said, you know that nearly every variation of every possible approach to what I presume your objectives are has already been exhaustively explored, documented, implemented, and finally abandoned, by organizations with functionally limitless resources (i.e. open-ended government contracts), fifty years ago? Have you considered leveraging some of that work? I don't think many people know about the massive amount of work already done that was simply abandoned for a variety of reasons: reasons that often no longer apply, and that very rarely have anything to do with the technology's utility.

For example: everybody here knows about the OSI model and how sparsely filled out it is, but did you know that there is a layer set aside to do exactly what you are talking about, and that it just isn't being used? That's right, layer 6, the presentation layer, specifically the virtual terminal protocol: which is where the designers wanted the object exchange and structured data to go... not in a mess of JSON one layer up. The VTP was outlined in several papers going back to at least '72. You could also lean on the Air Force's work for your data model; they ran that to ground pretty thoroughly with IDEF. You've also got a huge amount of free work from IBM when it comes to structured documents, and architecture that lends itself to semantic reasoning.
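To make the layer-6 idea concrete: the presentation layer negotiates a shared binary encoding of structured data (ASN.1/BER in the OSI world) below the application, rather than shipping JSON text one layer up. Here is a deliberately tiny type-length-value sketch in Python of that style of encoding; the tags, field names, and record are all invented for illustration, and this is nothing like real VTP or BER, just the flavor of it:

```python
import json
import struct

# Toy tags for a minimal type-length-value (TLV) encoding.
# Illustrative only -- real OSI presentation syntax (ASN.1/BER)
# is far richer than this.
TAG_INT, TAG_STR = 0x01, 0x02

def tlv_encode(value) -> bytes:
    """Encode an int or str as tag + length + payload bytes."""
    if isinstance(value, int):
        payload = struct.pack(">q", value)           # 8-byte big-endian int
        return bytes([TAG_INT, len(payload)]) + payload
    if isinstance(value, str):
        payload = value.encode("utf-8")
        return bytes([TAG_STR, len(payload)]) + payload
    raise TypeError("only int and str in this sketch")

def tlv_decode(data: bytes):
    """Decode one TLV record, returning (value, remaining_bytes)."""
    tag, length = data[0], data[1]
    payload, rest = data[2:2 + length], data[2 + length:]
    if tag == TAG_INT:
        return struct.unpack(">q", payload)[0], rest
    if tag == TAG_STR:
        return payload.decode("utf-8"), rest
    raise ValueError(f"unknown tag {tag:#x}")

# The same (made-up) record as application-layer JSON text
# versus a presentation-layer-style binary encoding:
record_json = json.dumps({"id": 42}).encode()
record_tlv = tlv_encode("id") + tlv_encode(42)

key, rest = tlv_decode(record_tlv)
val, _ = tlv_decode(rest)
assert (key, val) == ("id", 42)
```

The point of the sketch is only that the wire format is negotiated machinery beneath the application, so the application never parses text at all, which is roughly the division of labor the OSI designers had in mind for layer 6.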
Anyway, my larger point is that instead of exacerbating the problem of wastefully bloated software teetering on increasingly high layers of abstraction (do a stack trace and consider the insanity of it), maybe the way to really improve our circumstances has already been discovered and then lost for a time. It would be no more difficult than trying to make kornshell-json a thing.
A Survey of Terminal Protocols (1979) DOI 10.1016/0376-5075(79)90001-1
Computer Network Architectures and Protocols (1983) DOI 10.1007/978-1-4615-6698-4
The first version of UNIX may have been created in 1969, but it continued to evolve for 20ish years, and I don't think 1969 UNIX really resembled what we had in 1989. But then the UNIX world stagnated, because Linux people were obsessed with cloning System V. Sometime during the Linux era, text became "cheap" enough to be used as a data format. So perhaps I'm wrong, and JSON's time has come in the UNIX world after all.