
Illustrated Jq Tutorial - MichaelMoser123
https://mosermichael.github.io/jq-illustrated/dir/content.html
======
tlhunter
I've been a die-hard Linux user for about a dozen years. Recently I had to do
some development with MS Powershell. I was very reluctant at first, but after
getting familiar with the technology, I almost fell in love.

"Cmdlets", basically commands used in Powershell, output "objects" instead of
the streams of text used in a more classical shell. Powershell has built-in
tools to work with these objects. For example, you can take the output from
one Cmdlet, pipe it through `SELECT` with a list of fields specified, and get
a stream of objects only containing those fields. Other operations can be
performed against those objects as well, such as filtering and whatnot.
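A rough jq analogue of that field selection, sketched with invented sample data (the object stream and field names here are hypothetical, not from Powershell):

```shell
# A stream of "process" objects, akin to what a Cmdlet would emit.
# jq's object construction plays the role of SELECT-ing fields.
echo '[{"name":"sshd","pid":901,"cpu":0.1},
       {"name":"cron","pid":902,"cpu":0.0}]' |
  jq -c 'map({name, pid})'
# -> [{"name":"sshd","pid":901},{"name":"cron","pid":902}]
```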

Back to normal *nix commands, we're starting to see more and more commands
introduce direct JSON support [1]. There are even tools to translate output
from common commands into JSON [2]. We'll probably see `jq` shipped directly
with modern distros soon. Eventually we'll reach a tipping point where it's
expected that commands support JSON output. Tools like `awk`/`sed` might get
updated with richer JSON support. Finally, we'll have ubiquitous
Powershell-like capabilities on every *nix machine.

Powershell _is_ available on Linux. The model of piping objects instead of
JSON is both powerful and more efficient. (For example, there are no redundant
keys like in a stream of JSON objects, so fewer bytes get moved around, much
like how CSV headers aren't repeated with every row. Plus, binary data is
smaller than text.) But most developers are hesitant to swap out their shell
and existing workflows for a completely new tool, which is why Powershell will
likely only be adopted by a small subset of sysadmins.

[1] [https://daniel.haxx.se/blog/2020/03/17/curl-write-out-json/](https://daniel.haxx.se/blog/2020/03/17/curl-write-out-json/)

[2]
[https://github.com/kellyjonbrazil/jc](https://github.com/kellyjonbrazil/jc)

~~~
Freaky
Though it's pretty immature, nushell has a similar idea, with its own internal
data model being streams of structured, typed data:
[https://www.nushell.sh/](https://www.nushell.sh/)

And back to *nix commands, libxo is used by a chunk of the FreeBSD base tools
to offer output in JSON, amongst other things:
[https://github.com/Juniper/libxo](https://github.com/Juniper/libxo)

    
    
        -% ps --libxo=json,pretty
        {
          "process-information": {
            "process": [
              {
                "pid": "52455",
                "terminal-name": "5 ",
                "state": "Is",
                "cpu-time": "0:00.00",
                "command": "-sh (sh)"
              },
    
        -% uptime --libxo=json,pretty
        {
          "uptime-information": {
            "time-of-day": " 8:34p.m.",
            "uptime": 1730360,
            "days": 20,
    

Be nice to see more tools converted.

~~~
gerdesj
# ip -j a | jq
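The `-j` flag makes `ip` emit JSON, which jq can then slice. A sketch using canned output so it runs anywhere (the addresses are made up; `ifname` and `addr_info` are the field names `ip -j a` emits):

```shell
# Canned sample standing in for `ip -j a` output.
echo '[{"ifname":"lo","addr_info":[{"local":"127.0.0.1"}]},
       {"ifname":"eth0","addr_info":[{"local":"192.168.1.5"}]}]' |
  jq -r '.[] | "\(.ifname): \(.addr_info[0].local)"'
# -> lo: 127.0.0.1
#    eth0: 192.168.1.5
```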

------
onion2k
If you're not aware of what jq is, it's "JSON Query", a CLI tool for filtering
JSON streams -
[https://stedolan.github.io/jq/manual/](https://stedolan.github.io/jq/manual/)
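For a minimal taste, filters are just expressions piped together (the sample document here is invented):

```shell
json='{"repo":{"name":"jq","topics":["json","cli"]}}'
echo "$json" | jq -r '.repo.name'          # pull out one field: jq
echo "$json" | jq '.repo.topics | length'  # chain filters with |: 2
```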

------
gklitt
Very neat project! I think visibility into intermediate stages of a pipeline
can be enormously useful for data restructuring tasks, where the individual
steps aren't necessarily "difficult" to understand, but it can be hard to keep
track of what's going on without a good feedback loop.

Here's a demo of a prototype live programming environment I made for jq, which
similarly shows step-by-step views of the data but also gives live feedback as
you construct your pipeline:

[https://twitter.com/geoffreylitt/status/1161033775872118789](https://twitter.com/geoffreylitt/status/1161033775872118789)

By the end of that Twitter thread I ultimately morphed it into a tool for
building interactive GUIs (e.g., get API data as JSON, use jq to morph it into
the right shape for your UI, output to an HTML template).

------
dang
Please don't put "Show HN" on posts like this. See the rules at
[https://news.ycombinator.com/showhn.html](https://news.ycombinator.com/showhn.html).

I'm sure it's fine reading material, but if we allowed Show HN to be reading
material, every submission would be a Show HN.

------
jzelinskie
Shameless plug: if you really like jq, I built a project that uses libjq to
process various formats.

[https://github.com/jzelinskie/faq](https://github.com/jzelinskie/faq)

~~~
dastx
If you'd like to avoid cgo, you can use a pure go implementation of jq called
gojq: [https://github.com/itchyny/gojq](https://github.com/itchyny/gojq)

~~~
jzelinskie
Thanks for this link. If you'd like to create a GitHub issue, I'd appreciate
it. My justification for linking against libjq is that jq is a moving target:
various builtins get added across updates, etc.

------
ilSignorCarlo
Great content! It's just a bit annoying that one has to click on every portion
of the command to see the "illustrated" part.

------
Kaze404
Why does the reader mode icon not show up for this page on Firefox Android? I
don't think it's possible to read on mobile atm.

~~~
terenceng2010
There is a specific requirement to trigger that. Not sure whether desktop and
Android Firefox is using the same criteria though.
([https://stackoverflow.com/questions/30661650/how-does-firefox-reader-view-operate](https://stackoverflow.com/questions/30661650/how-does-firefox-reader-view-operate))

edit: the article does not have <p> tags, so reader view is not triggered.

------
kylepdavis
Nice overview of jq! You may want to say demonstration rather than
illustration though.

I liked jq, but for simple tasks I liked json, a similar npm package, a little
bit better.

You can find more about it here:
[https://github.com/trentm/json](https://github.com/trentm/json)

As a JS dev I tend to have node installed anyhow so I just use a shell alias
to wrap ‘node -pe’ these days. It’s not really for shell scripts but it’s
great for quick every day usage. Plus you can use JS if needed instead of
their DSL.

Here's the code for the alias in my shell profile:
[https://github.com/KylePDavis/dotfiles/blob/master/.profile#L90](https://github.com/KylePDavis/dotfiles/blob/master/.profile#L90)

------
soheilpro
Shameless plug: If you use jq a lot, catj
([https://github.com/soheilpro/catj](https://github.com/soheilpro/catj)) can
really help you with writing query expressions.

~~~
oftenwrong
I use gron to do this same thing:

[https://github.com/tomnomnom/gron](https://github.com/tomnomnom/gron)

Some options of gron I use often:

--stream, which treats the input as "JSON lines" format

--ungron, which converts from the flat format back to JSON
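Incidentally, a rough approximation of gron's flattening can be written in jq itself (leaf paths only; real gron also prints the intermediate objects and trailing semicolons):

```shell
echo '{"a":{"b":[1,2]}}' |
  jq -r 'paths(scalars) as $p
         | "json\($p | map(if type == "number" then "[\(.)]" else ".\(.)" end) | add) = \(getpath($p))"'
# -> json.a.b[0] = 1
#    json.a.b[1] = 2
```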

------
hugg
Is there something similar for YAML? I've tried `yq` briefly but weirdly
enough it doesn't seem to accept standard input in the way that jq does (ie
pipe in some json, and output some pretty json)

~~~
MichaelMoser123
It shouldn't be too difficult to convert between YAML and JSON. Funny, I
couldn't easily find a lightweight converter. I think I will try to write one.

~~~
mason55
Basic YAML, sure, but YAML has some insane features that I don't think would
be possible to replicate in JSON. Or, at least, you'd lose some information in
the conversion.

For example, YAML supports the concept of reusable fragments. You can define a
fragment in one place and reference it further down in your YAML file. A JSON
converter could take the final YAML output and turn it into JSON but you would
lose the context of the fact that in the original YAML it was an included
fragment and not just the same section repeated a few times.

~~~
loeg
Yeah, YAML in its original Ruby incarnation can also embed arbitrary code, and
that clearly cannot be translated to JSON.

And yeah, there's no straightforward translation of references to JSON. You
could provide translations in both directions, but it would be an
implementation-specific convention and not something other JSON tools
understood.

------
toisanji
The tutorial could be dramatically improved by showing some JSON data and then
the results of processing it.

~~~
choward
If you click on the query string in the command it does. Not intuitive at all
though.

------
wonginator1221
Nice! I struggled to learn jq initially and I made a similar page for my team.

One suggestion is to use with_entries as a replacement for the 'to_entries |
map(...) | from_entries' pattern. For example:

    
    
      jq '.metadata.annotations | with_entries(select(.key == "label1"))'
    

is equivalent to

    
    
      jq '.metadata.annotations | to_entries | map(select(.key == "label1")) | from_entries'
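
Both forms can be checked against a toy input (the annotations here are
invented, loosely styled after Kubernetes metadata):

```shell
input='{"metadata":{"annotations":{"label1":"a","label2":"b"}}}'
echo "$input" | jq -c '.metadata.annotations | with_entries(select(.key == "label1"))'
echo "$input" | jq -c '.metadata.annotations | to_entries | map(select(.key == "label1")) | from_entries'
# both print: {"label1":"a"}
```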

~~~
philsnow
Not related to with_entries, but I didn't see anywhere else in this thread
that mentioned dealing with awscli output

from_entries handles nicely the Tags in a lot of awscli output, you can do
things like

    
    
        aws ec2 describe-instances | \
            jq '.Reservations[].Instances[] | 
                {Role: .Tags | from_entries | .role,
                 Name: .Tags | from_entries | .name,
                 Id: .InstanceId}' \
            -C -c | sort | less -R
    

to get a summary of all your instances sorted by role.
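The trick is that from_entries also accepts the capitalised Key/Value names awscli uses for tags. A self-contained sketch with invented tags:

```shell
tags='[{"Key":"role","Value":"web"},{"Key":"name","Value":"api-1"}]'
echo "$tags" | jq -c 'from_entries'
# -> {"role":"web","name":"api-1"}
```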

------
alexellisuk
I have been using jq for years and still can't get it to work quite how I
would expect it to. kubectl's jsonpath seems just about workable.

~~~
jcims
I don't know what the term would be (mental model?), but I just can't get jq
to click, mostly because I only need it every once in a while. It's
frustrating for me because it seems quite powerful.

~~~
loeg
I tend to only use it for queries much simpler than what it's capable of.
Ditto regular expressions. But I don't find myself missing the advanced
features of either very often. If I run out of ability with jq, the man page
and playing around with the query sometimes illuminate the correct answer. And
if not, it's often time (for me) to switch to a less ephemeral program anyway.

------
fokinsean
I like how you show the output for each step of the pipeline. However, it
would be much better in terms of usability if the result loaded dynamically
just below the query rather than opening a new page.

With that said this is a great overview!

~~~
MichaelMoser123
I added another version where all the links are part of the same page, as
inline divs that are displayed:
[https://mosermichael.github.io/jq-illustrated/dir-single-file/content.html](https://mosermichael.github.io/jq-illustrated/dir-single-file/content.html)

------
enriquto
I'd like to have a kind of "rosetta stone" where each of these examples is
rewritten by passing the JSON to "gron" and then using the standard unix
tools.

I guess some of the examples would be simpler than the jq solution.

~~~
loeg
Kudos for sharing "gron," I hadn't heard of that tool before and it looks
quite useful:
[https://github.com/tomnomnom/gron](https://github.com/tomnomnom/gron)

~~~
enriquto
For a truly unix experience, filter the output of gron through this

    
    
        grep -Ev '({}|\[\])' | tr -d \; | cut -c 6-
    

It will remove the useless cruft that is added to make the output valid
JavaScript.

------
fs111
Apart from all the useless uses of cat, great content!

~~~
joppy
God forbid that someone should ever write (- y + x) rather than (x - y), what
a useless use of the plus sign!

Why have a problem with this? Catting a single file is a well-known idiom for
outputting its contents into a stream, plays well with positioning in
pipelines, and has the nice property that you can erase as much of the
pipeline as you want, to be able to peek inside it at any point.

~~~
cormacrelf
Completely agree. Cut it if your file is 10GB of logs, since it requires an
extra stream copy, but otherwise it is the best way to interactively drill
down into data and write the pipeline as you go. It is almost never a valid
criticism, and when it is, calling it a Useless Use Of Cat (which reads as an
insult to the author!) was the worst way to communicate that to the masses.

------
majkinetor
This is so much easier with Powershell. V7 is totally cross-platform and I
can't see why people have a problem using it, if nothing else then for the
`ConvertFrom/To-Json/CSV/Whatever` cmdlets.

~~~
dotancohen
I'd love to see an example. Let's say I'm on a Debian server. How would I
acquire Powershell (is it GPL/MIT?) and use it to convert some JSON?

~~~
brianjlogan
Powershell Core and .NET core are MIT Licensed as of 2016
[https://github.com/PowerShell/PowerShell/blob/master/LICENSE...](https://github.com/PowerShell/PowerShell/blob/master/LICENSE.txt)

Debian install instructions: [https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell-core-on-linux?view=powershell-7#debian-8](https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell-core-on-linux?view=powershell-7#debian-8)

Enable https, add feed, apt install

As someone who started out scripting in Powershell but preferred Linux as an
OS, I found myself missing Powershell's object passing, as opposed to bash's
string passing.

That being said I now primarily use iPython for advanced shell tasks as I can
leverage all of Python's libraries like JSON or YAML.

~~~
nailer
> That being said I now primarily use iPython for advanced shell tasks as I
> can leverage all of Python's libraries like JSON or YAML.

TBH they're both excellent choices. nushell looks really good too. The point
is not to scrape text: rather than write a bash script that can handle JSON
properly, write a pwsh/python/nushell script that handles everything properly
(i.e., by selecting fields rather than scraping text).

~~~
majkinetor
Yeah, let's wait 10 years for nushell to catch up.

