How to Learn Unix Tools (nindalf.com)
169 points by nindalf on April 26, 2021 | 86 comments



The best way is being restricted to dumb terminals for compiling and running university assignments.

I don't know how younger people force themselves to learn to use the command line. I had a Windows 98 machine at home at the time and the university environment seemed primitive in comparison, but I had no choice.

And we were lucky, we had X-terminals!


For even faster learning make sure the remote machine is far enough away that network lag is a factor. It’s amazing how fast the student will learn I/O redirection, tailing files, and grepping for needles, and of course automating the entire build/test lifecycle.
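
The survival kit looks something like this (a generic sketch, assuming a make-based build):

    make >build.log 2>&1 &      # capture everything; don't babysit the link
    tail -f build.log           # watch progress without re-running anything
    grep -n 'error' build.log   # find the needle without paging over the wire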


I first learned not only the Unix CLI but Vim under these conditions.

I typed the HTML for my student website by hand in a minimal and ancient Vim build, because student websites were hosted on some kind of Unix mainframe.

It probably took me 2 hours to write 5 sentences! I was so proud of myself that I added a footer to the effect of:

> This page was typed painstakingly in Vim. "What doesn't kill you makes you stronger."

I'm sure I would have eventually forced myself to learn Vim anyway if I never had access to this Unix system, but it was definitely important and formative to start off this way.


When I was at uni and I logged in from home with my 1200 baud modem, I used vi's "open mode" (no, vim doesn't have it), which is visual mode, but restricted to one line at a time. I even had a shell alias for "ex +o $*" to start it. It was quite fun to use, but nowadays it's not terribly useful anymore, so I can see why vim hasn't implemented it.
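
For the curious, the alias is easy to recreate (a sketch; it only does anything with an ex that still supports open mode, which as noted vim's does not):

    # POSIX shell equivalent of the old alias
    o() { ex +o "$@"; }
    # or in the csh of the era: alias o 'ex +o \!*'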


Having used a 3270 terminal in an extremely saturated system, it's amazing how much your way of working changes. Typing a full page of code, checking for typos, doing a test run in your head, then finally pressing Enter and sipping your coffee while the system took 1-2 minutes to process the screen.

As soon as it was upgraded, we reverted to the normal workflow of pressing Enter first, then correcting typos, paginating instead of searching, debugging live instead of in our heads, etc. I don't think we were more or less productive either way, they're just different ways of working.


And compression


<yorkshire_accent>Luxury!!</yorkshire_accent>

When I were a lad we had a teletype terminal in Pollock Halls (Edinburgh) to complete our assignments (well there were some VT100s but with deadlines coming up you had to take what was free!).


Over at Heriot-Watt we had SGIs and Windows boxes aplenty and I still found myself working in a little office on a VT100 hooked up to a Sun 3 (which ran a forgotten line printer). At first just because there was never any demand for the room, then later because it was just more fun.


Teletype! Luxury!

Why, when I was a lad we had to get up 10 minutes before we went to sleep, mill our own paper AND punch the cards, and if you left an instruction out it was no supper for a week.


Yeah, it helps to have better equipment. My initiation consisted of getting stuck in emacs on a VT220 connected to a MIPS DECstation.


I don't remember if it was a VT220 or VT320 that did it, but I had to use one in the undergrad computer lab (also connected to some kind of DEC running Ultrix) where, whenever emacs did a screen redraw (for example a page down with C-v), it would drop into I-search mode: something in the terminfo entry used a C-s to freeze the screen (then some more commands, then a C-q to un-freeze), and that C-s was visible to emacs.
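
(For reference, when the tty layer reserves C-s/C-q for flow control like this, the classic workaround is to turn that off, assuming the terminal can live without it:)

    stty -ixon    # stop the tty driver from treating C-s/C-q as stop/start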

Anyway what I'm trying to say is just as I'm glad I didn't have to use punch cards, I'm glad that the youth don't have to deal with that kind of nonsense.


The default options for shadow IT are VBA and PowerShell, since you don't need to ask permission or install anything to make an MVP.

I guess there will be many PowerShell users in the future for that reason alone.


I've worked with a client that blocks cmd, but not Powershell or VBA. God, I hate security-by-explicit-requirement.


I have a few dumb terminals connected to my machine. I wanted to get an X-terminal but I’m holding off because I’m worried about the incompatibility between modern X-clients with whatever extensions they may use and the old X-servers typically running on the terminals.


You're most likely to have problems with video playback. Most other things should work quite well.


That sounds good, I had a hunch video playback wouldn’t work. My only other hesitation is that the particular ones only connect over 10Base5, which would involve some extra networking work and SLIP, which I’m not sure is even supported on modern Linux given PPP seems to be the go-to for that sort of thing now.


Does it actually have onboard 10Base5 or an AUI connector? If the latter, there are millions, or at least dozens, of 10BaseT to AUI transceivers sitting in people's junk drawers. EBay has a bunch at $10 and $15 price points.

Then it's just plugging it into a switch port.


This post definitely contains some good general advice, to me the most important step is really simple:

Just use them. And keep on using them.

Resources like those suggested are good, but each will suit people differently - the one thing that all people who are expert at these tools have in common is that they've all used them a lot!


> Just use them. And keep on using them.

The last bit is crucial. Be prepared to relearn it after a prolonged break, so taking notes may be of some practical help.

I wish *NIX land had some uniform convention for option naming and a targeted help system to go with it, so one could look up an option directly (preferably in shortened form, no need to spell out the whole word). OpenVMS HELP was a good example of a user-friendly command-line help system, but that's another story...
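
Lacking that, a rough approximation is searching the page text for the option itself (or just typing "/-l" inside the pager), e.g.:

    man ls | grep -A2 '^ *-l '    # the lines documenting ls's -l flag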


And, to be fair, give yourself a safe enough space to experiment.

Use Docker, libvirt, DigitalOcean, multipass, VirtualBox, or whatever you want, but find a place where you can use the stuff without worrying too much about the consequences :)
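
With Docker, for instance, a fully disposable root shell is one command away:

    # --rm throws the container away on exit, so nothing you break survives
    docker run --rm -it ubuntu bash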


Still, after almost 40 years, Kernighan and Pike's book is a great reference:

https://en.wikipedia.org/wiki/The_Unix_Programming_Environme...

It may be a bit dated in some inconsequential details, but it is an extremely refreshing and fun read.


The yacc and lex tutorial to build hoc is also excellent.


In terms of getting started with the command line and getting an understanding of what you are doing (rather than the handful of “memorise these commands” lists that some beginner tutorials sadly end up being), I’ve found “Learn Enough Command Line to Be Dangerous” to be absolutely amazing.

It’s free, even though the webpage makes it look like you might need to buy a subscription (edit: you need to pay beyond chapter 2). It takes half a day to a whole workday to work through, but for me it was a great investment of time.

https://www.learnenough.com/command-line-tutorial


The tutorial looks neat. It asks me to start a $5 subscription to get past Chapter 2 though.


Looks like they also offer the book for $3 “à la carte” (without a subscription), which sounds reasonable.


My bad. It used to be free and I just clicked through to the second chapter and that convinced me it was still the case.


At the bottom it says "continue reading", and it took me to chapter 2 and from there to 3:

https://www.learnenough.com/command-line-tutorial/manipulati...


Chapter 3 is locked - at least for me.

Though $3 is probably a small price to pay for the standalone version.


For users, The UNIX Programming Environment by Kernighan and Pike is the book that got me going. There are a few things in there that don't exist now but it is the most concise easy-to-read book for new *nix users.

For administration the UNIX System Administration Handbook by Nemeth, Snyder, Seebass, and Hein is how I got started. Newer editions of this book cover Linux specifically.


The UNIX System Administration Handbook was the bible when I got started out. A tremendous book with a sizable legacy.


1. read the manual pages in book form.

2. play a man page quiz with a friend: read a random command and ask for the description, or read a random description and ask which command it is for (a scripted solo version is sketched below).

I've done both as an undergraduate. The HP-UX manual set remains on the shelf right next to me. Over the years, I've found that letting candidates compose a pipeline of commands to solve a problem is a great way to make interviews more fun ("How would you solve this problem without writing a program, by combining existing ones?").
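
The solo version of the quiz mentioned above is easy to script, something like this (assumes GNU shuf and the usual /usr/share/man layout):

    # Pick a random section-1 man page; try to recall the command's purpose,
    # then check against its one-line description.
    page=$(ls /usr/share/man/man1 | shuf -n 1)
    whatis "${page%%.*}"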


That's useful, but as someone with an unreliable memory, having to recall specific commands under pressure doesn't sound too fun :)


I don't think absolute precision regarding specific commands or flags would be required, because when you're in the chair and dealing with a problem you're going to have access to the resources you need to get that precision (man pages, vendor documentation, etc.).

Though being able to explain the mental model of how you want to go about manipulating the output and extracting the information or data you need, be it through JSON manipulation, regex, line/column parsing etc., is valuable.


Exactly, the man pages are there so I don't have to remember every command perfectly.


I did the "man pages in book form" when studying for my LPIC. Also flashcards. Both approaches worked excellently :)


> Most of these tools come with a --help or -h switch, that tells you the options available, with a short explanation of what each does.

Please don't do this by default unless you're sure the command supports that flag; otherwise you're literally running unknown commands with unknown consequences.

The first port of call should always be `man` and not `--help` like the article suggests. Aside from that, the article looks good.


If `--help` does anything besides display usage info or fail, the author of the tool is to blame.

Many applications that are installed outside the scope of a package manager, especially tools distributed as a single binary, lack a manpage.


> If `--help` does anything besides display usage info or fail, the author of the tool is to blame.

No they're not. Not every command line tool uses flags, some use ordered parameters. So parameter 1 might be a file name to write to rather than a flag. Just look at how messy it is working with tools like `cp` and the `--` flag to get an idea of the problems some programs face differentiating between files and flags.
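
To make that concrete, the `--` end-of-options convention exists exactly because operands can look like flags:

    touch ./-rf     # a perfectly legal file name
    rm -rf          # parsed as flags; rm complains about a missing operand
    rm -- -rf       # '--' marks the end of options, so the file is removed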

Other tools might not take any command line parameters at all. So it's entirely forgivable for them not to check if any parameters were supplied.

Then there is the issue that some programs accept `-h` and others `-?` as a shorthand. Some programs have `-h` as short for `--host`.

Plus even if I were to agree with your point, which I don't, it's entirely moot because it's not the author of the tool who has to fix any issues that befall a host system due to someone blindly running commands.

So whichever way you cut it, blindly throwing flags at executables and expecting a favourable outcome every time can be risky.

> Many applications that are installed outside the scope of a package manager, especially tools distributed as a single binary, lack a manpage.

I hadn't forgotten that point. But if you've downloaded an application outside the scope of the package manager, then you should at least have some idea how it functions and thus know whether it's safe to run `--help`. If not, you probably know where to look online to find that information.

Like I said in my OP, I've got nothing against people using `--help` et al. I use them myself. But it shouldn't be the first thing you recommend UNIX newbies to do when they're figuring out a command. It's usually something you do when you're already familiar with what a command does but need a nudge as to what flags you need to supply.


What is a program where `--help` is dangerous?


For starters, anything that’s a shell script. Maybe add Perl to that equation too (I’ve seen all too many Perl scripts that didn’t properly check parameters).

I’ve also seen plenty of enterprise UNIX era software not follow GNUisms (which `--help` technically is). It’s all good and well if you’re just running GNU/Linux with modern DevOps tooling, but don’t expect `--help` to work on every Solaris or BSD application. Not to mention relics like Informix and its tools.

Also anything hacked together in-house should be treated with caution too. Open those files in $EDITOR and/or examine the README (if you’re lucky enough to have one) first, because there’s no telling how lazy your predecessor might have been.
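
In practice that first pass can be as simple as this (tool name hypothetical):

    file "$(command -v mystery-tool)"   # script or binary?
    less "$(command -v mystery-tool)"   # if it's a script, skim it before running anything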

So yeah, over my career I’ve run into numerous instances where `--help` might have had unforeseen consequences.


This is my reasoning.

There are almost no man pages at my job and the software is very old, so good luck Googling. I practically default to --help here. I figure if --help somehow screws something up, then they deserve it.


For example, the Postgres interactive client, psql:

   -?, --help -> display help info
   -h -> set the host you are connecting to


> I've never read a man page from start to finish, and I'm not sure you're meant to.

When I was 16 and just starting to get into Linux, I got a tiny little paperback, "reference to Unix programs" or something. It was literally just a list of every single Unix program and a shortened version of its man page. I read it cover to cover. After that, if I needed to do something on the command line, I knew there was a program to do it and what its capabilities were. After a couple weeks I didn't need the reference anymore.

I can't remember the book anymore, but this is basically what it was like: https://dspinellis.github.io/unix-v4man/v4man.pdf See if you can find a version newer than 1973, though... O'Reilly's Unix In A Nutshell isn't bad.

Read your man pages cover to cover. It's for the same exact reason you learn a bunch of "useless BS" in school: to save you time down the road because you'll know where to go for the solution.
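
The index stays searchable after the details fade, too: apropos (aka man -k) greps exactly those one-line descriptions:

    apropos partition    # every installed page whose description mentions partitions
    man 8 fdisk          # then open the right page, in the right section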


Honestly, I forced myself to use a terminal and now I can’t even go without using it.

I have made my workflow very minimal: I use Firefox for browsing and the terminal for pretty much everything else.


Try Qutebrowser [0] instead of Firefox as it has a terminal-like command line interface (while staying graphical), no mouse required.

[0] https://qutebrowser.org/


I think I have tried it. The only reason I don’t use it much is that it uses a lot of RAM, plus it’s made with Python, which makes it slow compared to other browsers.

Maybe it’s just my opinion, but I'm happy to hear your thoughts on this.

I use a Vim extension in Firefox, btw.


Personally I use Qutebrowser on the same screen next to my terminal when I code. That allows me to be mouse-free during coding. It’s a great experience. Not seeing it being slow, I feel that the Python-hackability gives me superpowers. Can’t judge the RAM consumption, my machine has sufficient RAM (40GB). For private use (banking, email) I still use Firefox though.


Wow, you have 5 times more RAM than me. Anyway, I have used Qutebrowser and it makes things slow, especially when I’m watching something on YouTube, which is why I stuck with Firefox.


Shameless plug: mann helps you remember command line args.

https://github.com/soheilpro/mann


This is awesome and so simple that I can't believe I haven't done it for myself already. It would be really useful to have your personal mann page displayed as the header or footer of the official man page.


how I learned UNIX tools:

(On NetBSD) I went alphabetically through /bin, /usr/bin, /sbin, /usr/sbin and /usr/games and read each program's man page. Yeah, took a while. Also read POSIX.2.
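
For anyone tempted to repeat the crawl, it's one loop (a sketch; assumes each binary has a same-named page):

    for cmd in /bin/* /usr/bin/*; do
        man "${cmd##*/}" || echo "no manual for ${cmd##*/}"
    done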

The things you do with your free time ...


Sucks to not learn about zsh until the very end though ;)


Haha! I understand the sentiment :)

But it was ok, NetBSD doesn't have zsh installed by default (nor bash; only csh, sh and (pd? m?)ksh). When I did that, I was primed to use tcsh at $WORK. Switched to ksh later (and remained there for a loooong time).

Anyways. I failed to make an actual point. My point was the following: the BSD manpages are _excellent_. IMO way better than the Linux/GNU manpages (or info pages) of comparable utilities. It shows, again IMO, that the BSDs are a complete, single-tree distribution where documentation, userland and kernel go hand in hand and are equally well groomed.

So maybe, for learning "UNIX", go get yourself a nice little {Net,Free,Open}BSD install and toy around with the elder ways.


FWIW it was just a silly joke. I was going to write xargs first, but then realized that there's a couple of letters after x. I'm not even a zsh user myself.

I honestly agree that reading through the man pages is a good idea (after you've acquired some basic knowledge already, of course), and as a Linux user I'm a bit envious of many things BSD, the documentation included.


There are irc channels that can also be of help.

I used to spend my time in #sed and #awk, and try to answer any short question that came up in the channel. They came up at a good pace, I don't know if this is still the case.

It was a really good exercise, and it has an additional social aspect you will not get from reading books (which is a great resource too).


I often look at web versions of man pages because it's just plain easier to deal with when researching mega-commands like curl or openssl. The green-screen-text-only bastard in me argues vehemently against this, but the pragmatist in me doesn't care. It's just plain easier.


How so? You can navigate much more easily inside the less pager that displays manpages than inside a browser. Moreover, it is easier to copy/paste (or simply read) snippets of the manpage into your terminal, after all everything happens in the same window. Opening a web browser and finding the man page also seems much more cumbersome than typing "man". I would never have thought that it would seem easier to anybody!


I have recently published an ebook about the introduction to the Linux command line, for anyone interested: https://gumroad.com/l/moderncommandline


It's almost bizarre how cmatrix takes up 100% CPU on my computer. I suspect it will be more efficient in xterm rather than Terminal.app though.


100% CPU is a small price to pay to look like a hacker. :)


> I suspect it will be more efficient in xterm rather than Terminal.app though.

Indeed. I get 3% CPU usage on my xterm, 5% when in full screen. Still too much for a barely 6-year-old laptop, but nothing to worry about.

EDIT: if you add the "-a" option (asynchronous scroll), CPU usage falls below 1%. I wonder what Terminal.app does to be so outrageously inefficient.


I think it's a combination of a larger window, antialiased text, arguably better font, semi transparent background, terminal being 256-color (or more), and so on. By passing -a I get it down to 80% CPU. This is progress, guys!


It can't be that.

I have all of this except the transparent background. But that should be a compositor thing, independent of the window contents. That's bizarre, pushing a few thousand characters to the screen shouldn't, by far, reach 80% CPU usage. There's something in your system that's really botched up.


I tested again, it's Terminal.app doing something! I tried with iTerm and CPU usage is negligible. Wonder what it is doing. My system is otherwise very fast.


I did not know about `cmatrix`. Looks so cool. I wonder why its CPU usage shoots beyond 40%, though.


Is the program itself chewing the CPU, or is it your terminal app doing so to render many individual changes per frame? Many terminal emulators are far from efficient in that sort of use case, as they simply aren't optimised for it at all; it's relatively rare.

It could be a mix of both, of course.


The `tldr` program seems useful to know about, thanks.


Yep, I've just installed it - looks like it will be perfect for when I have to do things that I rarely do (and just want to quickly know how to do something).


You might be interested in the Rust implementation dbrgn/tealdeer. There is also denisdoro/navi which can use both cheat.sh and tldr, and you can also write your own cheatsheets.



Plan 9 man pages are also much more manageable than their respective GNU versions.

The tools in Plan 9 are also much simpler, with fewer options (and also a few differences), but it's a very useful subset, you will rarely need anything else.


I like tldr, but I find about half of the time it doesn't cover the use case I am looking for. I do find it useful for remembering arg order and such.


Wow, no love for info?


The genius of man is that it uses any standard system pager, and so the navigation semantics (admittedly simple) are clear.

Info uses a dedicated document viewer which, unless you use it frequently, is novel and idiosyncratic.

(This can be changed, but more painfully than with man.)

Info is, admittedly, a hypertext documentation system. It happens to have been invented pretty much simultaneously with another one you may have heard of, the World Wide Web. A system with slightly greater user familiarity. And Free Software implementations.

Debian's 'dwww' is an interesting marriage of multiple documentation formats. It presents manpages, info pages, Debian package information, and additional documentation, all categorised and indexed, and accessible through a web browser from your local system.

Including, as it happens, console mode browsers (lynx, w3m, elinks, links2, etc.). Which tends to amp up and standardise access to all of the above.

dwww is the one thing which makes info actually useful. Which is a shame as there really is some quite good info documentation, it's just buried under what is for most people, an impenetrable interface.


In what way is info's interface "impenetrable"? It takes two minutes to look up its keybindings, most of which happen to be mnemonic: h-help, n-next, p-previous, u-up, l-last, q-quit, space-scroll down, shift+space-scroll up, tab-next link, shift+tab-prev link. You probably wouldn't ever need anything more than these 10 keybindings.


What does info add over plain old manual pages? I honestly don't really know, since I usually forget that info even exists.

I feel like the effort the GNU project put into creating a competing standard for documentation would have been better spent improving the tooling around the standard man page mechanisms instead.


> What does info add over plain old manual pages?

I feel like info pages are more suitable for explaining the features of larger pieces of software. Can you imagine what a mess the man page for emacs would be, if it were to try to explain all its functionality? Instead the man page of emacs only gives a brief description of the software and explains the command line options and all the functionality is explained in an info document.

I think of info as an alternative to /usr/share/doc, not man pages.

The most important features of info that are missing from man:

1. pagination
2. cross-referencing
3. structure
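
GNU coreutils shows the split in practice: the man page is the summary and the info manual is the full story, as the man page itself points out:

    man ls                             # terse option reference
    info '(coreutils) ls invocation'   # the full, structured manual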


There is some truly great info-based documentation. The GNU AWK User Guide comes to mind:

https://www.gnu.org/software/gawk/manual/gawk.html

There's a lot of information in there that is not in the manpage.

(I'm painfully aware of this as I tend to rely mostly on the manpage, and on the rare occasions I read the GNU docs I learn things every time.)

At the same time, the info interface and navigation remain completely opaque and nonintuitive to me.


I never knew about tldr. That tip alone is phenomenal. I'm embarrassed to say I've been doing this 20 years and never seen it. Trying to pull out several common usage examples from man pages has always been my difficulty.


Try this:

    man man | awk '/^[A-Z]+/ { begin = 0 } /^EXAMPLES|^NAME/ { begin = 1 } (begin == 1) { print }'
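
Wrapped in a function (the name `eg` is made up), the same trick works for any page with an EXAMPLES section:

    # Print only the NAME and EXAMPLES sections of a man page
    eg() {
        man "$1" | awk '/^[A-Z]+/ { begin = 0 } /^EXAMPLES|^NAME/ { begin = 1 } (begin == 1) { print }'
    }
    eg tar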


man pages often lack concrete examples of how a tool can be used. tldr seems like a good fix for that. cheat.sh is also a great alternative to tldr; I use it a lot.


I regularly google "man {x}": man grep, man curl... And a couple of times a year I absentmindedly google "man find". There's no Linux in the first 30 pages of results.


DDG bang searches:

manpage !manpage

Debian Manpages !debman

manpages.org !mnp

MirBSD Manpages !mbsdman

Ubuntu Manpage !uman


“Trial and error is the way to go.” I hate it. And that’s why I never made it far into unix land.


> “Trial and error is the way to go.” I hate it.

Errr... isn't all of science based on trial and error?


Unix has a manual, you know.



