Hacker News
Can you use the terminal for everything? [video] (youtube.com)
156 points by bane on Sept 1, 2018 | hide | past | favorite | 92 comments


The guy is absolutely right. Mutt is not a gimmick like ascii video on terminal.

It actually makes a lot of sense as an email client and is surprisingly ergonomic. I also enjoy the fact that my config is a normal dotfile that I can version control, etc. Also, your passwords can be encrypted using the tools that you choose, as opposed to whatever home-grown crypto (if any) the [insert random Electron email client] uses. Some ways are outlined here:


Oh, man, I still miss mutt. I should see about getting it set up again. I bet someone has a good config online somewhere for using it as an IMAP client with Gmail...

The only thing that might throw people off is that it uses nano for creating your email... (from the video)

Actually, IIRC it uses the EDITOR environment variable, with some sane fallbacks. I happily wrote emails in vim for years. That said, defaulting to nano with a quick and well-known way to swap your buffer to vim (and possibly back) might be a better solution overall. I love vim and use it as my IDE, but the cost/benefit ratio isn't in its favor when your average time spent on a reply might be very short or when you're writing freeform. Especially if your default config has word wrap off because you use it as an IDE...
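A minimal sketch of the two knobs involved (the choice of vim is just an example, pick whatever you like):

```shell
# Mutt composes mail in $VISUAL/$EDITOR unless told otherwise; either works:
export EDITOR=vim                        # shell-wide fallback
echo 'set editor = "vim"' >> ~/.muttrc   # explicit Mutt-level setting
```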

The other thing to make sure you nail down in Mutt is to set up links or elinks as your HTML view handler, so HTML-only emails can actually be read without extreme difficulty for those cases where a client sent an HTML email without a plain-text companion.
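The usual recipe for this is one mailcap entry plus two muttrc lines; a sketch, assuming elinks (swap in links or w3m to taste):

```shell
# Tell mailcap how to dump HTML to text, and tell Mutt to use it inline:
cat >> ~/.mailcap <<'EOF'
text/html; elinks -dump %s; nametemplate=%s.html; copiousoutput
EOF
cat >> ~/.muttrc <<'EOF'
auto_view text/html
alternative_order text/plain text/html
EOF
```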

The thing to remember for any program that allows complex configuration like mutt is that you can often search for other people's configs[1] and copy them as your own, and once you get better, you can copy portions of them and tweak as needed to get your own specific customized behavior. I ran FVWM2 as my window manager for over a decade, and I remember every 5-6 months I would do some searching to find interesting new behavior people had implemented that I could tweak and steal (like here[2]). Every time some interesting new UI feature would come out somewhere someone would implement it in FVWM within a couple months.

1: https://gitlab.com/muttmua/mutt/wikis/ConfigList

2: https://www.box-look.org/browse/cat/143/

When I set up my personal gmail with Mutt, I used the following guides, if you're interested. None of them are perfect, so I did what most people do and took bits and bobs from all of them:

1. [https://smalldata.tech/blog/2016/09/10/gmail-with-mutt]

2. [http://stevelosh.com/blog/2012/10/the-homely-mutt/]

3. [https://baptiste-wicht.com/posts/2014/07/a-mutt-journey-my-m...]

"The guy is absolutely right. Mutt is not a gimmick like ascii video on terminal."

I personally use (al)pine and I could not agree more.

But here is a different, and I hope very provocative, reason that you should consider a terminal email client:

Local mail delivery does not traverse the Internet, or any network.

So, for instance, when I send an email to one of my engineers at rsync.net, even though it is sent in plaintext, it is a private communication and cannot be intercepted by either an ISP or some global observer. It's nothing but a local file operation between two mailspools...

Further, it would be extremely difficult to phish someone who is viewing their email in the terminal. The effort that it takes to view and copy a URL is greater than the discernment required to identify it as suspicious - so you'll catch them every time.

Is Alpine still getting security updates? UW doesn't maintain it anymore, like, at all. It exists as a git repo out in the ether. [1]

I'd probably run it in a non-privileged container.

In fact, I should be doing that for Mutt ...

[1] http://repo.or.cz/alpine.git

"I'd probably run it in a non-privileged container."

Does mutt talk to the network, have a daemon, or run as a service? I don't know the answer to that, but I know that for my own use case, pine is basically a text editor. It executes no programs or code and does not talk to anything but the local SMTP daemon.

How much sanity checking and bounds checking does sendmail (or postfix or whatever) do on email? That's the question, I think, because again, in my case, the threat vector is very limited - basically an arbitrary (weird) email, or piece of an email, that somehow breaks pine.

So no, I don't run it in a container - in fact, I run it as the user whose mail spool is the highest-sensitivity asset that user owns ... so a container wouldn't help. If you get the mail spool, you get everything - at least for that user...

When I used Alpine, it was talking to IMAP and definitely segfaulted a couple of times. But an RCE in Alpine != RCE in Postfix (great architecture). I dunno, MUAs are so personal; they feel like a perfect place to target, and they handle more edge cases than upstream processes.

The last commit to that git repo was a couple days ago, and looking at the log, it gets a new commit every week or so. It may be under different stewardship now, but it definitely seems maintained. The last stable release was last year, but development seems to be proceeding nicely. The full site is at http://alpine.x10host.com/alpine/.

> But here is a different, and I hope very provocative, reason that you should consider a terminal email client:

> Local mail delivery does not traverse the Internet, or any network.

The implication here being that you are running the clients on the email server. Wouldn't you effectively get the same security by tunneling IMAP over SSH?

> The implication here being that you are running the clients on the email server.

That's a common scenario, but most Linux/BSD distros ship with some mail server software installed and configured to run out of the box, even if also set to not allow external connections, as that's how the system delivers messages to users (e.g. cron output, etc). So almost every Linux/BSD box is a mail server, even if it may not be the mail server. If there's a main box that people in your company connect to as a terminal, email to local accounts on that box is essentially an internal messaging system. Hell, you could throw an IRC server on there with no outside access and use terminal IRC clients to have your own private chat channel as well. Who needs Slack?

Unix has all kinds of funny things when everything happens within a single box.

If you want to send someone a message, there is write (if the person has mesg set to yes). There is also talk, but it requires its own daemon.
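The whole "API" here is tiny; a sketch (the recipient name is hypothetical, and write needs them to actually be logged in, so those lines are left commented):

```shell
who                            # which users are logged in, and on which ttys
# mesg y                       # (run on a real tty) accept incoming writes
# echo "lunch?" | write alice  # prints straight onto alice's terminal
```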

Seriously, _nix started out in the mainframe era. Meaning that the basic assumption was multiple terminals hooked up to a single computer, with a different user on each terminal.

Sadly we have a generation or more that grew up with single user computers, and seems to insist on turning _nix into that rather than embrace what it can offer.

Multiseat is a bit complex under X11, but certainly possible today. What do you see as the advantages of thin clients, though?

I think the point wasn't to specifically lament the lack of thin clients, but the lack of knowledge about how the systems are really designed to be multi-user, and actually multi-concurrent-user, as opposed to systems like Windows, which eventually gained multi-user capabilities; even then, concurrent use isn't exactly the common case (and I'm not sure how well it works in practice, but I imagine with RDP it works well enough on the server products).

All of these tools and capabilities still exist; there's probably still some Unix beards hanging out on Nyx and SDF. Yet here we all are having a discussion on HN instead.

> All of these tools and capabilities still exist

Yes... that's the point of the discussion. They exist, but there is (or is perceived to be) lack of knowledge about their existence and how to make use of them from some newer users, either because of the push for desktop linux, or for whatever reason.

> Yet here we all are having a discussion on HN instead.

I have no idea what you're trying to imply here. I suspect maybe we are discussing similar, but ultimately different, things.

I'm implying that multi-user systems were supplanted by web applications, because multi-user systems did not offer good value, and do not scale. Neither do I see these systems being advanced in this thread based on their utility.

Multiseat is a dirty hack. It is about using software to build a terminal out of a random collection of screens and input devices. All it has really done is give us yet another "session" to mentally track, in the form of consolekit/logind, on top of the kernel- and X-provided ones.

And it is not about thin clients (thin clients, btw, are yet another Windows-ism); it is about the concepts embedded in Unix that current-day devs seem to either sidestep or downplay while building houses of cards in userspace that poorly replicate said concepts.

You're not really explaining what it is that you think is missing from modern computing. From my perspective, none of this technology has gone away, but some things see very little use for good reasons.


You are not entitled to speak to anyone like that, and neither wisdom nor correctness increase with volume. This is a comment which can only be dismissed. I am sure that you have much better arguments at your disposal, if you would condescend to employ them.

Any time you run the mail server and the clients connect directly to the mail server via encrypted paths (IMAP/S, SSH tunnel, VPN...) the Internet is not traversed, but the local network is.

Mutt/Neomutt is great once you get over the initial trauma of the keybinds and of putting together the config plus theme. Except when it comes to HTML email, which is a lot more common than I'd like it to be.

For a more up-to-date guide on how to use GPG for your e-mail password, see [1].

[1] https://pthree.org/2012/01/07/encrypted-mutt-imap-smtp-passw...

Note that the GPG encryption itself is notoriously hard to configure unless one knows to just use the gpgme backend (it should be enabled by default!). Even then there is a lot of outdated GPG-related code (e.g. PKA, no way to locate a key by email).
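For reference, the password trick from the linked guide boils down to keeping the secret GPG-encrypted on disk and letting Mutt decrypt it at startup via a piped source. A sketch, with a dummy password and symmetric encryption only so it needs no keyring (a real setup would encrypt to your own public key):

```shell
# Encrypt the muttrc fragment holding the secret (guarded in case gpg is absent):
if command -v gpg >/dev/null; then
  echo 'set imap_pass = "hunter2"' |
    gpg --batch --pinentry-mode loopback --passphrase demo \
        --symmetric -o pass.gpg 2>/dev/null || :
fi
# ~/.muttrc can then decrypt it at startup:
#   source "gpg -d ~/.mutt/pass.gpg |"
```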

That might be okay still with some HTML filtering. Just coerce it to plaintext. Not sure anything of value would be lost.

What companies would let me use Mutt for email? I have a long list of ideal characteristics for an employer but have decided that a simple test might be that they let me use Mutt.

I have never worked in a job where even a single person cared about one's choice of email client.

Seems like the kinds of people who'd make/watch a video rather than read/write about the topic are not likely to be able to answer "yes".

Is there a transcript? Or is the summary "no"?

IMHO, the 20+ years spent trying to make a desktop version of Linux have SEVERELY reduced overall knowledge of unix practices and patterns. I'm not just talking about things like automake or the contents of /bin that younger programmers never learn, but the CLI in general. A while back I was working with an intern who was struggling with using distributed Linux boxes because all he knew was GNOME: the CLI was utterly foreign to him. So I find it upsetting that this "article" would even be necessary.

It is not a question of age; when I was an intern, most of my older colleagues used desktop apps / GUI git clients and had little UNIX experience.

> most of my older colleagues as an intern used [...] GUI git clients

How old are your "older colleagues"? I'm 35 and I remember "the land before Git", let alone the time before GUI Git clients...

My first Linux install didn't have a GUI at all, at the start. I didn't have enough floppy disks to download all of the Slackware A, N, and X packages, so I downloaded A and a subset of N onto floppies, which let me bootstrap PPP so that I could start downloading the X packages to get a working GUI going. That was on 33k6, and only when my parents didn't want to use the phone, so it took a while...

But if you had to do something now on some alien OS that you didn't know the internals of, wouldn't you prefer a GUI?

Contrary to everyone else here: hell yes, every single time.

GUIs are discoverable, and have lower friction for first-time users. Example: a video codec GUI will always be easier to use than ffmpeg. I still Google how to use ffmpeg after 10 years, and I have to read the entire man page because I don't know what I don't know! Is there a newer, better option to this flag available? Like lame for mp3s (another great example), a man page which (used to?) start off with bitrate options, but at the end says: forget all that, just use --preset. Or imagemagick, which decides to go for undiscoverability gold and splits their entire program up over multiple binaries, so you now get to read multiple man pages! After you guess what the names of those programs are, of course.

Hell. Give me a gui for any unfamiliar task, any time.

Of course, were I encoding movies every day, I'd be singing a very different tune.

Edit: another example: I resize images about once a week and I still have to google imagemagick's convert. I keep forgetting how to resize to a known width, keeping the height ratio. It's so frustrating that I've just given up and sort of guess with % instead, until it's close enough. Not to mention resizing to a desired file size (is that even possible? With a GUI, at least I'd know whether it were possible at all. Now I'm just tired of reading the man page. I give up. Imagemagick wins.)
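For the record, the width-with-ratio case is a one-liner, and an approximate target file size does exist, at least for JPEG, via the `-define jpeg:extent` encoder hint. A sketch (all filenames hypothetical, guarded so it's a no-op where ImageMagick isn't installed):

```shell
if command -v convert >/dev/null; then
  convert -size 100x50 xc:gray in.jpg     # make a throwaway test image
  convert in.jpg -resize 800x out.jpg     # width 800, height follows the ratio
  identify -format '%wx%h\n' out.jpg      # 800x400
  convert in.jpg -define jpeg:extent=200kb capped.jpg  # approx. size cap (JPEG only)
fi
```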

That is more a product of complexity, though.

The other day I was doing some basic data collection and graphing, with the initial version done in Excel; for something as basic as adding a trendline, I had to resort to googling the answer and following a sequence of several "click this, then click this" style steps to get what I needed.

Turns out it was actually easier to understand how to do what I needed using python matplotlib (not a CLI program granted).

Design, documentation and managing complexity are far more important to discoverability than GUI vs CLI.

It's not the CLI, it's the way it's designed.

If there were pervasive, semi-structured fuzzy matching at the OS level, anybody writing a program could have good completion without resorting to shell completion generators, and could enjoy easy discoverability.

GUIs are easier to explore right now, but they also have lots of hidden state and obscure organization (good luck knowing which options interfere with which, and where they are, sometimes).

It's my responsibility to organize things such that what I need to do matches up with what I know how to do. If I need to do something urgently on an alien OS and I don't immediately know how to do it, the failure doesn't lie with the OS. Why am I not using an OS that I know better? Why don't I have time to read the documentation and understand this alien OS?

Because, unless you are an IT person, your job is to do work now, which is unrelated to the OS.

Your job is also to do work in the future. You should be prepared for the future.

That's really not how it works, though. Even fast food workers get training on the software and tools they have to use.

My job, as an infrastructure engineer, is essentially to figure out how to fit together parts that 1. I've never seen before today, 2. have never been fit together before, 3. well enough that nobody will ever need to touch the weld-point again.

There's no training that can prepare you for that, except for training in how to train yourself in something, quickly.

No, that would be the worst of all.

You'd have to hunt around for things, and hope that whatever you were doing didn't have weird unpredictable effects and was repeatable. You could write down everything you typed, and some précis of what resulted, or even in some cases record the result (e.g. ssh-ing to a machine from inside an emacs buffer, or something like that).

That depends on the built-in help mechanism. If I was unfamiliar with Linux and it was Linux, maybe. If it was OpenBSD and I was unfamiliar with it, I could probably get everything I needed (for core OS stuff at least) purely from the man system, as long as I had some very basic idea of how to use the shell. OpenBSD has exceptional man pages, to a degree you probably can't appreciate until you've spent a few hours reading through some of them.

Personally I'd go for the old school, yet very reliable, way of R T F M.

That's an interesting question. My first response is: hell no.

But I think what's behind your question is the fact that part of UX/UI is to abstract experience and interface. So maybe if I first had to poke at an alien OS I would try its UI as part of familiarity in getting to know it, but ultimately as a system architect, hardware developer, and former O/S dev with 4 decades experience, I want to see how it really works. The UI is always the topmost layer.

I'd love to encounter a literal "alien" OS in my lifetime. That'd be a pinnacle of my pentesting & reverse engineering creds.

Good question.

I think the closest you can get right now is TempleOS. How do you feel about it?

This is a personal preference, but I wouldn't. With a terminal it's clear I have to type something in to use it and I can use knowledge of other systems to start trying things and learn what's going on.

And not all GUIs are tailored for novice users. I use StumpWM, and despite being a GUI system, it'd be very difficult for a new user without some instruction.

The answer is yes for all, as ASCII encoding for videos is a thing.

Samba gave me a flashback to last week, when I had an external drive on my Fedora workstation which I wanted to share with my network... the external drive was NTFS. Many hours and a setenforce 0 later, it showed up, but with no permanent solution.

That's the usability of common things in everything that's not a major desktop/user OS, and it always has been. OTOH, when I'm programming, I would NEVER give up the comfort of GNU stuff. It's just so convenient. Even the editors. Yes, VS Code is great, but oh man - in vim I have persistent undo with branching, among many many other things.

It's actually amazing how, on the one hand, you have this advanced system for complicated stuff geared towards awesome usability, and on the other hand it trips over itself on mundane things which keep it away from the year of Linux (GNU, more like it) on the desktop.

Samba is a "mess" because anything Microsoft is a mess to be compatible with if you are not MS.

That said, sharing files across networks is a hell all its own. In particular if you want something to just automagically appear in some GUI across the office/world/whatever.

Only "reliable" ones are those that require the user at the other end to know what address to enter to get access.

/me proceeds to launch "youtube-dl -f22 https://www.youtube.com/watch?v=0-2Ja7T9YF8"

AFAIK, mpv calls youtube-dl internally.

Yes, but it's easier when you just want to open a link to watch. You get subtitles and audio channels, and can skip to the part you want without downloading the entire video. I think it's DASH, but more options are available; it works for Twitch and many other sites (thanks to youtube-dl, I guess). With a browser extension you can click on links and open mpv directly.

yes it does

I think there's a version of elinks (or one of the other text based browsers) that can display images if you are running on a framebuffer console

Not only that, but there are tons of image viewers, video viewers, PDF viewers, etc. for the framebuffer (on Linux). But this raises the question of whether the framebuffer can be considered a "terminal". Also, to me it's still perfectly "terminal" if you use X and run a terminal emulator and live inside it unless you need to see an image, video or PDF. To me, the answer to this question is "yes".

The only problem with the framebuffer is that it adds a perceptible delay vs using a "raw" TTY. It's a small delay, perhaps a few ms, and I guess most people can't notice it, but it can be distracting if you live on the terminal.

You can actually see it if you cat a big file to the screen.

Framebuffer isn't an issue. Framebuffer and Javascript, however, is. You can opt to skip websites with that, but a lot on the WWW depends on it.

There's a solution to that though: running Firefox in the terminal with brow.sh [1]

[1] https://www.brow.sh

http://links.twibright.com/ This fork/version of Links has a fully graphical mode, which works in both X and the framebuffer.

w3m can also display images using the framebuffer, with the rest of it still in text mode.

Netsurf can be compiled for framebuffer, and that variant can also be used in X (and imo is much easier to get compiling than the GTK UI).


> Spreadsheets in the terminal: Totally doable.

I disagree. I find vim, mutt and pandoc superior to LO Writer and Thunderbird for text editing and emailing, but sc is just nowhere near LO Calc or even Gnumeric.

I really wish someone would make a text user interface for Gnumeric.

Have you tried Org mode tables? They have some spreadsheet functionality, but I'm not sure how big the overlap is.

Ok, that[0] does actually look pretty impressive. It can evaluate elisp formulas! Way more advanced than sc.

However, the sad state of the world is that other people use Word and Excel to edit documents. I wish there were a program that would do for Excel spreadsheets what Pandoc does for Word documents: letting you edit them in the terminal and allowing you to pass them off to other people in the formats they prefer.
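The Word half of that wish already works today, for reference; a sketch of the round trip (filenames hypothetical, guarded in case pandoc isn't installed):

```shell
if command -v pandoc >/dev/null; then
  printf '# Notes\n\nhello from the terminal\n' > notes.md
  pandoc notes.md -o notes.docx                  # hand it off in their format
  pandoc notes.docx -t markdown -o roundtrip.md  # pull their edits back to text
fi
```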

Those other people would hate me if I sent them Emacs Org mode spreadsheets to work on.

[0]: http://orgmode.org/manual/The-spreadsheet.html#The-spreadshe...

Related to this, Omar Rizwan did a great talk recently on file system interfaces:

https://www.youtube.com/watch?v=pfHpDDXJQVg (Four Fake File Systems)

I used Alpine as my main email client for a while. I really like it, but I had problems with Fastmail's LDAP-addressbook. And if I remember correctly I couldn't use Office365's addressbook either.

Another problem was integrating Alpine with calendars (at least O365).

khal (with vdirsyncer) is a nice calendar app. I haven't got it working perfectly with Office365 yet though; I use a read-only URL to the calendar, so I can at least see it, but cannot edit it. The Fastmail calendar works.

My solution has been to use the Fastmail website to accept calendar invites, but use khal and vdirsyncer to manage the majority of my calendar. As for contacts, I believe Fastmail can import and export vCard, which can be managed locally with khard and vdirsyncer as well. I use khard directly with mutt to access email contacts and I've been happy with it.

Having wasted my childhood lurking on KMandla's precious (unfortunately stale) blog posts [0] to find new ways to squeeze functionality out of my EeePC 701, I did not really find this video particularly interesting.

Maybe I'm not the target audience.


Will the terminal ever disappear?

I would assume that the percentage of computer users who do significant work in a terminal (email, IRC, coding, etc.) is still decreasing steadily today. What I'm not sure about is how this decline in penetration compares to the (still real) growth in the worldwide computer user base.

It seems that at least the Internet user base is still growing at ~10% p.a. What does this mean for absolute terminal user numbers? I would guess they already peaked a decade ago, with people switching to / starting out with smartphones and tablets instead of notebooks/desktops.

I can also see a world in which a highly disruptive technology (e.g., brain-computer interfaces) would make the terminal redundant once and forever, assuming that the new technology was more efficient/productive than a terminal.

Maybe I'm a bit biased by my love for the terminal for most of my work responsibilities--I just can't imagine doing my job without it. Maybe I'm old and things from 10-20 years ago seem recent. Apple deciding to stick a terminal in OSX 20 years ago (a lot of people didn't expect them to include one) helped with the boom of Silicon Valley engineers choosing Apple laptops. PowerShell, first released only 10 years ago by Microsoft, with major versions being released every couple of years, shows they feel there's a need for investing in terminals. Windows Subsystem for Linux was first released 2 years ago. While it's technically just compatibility with Linux binaries, I've only seen it being used via a terminal.

You're right, with more and more computer users added every day, as a percentage command line users have probably been shrinking for a long time. But I would also expect in raw numbers they're probably still growing.

The terminal is vital for reproducibility/automation. It's like an interactive config file, and a config file must be portable.

We have new terminals all the time, like the browser JavaScript console.

I really like using the terminal when it makes my workflow faster, using git etc. But checking email and YouTube? No thanks.

You know, there's even a text mode display manager for X. I forget the name, but it's on GitHub.


How can I watch this YouTube video using just my terminal?

This will display the video as ASCII art:

  vlc -V aa <youtube link>
For color ASCII he suggests

  vlc -V caca <youtube link>

Ironically, displaying that video with this method will make it unreadable.

Well, the OP should have posted it to https://asciinema.org/

Does mpv have such a codec out of the box?

I didn't watch the video, but you could use youtube-dl and then ffmpeg to display it on the framebuffer.
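Concretely, something like the following (URL taken from the sibling comment; a real framebuffer console is needed, so the lines are left commented as an illustration):

```shell
# Fetch first, then point a framebuffer-capable player at the file:
command -v youtube-dl mplayer mpv || true   # see what's available locally
# youtube-dl -f 22 -o talk.mp4 'https://www.youtube.com/watch?v=0-2Ja7T9YF8'
# mplayer -vo fbdev2 talk.mp4   # classic framebuffer output
# mpv --vo=drm talk.mp4         # modern equivalent via KMS/DRM
```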

> on the framebuffer

What manner of newfangled luxury is that in a terminal? When I was young, we had to pick lice out of our hair and arrange them in the form of letters for others to read.

If you call something (watching videos/pictures or browsing in a framebuffer) that has been working since the end of the '90s "new", then you've grown old. A while ago, if I might add. And the ability to use a CLI at a large, say 1280x1024, resolution was nice as well, given the dots per inch on CRTs in the late '90s.

Some people would kill the lice first but sophisticated users preferred live ones which enabled advanced effects like 'dynamic text' and 'escape sequences'.

In some ways I think we were actually happier back then.

But you try and tell the young people today that, and they won't believe you.

heh, this is actually covered in the video.

Use a framebuffer application e.g. mplayer.

You could probably make a whole virtual OS running on top of the terminal.

Yes, it was called Windows 95

lol, true. probably 3.11 too....

I used to use a terminal for everything! And it was a big step forward over punch cards. I remember using my first VT-100 in college, after having taken CS-101 to learn IBM 370 (BAL) assembly language on punched cards. And up until 1988 or so, we were using text terminals for everything, until I got a job at Adobe and started using a Sun workstation. Even then, the windowing environment was used mainly to open terminal windows.
