State of the Terminal (gpanders.com)
104 points by fanf2 20 days ago | 157 comments



I could never really understand the enthusiasm. Why are we still dealing with over half a century of cruft? I get that this is a core piece of technology lots of stuff is built upon, and I'm not arguing to get rid of classic terminal emulation altogether. But I wish there were an effort to build a new, modern, textual interface to computers with modern assumptions and integration bindings. We shouldn't need to be concerned with obscure escape sequences to print color; tooling shouldn't need to parse strings to do something useful; and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response with broken parameters to extract a tar archive.

There is more to shells and text interfaces than working within constraints set 50 years ago.


In my opinion, the Lindy Effect[0] makes a lot of sense in scenarios like this. Personally, I love the fact that the slower evolution of command-line tools gives my personal skillset a longer shelf life. I can add additional capabilities without having to constantly re-learn how to do established work.

[0] https://en.wikipedia.org/wiki/Lindy_effect


The problem is that breaking away from ANSI escape sequences and the like means also rewriting 50 years of command line tools.

Like with the modern web, there’s just too much momentum behind the current design to make it practical to reinvent it from scratch

That doesn’t mean that things cannot improve though. It just makes it massively more difficult if you want to retain backwards compatibility. And if you don’t, then people probably won’t adopt it because they won’t be able to get their job done, regardless of how good the UX is.

As it happens, I am trying to solve these problems, in part with my alternative shell. To take your SO example, I've recently even been playing around with integrating ChatGPT to help with hints (on top of the automatic man page parsing which already happens).

I'm also writing a new terminal emulator that aims to bring interactive widgets (like support for sorting and filtering anything that looks like a table). But that terminal is very much alpha at the moment.


Why can't we make it backward compatible, with each client application announcing "I support the new interface" if it does? You probably would need to add kernel support to handle the fork/execve case, but I don't think there is an intrinsic limitation.


fork et al are managed by the shell, not the terminal emulator. And what you're suggesting is exactly what feature flagging in ANSI escape codes achieves. 50-year-old teletypes already had ways to announce support for new interfaces.
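
For anyone curious what that announcement mechanism actually looks like, here is a minimal sketch (Python, assuming a POSIX tty; the helper name and structure are mine, not any particular library's) of the decades-old Primary Device Attributes query, where the program asks in-band and the terminal answers in-band:

  import os, sys, termios, tty

  def query_device_attributes() -> str:
      # DA1 ("what are you?") has been answered by terminals since the VT100.
      fd = sys.stdin.fileno()
      old = termios.tcgetattr(fd)
      try:
          tty.setraw(fd)                      # read the reply byte by byte
          os.write(sys.stdout.fileno(), b"\x1b[c")
          reply = ""
          while not reply.endswith("c"):      # reply looks like ESC [ ? 6 2 ; ... c
              reply += os.read(fd, 1).decode()
          return reply
      finally:
          termios.tcsetattr(fd, termios.TCSADRAIN, old)

  print(repr(query_device_attributes()))

The reply lists what the terminal claims to support, which is exactly the kind of "I support the new interface" handshake being asked for.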


> The problem is to break away from ANSI escape sequences and the like means also rewriting 50 years of command line tools.

Are we really that reliant on those command line tools? I only use a handful of them, and any time it gets more complicated, I reach for a real programming language to do the scripting. Those tools just do a bunch of string parsing, and the user interfaces are usually incredibly esoteric, inconsistent, and obtuse.


In a word: Yes.

The long tail is very long, and it's not just the bare text tools, it's the bare text tools (including tools that were written 30 years ago and that haven't changed since, and the tools that barely even count as cli tools), and also anything that ever did anything more interesting. You're not just talking about rewriting coreutils, you're also talking about redoing vim and emacs and top and all the *stat programs and tmux and screen and ...


Only in regards to UNIX CLI tools, computing history is full of other kinds of command line interfaces.


Sure. And aside from Powershell, which itself had to recreate a bunch of common UNIX idioms, which of those other command line interfaces are still in widespread usage?

The status quo is ugly but reinventing it is at least an order of magnitude more work than improving upon it.


Amiga DOS doesn't seem to go away no matter what.

IBM and Unisys mainframes and micros.

Smalltalk and Common Lisp REPL environments.

Although probably debatable to consider any of them mainstream.


The fact that you opened your rebuttal with DOS basically proves the point I’m making.

> Although probably debatable to consider any of them mainstream.

The only person debating that point is you.

We could be online until the sun rises debating about different command line environments but if AWS (for example) haven’t released an official CLI utility for Amiga DOS then your position is ultimately just an academic one.


Amiga DOS !== UNIX CLI, and if you don't get why, well so be it, let's worship 1970s printer hardware instead.


I'm giving you the benefit of the doubt that you're just mistaken rather than trolling:

1. I never said Amiga DOS was the same as UNIX CLI. In fact I never even compared the two, that was all you

2. I do get why different command line interfaces are different -- I author a significant amount of code towards terminal emulators, shells, command line tools and maintain a hell of a lot of retro systems. So I'm definitely experienced on this subject. In fact I bet I could teach you a thing or two on this topic too ;) But that wasn't the point of what was being discussed. We were talking about the state of the status quo, not how some niche interface that nobody has used for serious work in nearly 30 years compares to the entrenched standard.

3. I never once said the UNIX CLI was peak command line design either. In fact I actually said the exact opposite. What I actually said was that it was dominant. Dominance != well designed

4. ANSI escape sequences, the $TERM env var, and all the other terminal UX stuff that are being discussed here, came about with hardware terminals like the VT-series. Yes, teletypes are part of mainframe history, but they're not relevant to this specific discussion here. Terminal emulators don't emulate a teletype, the POSIX kernel does that. Terminal emulators ostensibly just open a file and emulate how VTs interpreted ANSI escape sequences. This is the same reason why you can have terminal emulators on Windows (like PuTTY, Microsoft Terminal, and my own terminal emulator) despite Windows never having a concept of a PTY. Though, perhaps ironically, Windows now does support an approximation of a PTY.

---

My point was very clear: the current status quo sucks but it would be an order of magnitude more work re-implementing everything from scratch and it ultimately wouldn't likely gain adoption anyway because of the momentum behind the current status quo. Microsoft understood this and ended up re-implementing some of the concepts despite literally decades fighting against it.

History is littered with examples of sub-par technologies becoming dominant because they work just good enough to maintain any initial momentum they have behind them. And people would sooner use what they're familiar with than learn something entirely new just because it is technically better.

And the fact that you keep harping on about Amiga DOS is, frankly, absurd. I love the Amiga, I honestly do. I have one sat next to me right now. But if there was one example in history of a command line interface that sucked more than the Bourne shell, it would be DOS. Mentioning Lisp machines might have earned you a little kudos yet you chose to lead with Amiga DOS.....


You were the one that was so eager to reply that you didn't even bother to go past the first line of my comment, and that says it all.

All the non-UNIX platforms eventually had to come up with POSIX support, because apparently people cannot stop wanting to see UNIX on everything that has a CPU; other platforms be damned, they'd better come up with POSIX support, including terminal escape codes.

The only reason to use PuTTY on Windows was and is UNIX software running under mingw/cygwin.

Same with Windows Terminal, and naturally Microsoft <3 Linux with WSL, again UNIX like software.


I made reference to stuff past your first line. Maybe you missed it because you didn’t read the entirety of my comment ;)

Anyway, you’re now just repeating exactly the sentiment I made that we were originally arguing against.


VMS used ANSI escapes too, did it not? Windows is moving that direction. Amiga as well? CP/M? Not much left.


There is an effort, but it’s mostly built on cruft (which libraries like ncurses or the projects at charm.sh try abstracting away)

Also, what would we use besides escape sequences? They are about as small as can be for what they do.

Terminals deal with streams of bytes. They don't just download entire views and render them (normally). This usually works to their advantage.

There are also the ANSI standard control sequences, which imo should get more focus than the terminfo db.


> Terminals deal with streams of bytes.

But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

> There’s also the ansi standard control sequences, which imo should have more focus on them as opposed to the terminfo db.

Everything in that sentence \033[31;1;4munderlines\033[0m the problem to me: if we didn't need to nest metadata into the byte stream, none of us would need to wrestle with escape sequences. And that such a thing as the terminfo db even needs to exist speaks volumes about the lengths we have to go to to accommodate decades of cruft…

Applications should neither be concerned with what color codes the output device can render, nor should the terminal itself have to support hundreds of emulation targets.


I think it’s a fundamental design choice. It’s a double edged sword for sure, but it’s important.

It allows terminals to respond very quickly because they never need to wait for an entire data structure to download before displaying information.

It’s such a simple protocol that anyone can hack on it (which is a strength and weakness)

Asking to change that, in my mind, is like asking to change how packets move across a network. It’s all byte streams that get treated as structured data higher up on the abstraction layers.

TUI frameworks like the ones from charm.sh are built atop the base protocol and provide support for structured data much like how our network stack works. That way, as you were saying, application programmers don’t need to worry too much about the underlying “wire protocol”

The issue isn’t that terminals are stream-native or that they operate on byte codes instead of characters.

The problem is lack of standardization of said codes. Which leads to a mountain of edge cases in supporting every known terminal. That’s the cruft, not the minimal protocol


> But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

Plenty already do:
- PowerShell
- Murex https://murex.rocks
- Elvish https://elv.sh/
- NGS https://ngs-lang.org/

> Applications should neither be concerned with what color codes the output device can render, nor should the terminal itself have to support hundreds of emulation targets.

If you have colour codes (et al) sent out-of-band then you need a new kind of terminal emulator which the application then also needs to support. So you do effectively create yet another standard.

Whereas the status quo, as much as it sucks, is largely just vt100 with a few extra luxuries, some of which are as old as xterm. We aren't really talking about having to deal with hundreds of emulation targets, nor even more than one, in most cases.

Where things get a little more challenging is if you want stuff like squiggly underlines or inlined images. There is the beginnings of some de facto standardisation there but it's still a long way from being standardised.


> But is this a fundamental constraint, or something we can challenge? Is there any good reason why terminals can't deal with structured data instead?

What do you mean by structured data? The escape codes are structured. And any structured data would be sent as a stream of bytes.

Also consider that if everything had to be JSON messages, programs that didn't care about controlling a terminal would need to determine whether their stdout was a terminal or a pipe, and properly format content if it was a terminal.

I think the concept of escape sequences is fine.

Although, I do wish that there was a way for the terminal and application to communicate out of band. In particular, that would allow querying the terminal without having to worry about getting unexpected input before the response, and you could send graphics or clipboard data out of band without tying up the terminal itself.

> Applications should neither be concerned with what color codes the output device can render

I mostly agree with that, although it brings up the question of how an RGB color should be converted to a more limited color set. Should it approximate the nearest color, for some definition of nearest, or should it ignore the color setting altogether? Either way, I think it would be better for that complexity to be centralized in the terminal instead of having to be implemented in every application.

> nor should the terminal itself have to support hundreds of emulation targets

I definitely agree with this.

I think we could use a standard terminal control spec that standardizes the things that modern terminals mostly agree on, and removes things that aren't really used anymore. And have a standard mechanism to query for optional or vendor specific capabilities. And write some decent documentation for the control codes (which if it already exists I haven't been able to find, at least in a single place).
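
As a concrete example of that kind of capability query, here is a rough sketch (Python, POSIX tty assumed; the helper function is mine) using DECRQM, which many modern terminals answer. It asks whether DEC private mode 2026 ("synchronized output") is recognised; a reply ending in 0$y means "never heard of it":

  import os, select, sys, termios, tty

  def query_dec_mode(mode: int) -> str:
      fd = sys.stdin.fileno()
      old = termios.tcgetattr(fd)
      try:
          tty.setraw(fd)
          os.write(sys.stdout.fileno(), f"\x1b[?{mode}$p".encode())  # DECRQM request
          reply = ""
          # Reply is CSI ? Pd ; Ps $ y; give up quickly if the terminal stays silent.
          while select.select([fd], [], [], 0.2)[0]:
              reply += os.read(fd, 1).decode()
              if reply.endswith("y"):
                  break
          return reply
      finally:
          termios.tcsetattr(fd, termios.TCSADRAIN, old)

  print(repr(query_dec_mode(2026)))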


There are libraries and frameworks that abstract this for you: ncurses, bubbletea, etc. At some point, every interface is low-level: GPU geometry is a stream of vertices, a program is a stream of bytes. This isn't some terminal-specific tech problem, it's just kind of how computers are.


Abstractions are layered a specific way, though. Layering upon something suboptimal will always be limited by the suboptimal layer.

Ncurses or bubbletea et al. may hide it from you, but they still only paint over the bumpy legacy wall below without being able to really improve on it.


"\033[31;1;4munderlines\033[0m" is (again) no worse than a stream of vertices or a stream of object code. Everything is a stream of bytes (well, a stream of bits anyway). Do you want CSS? Lipgloss is not too far off [0].

I read your objection basically as "escape sequences and control codes are noisy garbage"; are you saying something more like "the functionality you can achieve with escape sequences and control codes is fundamentally limited"? If that's the case, I don't see how, especially in the context of a character-based display.

[0]: https://github.com/charmbracelet/lipgloss?tab=readme-ov-file...


I don’t think it’s suboptimal at all. (How many bytes does it take to change the color of a single word in the terminal protocols vs the web for example)

It’s literally a wire protocol.

It’s minimal and designed to be easily abstracted away.

The suboptimal part is the lack of standardization.


It may be a wire protocol, but it’s also the primary method of interaction with the computer for technical folk. Do we really have to constrain ourselves to a wire protocol here, or is there maybe room for something more user-friendly?


We, as users, do not interact with the wire protocol. We type at generally very user friendly terminals, which abstract the wire protocol.

Even TUI devs don’t usually interact with the protocol directly, they use ncurses or something from charm.sh to abstract away the protocol.

Moreover, what “constraint” does using a wire protocol like this impose?

And how would one interact with any computer without a low level layer of byte-encoded information? (You can’t. It’s how computers work)


Is there? Let's put a few reasonable requirements down:

- Self-sufficient commands that you can store in history, put on wiki page, put inside a script file, Slack to your friends, store in configuration, etc...

- Variety of execution methods: local computer, remote via ssh, jupyter-like notebook, remote via something else (like AWS SSM), CI runner, ssh which launches SSM session which connects over serial port, cron-like periodic schedulers, starting commands in your Go program etc... Every method should be supported automatically with no effort from developer.

- Output could be interactive (possibly with some reprocessing like tmux/screen does) or stored (like CI/cron execution). It is possible to parse output to get relevant details.

Even if you start from scratch, what can you design to fit this pattern? You'll get something very close to the existing state - applications that take command lines, character-based input/output streams with some sort of formatting sequences.

Sure, if I could design from scratch I'd standardize on _one_ TERM, make escape sequences easier to parse (longer and common start/end chars), fix extended keys/numpad mess, redesign the CLI defaults of the few tools, create common command-line completion interfaces... but the overall idea will be the same.

(And if you don't care about the three requirements above, then go with HTML or native UI! There are tons of them and they are well supported. But I don't see the point of retrofitting HTML-like functionality onto existing terminals.)


There definitely are more options than applications reading and writing character streams, you’re just not even considering them because you’re too entrenched in the environment you’re familiar with. Take Powershell for example; the syntax may be horrible, but the way it passes structured data is truly different and way ahead of classic UNIX shells. The design space is huge; just dismissing any possible improvements before even sitting down and thinking properly is frankly not a good strategy.


If the design space is so huge, how about giving some examples, instead of insulting people who disagree with you? Surely you'll be able to do so, while satisfying the requirements I listed?

Because Powershell doesn't satisfy them. You are not going to save a Powershell object stream to CI output or provisioner logs. You are not going to send objects over SSH, or over a web console, or over a BMC-emulated serial port. In all of those cases it's going to be the good old stream of characters, with the occasional control sequence from more advanced programs.

You can have whatever rich datatypes you want inside your application (Powershell isn't the first; I remember reading about Lisp shells back in the 2000s), but at some moment you have to talk with other systems, and that's when you will have to switch to character streams.

(that said, the current escape sequences could use lots of improvement. That's not going to change fundamental concept though)


> There is more to shells and text interfaces than working within constraints set 50 years ago.

Of course.

With some minor changes, you could make the same argument about many, many technologies we use every day, from IPv4 to SQL to C.

Then there is the old saw -- likely apocryphal -- about how railroads are the width they are because they evolved from standards around Roman roads. Even if it is a fairy tale, the moral of that story strikes true: existing tools are built on top of old ones, because it was better to have a standard then even if we may know better now.


IPv4 is a great example, because the transition to IPv6, albeit slow, is happening. And that is an all-renewed protocol with a lot of previously impossible, awesome features. There is a way to do this right.


It’s also exponentially more complicated, has addresses that are a mile long and impossible to memorise, and has enabled such marvels as a toaster that outright refuses to make you breakfast unless it’s connected to the internet with an active subscription and all the telemetry that comes with it. Truly a great example of a way to do things right.


While we're at it, can we get rid of staggered keyboards? We don't need to accommodate mechanical linkages from the bottom rows any longer, yet here we are.

As other commenters have pointed out, once conventions and standards have been adopted, there is almost no way of dislodging them. It's a coordination problem. New standards and approaches usually need to be at least a 10x improvement over the status quo in order to be adopted organically.

The same story applies to all areas of human endeavor.


I'm not so sure. We have HTTP3 now, which is binary. We use different image formats than we used to, and switched from horses to cars, too. Why shouldn't we be able to innovate in this space? Building new walls doesn't automatically imply tearing old ones down.


> Building new walls doesn’t automatically imply tearing old ones down.

But often in this space they seem to want to do that anyway. I can only imagine that they are attempting to force people to adopt their pet project. There's literally no reason a pair of sequences can't be specified that allows clients to query support for advanced terminal mode and get a response. After the client and the terminal have agreed on advanced mode, switch to a byte stream/command blocks/whatever which allow rich text, graphics, sound, access to UI devices, etc. When you're done you've recreated X11, but they can do whatever they like I guess.

There's no need to fuck up terminfo along the way though.


> But I wish there was an effort of building a new, modern, textual interface to computers with modern assumptions and integration bindings.

Would it be different from a web browser? Isn't the webstack already the modern textual interface everyone is using? I mean, that's also the direction they all are walking in, with adding images to terminals and having more complex layouts with TUIs. I think I remember some TUI toolkit which even used a limited CSS for describing its interface. And there are some terminals and alternatives which are built with NodeJS and the webstack, IIRC. So would it make more sense to find some proper standards for webstack-based CLI interfaces?


> Isn't the webstack already the modern textual interface everyone is using?

I would argue that the web stack is really more of a virtual machine, or OS-agnostic application runtime, but it is not a textual interface per se (albeit it certainly can be used that way).

On the other hand, I'm looking for solid primitives to make applications talk to each other, but with structured data instead of plain text open to liberal interpretation; for terminals that don't emulate a teletype device, but make use of modern operating systems. Think of what we had before systemd—the wild west of shell scripts, layers upon layers of arcane compatibility hacks—and what we have now—a standardised, documented, structured, system management interface that works the same way everywhere (please don't lets discuss systemd here, it's just an analogy).

So rather than trying to paint NodeJS-lipstick on the TTY-pig, I'd like to see a new kind of terminal protocol that solves the UX issues of old.


> I'm looking for solid primitives to make applications talk to each other, but with structured data instead of plain text open to liberal interpretation

JSON exists. YAML exists. XML exists. The lack of proper exchange-formats is not a problem. The lack of tooling is the problem.

> Think of what we had before systemd—the wild west of shell scripts

Liberty. And this gives you an answer why terminals, shells and cli in general remain this way. People have the liberty to do what they need, and they have the liberty to go as simple or complex as they want. And this works very well. Don't forget that not all output is structured.

So unless you have someone creating a whole new set of tools and environment which can compete with the established solutions, and beat them on ALL aspects, and is still compatible... until then people will not support it. I mean, there are more than enough alternatives that are barely popular for one reason or another. Speaks for itself.

> I'd like to see a new kind of terminal protocol that solves the UX issues of old.

Which are the issues? Is there some solid documentation on them?


Chesterton's Fence is in full effect here. Quite a few people have tried to do exactly what you wish. They haven't really succeeded, obviously.


Me neither; having started to program when this was all we had, and graphical displays cost a fortune, I really don't get this desire to live in the past.


> tooling shouldn't need to parse strings to do something useful; and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response

Absolutely. Glad we fixed all that with CSS...


Could you share what you mean by that? I suppose it was sarcasm, but I didn’t get the point.


Ha sorry, that was probably overly snarky. I was thinking of CSS's multitude of string-valued properties that effectively have their own DSLs, ranging from the simple "5px 10px 0px 10px" for padding/margins to much more complicated expressions for gradients and animations. And it's all stringly-typed so (without additional linters) you don't get told when you format values incorrectly or set mutually exclusive properties. But there are also lots of advantages in terms of flexibility and ease of use (the "API" is just "style.foo = bar"), and that's led to CSS being ubiquitous and extremely successful.

I think this applies to some of your complaints about terminals as well. Yes escape sequences are ugly, but it means I can just emit the appropriate bytes from any program written in any language rather than figuring out what library I need to import to generate a correctly formatted structured message, and operating on the byte stream level lets it transparently work across remote connections. I'm all for experimenting with new approaches, but we shouldn't lose sight of the existing benefits.
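
To make the "just emit the bytes from any language" point concrete, here is roughly all it takes (a sketch, nothing more), including the usual isatty check so that pipes still receive plain text:

  import sys

  def red(text: str) -> str:
      # Only colour when stdout is a terminal, so `prog | grep ...` sees plain text.
      if sys.stdout.isatty():
          return f"\x1b[31;1m{text}\x1b[0m"   # bold red ... reset
      return text

  print(red("error:"), "something went wrong")

No library, no structured message format, and it works unchanged over SSH because it is just bytes on the stream.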


CSS doesn't fix junior developers [wasting] hours scrolling an obtuse man page until they resort to a half-assed SO response


Arcan is one such effort, but I am not sure what state the project is in.


I'm bewildered that the "user friendly" flavors of linux still use ancient terminals. If the fact that these distros still lean heavily on terminal use wasn't bad enough, you also need to have a computer intuition from 1985 to feel comfortable using it. At least capitulate to ctrl+v and ctrl+c.


> I'm bewildered that the "user friendly" flavors of linux still use ancient terminals.

Do they? I was given to believe that you can use modern Ubuntu/Fedora/OpenSUSE without needing to open a terminal.

> If the fact that these distros still lean heavily on terminal use wasn't bad enough, you also need to have a computer intuition from 1985 to feel comfortable using it. At least capitulate to ctrl+v and ctrl+c.

Okay, let's say we're going to break backwards-compatibility; how should the user kill the running program, and how should they input character literals, and how are we going to implement your change?


Just to get it out of the way - Linux is great if you are a grandma or linux junkie. It sucks for everyone in between, especially for those who come from Windows or MacOS.

--

This is the fundamental problem with the terminal, it is extremely powerful when you are intimately familiar with using it. And it's unsurprising that people building distros and maintaining linux fall into that camp.

What they are completely blind to is how incredibly user-hostile the environment is for people coming from a purely GUI background. What is the first thing every mainstream Unix-based OS does? They take the terminal and hide it. They make menus and menus of GUI elements that cover 80-90% of the things you would typically use the terminal for, aside from the 5% of ultra power users.

My gripe is primarily that Linux is _desperately_ needed now more than ever as an escape from windows. But the people who are working on linux distros are so lost in their egos that they are arrogantly trapped in this idea that everyone needs to be driving stick shift in 2024 because look at how much more control you have compared to an automatic, if that analogy makes sense.


> My gripe is primarily that Linux is _desperately_ needed now more than ever as an escape from windows.

If all you see Linux as is a crappy free clone of Windows with less user abuse, you'll certainly be left disappointed when it doesn't deliver.

For all their faults, Microsoft pours a ton of money into Windows's usability and backwards compatibility. If anything, it's impressive Linux DEs come close with a fraction of the funding and organizational structure compared to a literal tech giant.

> But the people who are working on linux distros are so lost in their egos that they are arrogantly trapped in this idea that everyone needs to be driving stick shift in 2024 because look at how much more control you have compared to an automatic, if that analogy makes sense.

Is it really "ego" if the people working on these distros simply like it better this way? Are they wrong for developing the program (oftentimes in their free time) to accommodate the way they, and most people who contribute to the project, enjoy using it?

Sure, it would be great if we could accommodate everyone. But as most FOSS projects are chronically understaffed and underfunded, people prioritize creating a product they can enjoy using.


> Is it really "ego" if the people working on these distros simply like it better this way? Are they wrong for developing the program (oftentimes in their free time) to accommodate the way they, and most people who contribute to the project, enjoy using it?

From Ubuntu's missions statement:

> We believe that bringing free software to the widest audience will empower individuals and communities to innovate, experiment and grow.

I'd say they have been failing catastrophically at that for the last 20 years. And the statistics say the same.

If they could just dedicate two releases to snuffing out as much terminal use as possible, they could probably double their market share in a month.


This is a silly thread, beginners don’t even know the terminal exists, and 99% don’t need to.

The situation is the same in Windows, though they might have a few more duplicated GUIs, they don’t solve the long tail of troubleshooting. You can also use a Mac Keyboard with Linux.

It doesn’t make any sense to remove terminals either, when available and tiny in resource use. Use an OS modeled on the original Mac OS if you want to be prevented from seeing a terminal.

Utopia never existed in the computing world. But the war on general purpose computing sure has. Soon there will be no place to turn for an experienced user who is not a slave to bigcorp interests. Unfortunately the bondage you recommend leads here, they are now coupled.


To extend, this issue affects an incredibly small number of computer users. The 1% that knows it exists, divided by the large fraction that is too busy (or lazy?) to work around it:

https://news.ycombinator.com/item?id=40390656

The reason it falls between the cracks is because it doesn't affect enough people, especially normal folks. But let's not kid ourselves that integrating one of the existing fixes would affect the industry significantly.


> Okay, let's say we're going to break backwards-compatibility; how should the user kill the running program, and how should they input character literals, and how are we going to implement your change?

Please don’t take it personally, but I find it a bit sad that your imagination ends here. These two questions depend entirely on the chosen implementation. If we’re talking about a new implementation, should we really start by accepting the limitations and constraints of the existing system as requirements for the improvement?


> I find it a bit sad that your imagination ends here.

This isn't the limit of what's possible, it's a lower bound of questions that you have to answer if you want to build a replacement.

> These two questions depend entirely on the chosen implementation.

Okay? Feel free to include multiple answers, but you need at least one.

> If we’re talking about a new implementation, should we really start by accepting the limitations and constraints of the existing system as requirements for the improvement?

If you ditch all the behaviors of the old system without figuring out how to shim them in, then you can build a beautiful new system that's free of all the problems of the old system, and that's also free of all users because you just jettisoned all the programs that let people actually do useful work with their computers. We're stuck with 50 years of backwards compatibility not because we love it, but because people need to be able to interact with their machines without relearning everything, and people need to be able to use the machine to run the programs they're using.


Instead of a specific key combination that a) clashes with the way the vast majority of computer users enter text, b) is neither obvious nor easily discoverable, and c) not available or awkward to enter depending on the input device, we could also separate the monitoring of running processes from the command prompt and let the terminal application offer a control to cancel them, or match modern user expectations and do so using the Escape key. Or entirely different.

The point is that we don't use ^C because it's ergonomic or obvious, but because of conventions inherited from a time long gone. Keeping legacy software running does not imply imposing the same UX constraints on users.


> Instead of a specific key combination that a) clashes with the way the vast majority of computer users enter text, b) is neither obvious nor easily discoverable, and c) not available or awkward to enter depending on the input device,

I grant that a) it's different from everything else, but b) CUA shortcuts aren't obvious or discoverable either, they're just more common, and c) I can't think of any device that has a terminal where ctrl isn't readily available. Nonetheless, I actually agree that if we could do it without breaking everything, moving everything to the same convention is compelling on its own. (Of course, I don't think we can do it without breaking everything, but that's life.)

> we could also separate the monitoring of running processes from the command prompt and let the terminal application offer a control to cancel them,

When I want a process dead, I want it dead now, not after I've found the mouse and hit a button. I also have concerns about how that button is going to actually work under the hood; the only options I can think of are that that sends a simulated ctrl-c, or the terminal emulator suddenly has to track child processes and decide what to kill (which strikes me as a bad idea).

> or match modern user expectations and do so using the Escape key.

And in so doing break anything that was using escape, like vi. Incidentally, the plan to intercept ctrl-c will break emacs, too. Which is why I strongly disagree with

> Keeping legacy software running does not imply imposing the same UX constraints on users.

because breaking legacy software is exactly the trade you're proposing.


GUI Task managers have existed for decades.

If this issue is really bugging you:

- Buy a cheap Mac style keyboard and configure it as such. See kinto.

- Use a terminal with “smart paste” for a decent workaround.


Would Ctrl-c work conditionally based on if there is a foreground process? What if I want to paste in to the process?

The command key on macOS really comes in handy here.

The “windows” key on most Linux desktop environments I have used is usually pretty under-utilized, perhaps there’s a case to be made that super-v should default to paste.


<S-Ins>


While I don't disagree with your sentiment I have a few comments:

"Why are we still dealing with over half a century of cruft?". IMHO that's because we have software running that's over half a century, with organizations depending on it, and so it needs to continue running. This means we need to continue providing that software with the environment it expects. For over a decade, a significant part of my recurring revenue came from helping companies make sure their ancient software would continue to run on newer machines (and they only got newer machines because it was no longer possible to keep the older hardware running). And this was on PC hardware, from my short experience in finance, I reckon there's a bunch of 390 software still running emulated on modern z/OS machines.

"tooling shouldn't need to parse strings to do something useful" The thing is tooling either needs to process some binary format, which is more efficient but also more obscure, or to parse strings. All the tooling that deals with json, yaml, etc is still parsing strings, and I prefer that to a binary format in most cases. I know a defined format like yaml is simpler to parse than free-form text, but it maintains some of the same constraints (notably the need to escape things).

Your comment about escape sequences to print color is very relevant. I feel two concurrent yet opposite feelings here: I get all warm inside from nostalgia, since I spent what probably was an unhealthy amount of time learning about ansi sequences back in the day of BBSs, and I also despise that completely and would love to be able to have color on a textual computer interface without the need for that. I believe that's possible, but most likely the reason we don't have a new textual computer interface is that the effort to achieve it is significantly bigger than the effort to keep hacking what we have. Challenges I can imagine: while I despise dealing with escape sequences (because I invariably got them wrong the first time, always), it's either something like that, or a side channel to convey formatting information. Or we move to html altogether. I guess this is the approach of text-intensive applications based on web frameworks (like I suppose most modern editors). There's the option emacs takes, which is to just have the text be plain text (whatever that means, really; we like to think it's a simple format because we can cat it and wysiwyg, but for the computer, it's all a bunch of bytes anyway) and use modes to alter how the editor shows you that text. As an emacs user I'm biased to think that's a better approach than html or a side channel, but I don't see that becoming the norm in the short term.

"and junior developers shouldn't need to waste hours scrolling an obtuse man page until they resort to a half-assed SO response with broken parameters to extract a tar archive" Now that is one hill I'm willing to die on: if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior. Note I'm not saying all developers must use the command line: if you prefer to open archives with a GUI tool that's fine, but if for whatever reason you need to open it from the command line and you can't figure that out from the man page, I think it says more about them than about tar. Perhaps it's just a poor choice of example, as I agree there are very poor man pages out there, but for the basic use cases, I think tar is as understandable as it can get.

"There is more to shells and text interfaces than working within constraints set 50 years ago." Completely agree, which is why I often use the emacs shell, which is not a POSIX shell, and you know what? I don't care. I go to zsh when I need that, but often times, the emacs shell gets me what I need with less friction. And I'm sure the same would be possible with other shells. I wish we had more new shells that attempt to keep what seems useful about 'the old shells' but discards all that's not strictly needed and adds new, useful features.


> Now that is one hill I'm willing to die on: if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior.

There’s “let them figure it out” and there’s “this is unnecessarily complicated because of archaic legacy compatibility”. It should not take multiple parameters to untar+gzip a file.

There is absolutely no reason why, if you provide tar a single .tar or .tar.gz with no other parameters, it doesn’t just say “ok this is a tar/gzip file and you probably want to extract it”.


> There is absolutely no reason why, if you provide tar a single .tar or .tar.gz with no other parameters, it doesn’t just say “ok this is a tar/gzip file and you probably want to extract it”.

The default behaviour is being held back by backwards compatibility with tar options being passed as bare arguments (e.g. `tar xzvf foo.tar` functions the same as `tar -xzvf foo.tar`), though the program could just check if the argument is a valid path to a tar file.

While we're at it though, the most annoying aspect of `tar` has got to be requiring the `-f` option. I don't see why it's required instead of just taking the first file passed as an argument as the input/output tar file.


> if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something

At least with GNU tar (and I think Darwin and some others), compression doesn't matter any more; `tar -xf foo.tar` correctly autodetects the compression and extracts even when foo.tar is gzipped.


> IMHO that's because we have software running that's over half a century, with organizations depending on it, and so it needs to continue running.

Well; as I said, I'm absolutely not arguing for somehow removing that feature—having the existing terminals continue working is definitely unavoidable, for the reasons you stated. However, that shouldn't mean there's zero movement in the space. We have several coexisting approaches to everything from operating systems to processors to web browsers; keeping legacy software functional shouldn't be an argument to hinder innovation in the entire space.

> All the tooling that deals with json, yaml, etc is still parsing strings, and I prefer that to a binary format in most cases

There is a subtle difference here: A tool reading JSON from standard input and parsing that is just parsing strings, too, yes. However, what about a platform that passed messages from one tool to another on some kind of channel designated for passing JSON messages, exclusively? I don't think we should just have a --json parameter for GNU utils, but really a formalised way for unrelated applications to exchange data, without each of them having to fend off parsing errors. And reality right now is even worse, with every one of those utils having their own, bespoke output format.

Re: ANSI escaping: This ties into the same point. Applications should have more to their disposal than just reading and writing strings. If there were a way to attach metadata to input and output, we could pass on stuff like what should be coloured how, without requiring escape code parsing downstream. I'm inclined to agree with you on all of this being intensive effort-wise, but that hasn't deterred us in other cases either.

> if someone can't figure out how to extract a tar archive just from looking at the synopsis on the man page and scrolling through the options, unless the archive was maliciously named to hide the fact that it's gzipped or something, I think that person would be a pre-junior developer and still have a way to go to become junior.

I can see where you're coming from, but that's just gatekeeping. The terminal is a great interface, but does it really have to be as undiscoverable as it is? IDEs offer so much introspection into source code -- why can't we even have that for the terminal (except shell-specific completion hacks, which are just a poor approximation of what we should have)? I'm not saying we should dumb it down to enable any fool to extract that archive, but UX-wise, it could be a lot easier to learn how something works. And yes, tar was a bad example. How about, say, `ip`? I've been using it for years, and still cannot remember how to do basic stuff with it without looking it up.


There are tons of efforts like that. https://xkcd.com/927/


First, don't like it? Don't use it. No need to cry over it.

Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

Third, not everyone needs to use 100s of MB of memory to render some idiotic emoji along the text using GPU accelerated routines. That old legacy stuff is lightweight, it's everywhere, so I can run stuff on platforms that probably existed before you were even born.

I myself love the CLI. It's a reliable, easy to use, fast!! interface. I can't really imagine a computer without a CLI.. It's like? WTF? :)


While no one is entitled to other's time for free software, your comment is needlessly dismissive and discouraging. People can have legitimate complaints and try to rally others to change the status quo. No one is encroaching on your right to use the software you choose. You also seem to have preconceived notions of what GP envisions. You can stick to your own thing, but don't stifle the creativity and fervor of others.


But I don't see it as being creative. He is just complaining, not even giving any new ideas.


> First, don't like it? Don't use it. No need to cry over it.

I have to use it. That's the point. The CLI is an important and great piece of technology; I just don't love the way things are, and would prefer to see them improved instead.

> Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

This is not how stuff like that works. Terminal interfaces are an ecosystem; there are a lot of different projects attempting new approaches, but there's no shared sense of the existing problems—as proved by your comment. I wrote my original comment to make more people aware of those issues.

> Third, not everyone needs to use 100s of MB of memory to render some idiotic emoji along the text using GPU accelerated routines. That old legacy stuff is lightweight, it's everywhere, so I can run stuff on platforms that probably existed before you were even born.

I didn't argue for that; in fact, I didn't even argue for removing the existing legacy stuff. Maybe try reading the original comment again? Just because we have a working solution doesn't mean we should stop innovating. There are tons of UX improvements possible beyond rendering emojis, such as proper inter-process message passing via standardised interfaces: just imagine piping from `ls` to any other process, and having that process know that it just got a list of file references instead of a bunch of characters (Powershell does something similar). Let terminals offer completions like IDEs do. Let them render proper progress bars without abusing braille characters. There are so many things people do just because they've been done this way since forever; they stop questioning whether there isn't a better way.
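
To illustrate (purely a hypothetical sketch; this is not any existing tool's format, and the field names are made up), a structured pipe could be as simple as newline-delimited JSON, with one program emitting file records and another reading fields by name:

  import json, os, sys

  def emit_listing(path: str = ".") -> None:
      # Producer: one JSON object per line instead of a text layout
      # every consumer has to re-parse.
      for name in sorted(os.listdir(path)):
          st = os.stat(os.path.join(path, name))
          print(json.dumps({"name": name, "size": st.st_size}))

  def large_files() -> None:
      # Consumer: read fields by name, no column counting or awk.
      for line in sys.stdin:
          entry = json.loads(line)
          if entry["size"] > 1_000_000:
              print(entry["name"])

  if __name__ == "__main__":
      # Acts as producer when stdin is a tty, as consumer when piped into:
      #   python tool.py | python tool.py
      emit_listing() if sys.stdin.isatty() else large_files()

Powershell passes real objects rather than serialised text, but the idea that "the receiver knows what it got" is the same.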


Well, innovating, or just bloating? Usually I see things being overengineered because we MUST innovate. No innovation = going backward.

Progress bar? what for? Is something like that not enough?

Transmitting files: 3/20 (12%), 675kB/s

Or something like that. Far more informative than just a progress bar. And it can render anywhere, easily, over 9600 bps too.
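
(And for completeness, that readout is trivial to redraw in place; a quick sketch, assuming nothing beyond a carriage return:)

  import sys, time

  total = 20
  for done in range(1, total + 1):
      pct = done * 100 // total
      # Redraw the same line: carriage return, no newline.
      sys.stdout.write(f"\rTransmitting files: {done}/{total} ({pct}%)")
      sys.stdout.flush()
      time.sleep(0.1)
  sys.stdout.write("\n")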


Sure, that is a progress bar. And it's something countless devs have wasted hundreds of man-years reimplementing over and over again, instead of delegating to the terminal application. Because we're stuck with handling transparent character sequences.


This makes no sense to me.

There is only one terminal that the user is going to use, and as an app author, you cannot choose which one it is. There is a good chance that your app will run in a terminal which was last updated many years ago, and you cannot do anything about it.

On the other hand, you can link to any library you want, and as an app author, you have full (or almost full) control over it. There is a good chance your app will use the libraries you want (especially if it's in a modern language), and in the worst case you can vendor the library locally.

It follows that we should have as much functionality as possible in libraries, and as little as possible in terminals. Your progress bar? Should be a library. If there is terminal support for it, it should be optional low-level "eye candy" (some might even call it "bloat"), with all the real logic (like ETA estimation, rate limiting, and formatting) implemented in libraries.

This applies to all your other ideas, too.

"proper inter-process message passing via standardised interfaces" - that exists, and has nothing to do with terminals. Depending on exact details, it is called JSON, DBUS, XML, etc... If you want Powershell-like "ls | select .name", then you don't need to mess with terminal either - it's all shell functionality (maybe with some env var conventions).

Terminals already offer completion like IDEs do, try hitting TAB. I agree it might be nice to use a different font for those, but there is no need for redesign, you only need a new escape sequence like "start/stop floating window" (but again, some might call this "bloat")


> First, don't like it? Don't use it. No need to cry over it.

> Second, don't like it? Write something better, showcase it, and maybe people will like it and start using it.

Anytime I use a graphical tool, command line aficionados practically reel in disgust. So I'd say that it's those people who need to get over the fact that some people don't like typing in a terminal to do their work and want to work differently, and often more efficiently.


GUI is more efficient? Joke.. JOKE :) GUIs have their uses, for example data presentation and visualization. Oh, I have nice tools here too. But data manipulation and querying? CLI only, or mixed, where you query in the CLI and get graph output.


The terminal is such a wonderful environment. It makes many tasks joyful.

I think it stems from its simplicity and logic.

You have a simple grid of characters. You type the name of a program, hit enter, and the computer executes it. The output of the program is written below your prompt. Below the output, you can execute the next program. You can chain output and input of programs, allowing all kinds of data flows. The programs can also take over the whole grid and thereby allow all kinds of interfaces.

I can't think of a way to describe a window manager or a browser in similar, simple terms, which would explain the basis of all the usefulness they bring.


Agreed. My only real problem is that font size and "UI" size (gutters/etc) being tied feels like an accessibility tax. Aside from that, they're wonderful.

I wonder if we could eventually make a "new term" which shares the same foundation, but atop a rich rendered experience. A GUI written with simple primitives (that could also probably be rendered as TUI, frankly), but with slightly more slick (and expensive lol) rendering.

I know, I know - blasphemy, I'm sorry. But despite my love for the Term (I've been in it for ~15 years now) I long for _slightly improved_ graphics.


Emacs seems to fit what you're describing pretty closely. It can render as a GUI and TUI even though the GUI can display things the TUI can't (like fonts/images)


It's almost like parent was intentionally hinting for someone to mention Emacs. Probably not, but the degree to which they described Emacs was kinda funny. It may be old, but it's been evolving, and I absolutely love it still. Especially now with LSP. And it's not even hard to get into, when things like Doom exist.


Agreed with the siblings, you're describing emacs! :)


It’s called emacs.


I thought Emacs was primarily an editor? Not a general purpose shell, subshells, muxer, etc - not a terminal. I thought emacs would be a singular app?

E.g. it sounds like you're saying that Vim is the Terminal if Emacs is the thing I'm describing?


A quick off-the-cuff remark based solely on the title: in 2024, I think the state of the terminal has never been better, in large part thanks to Microsoft making a high quality terminal easily available to everyone on Windows [1]

As an application author, I love being able to assume that all major platforms have a good terminal and that my favorite terminal rendering libraries should Just Work on all of them

[1] https://github.com/microsoft/terminal


I mainly use Microsoft Terminal now just because it did a better job handling more of the font decorations (italic was one that I recall). My one gripe though is that it seems like Microsoft Terminal is still kind of slow at scrolling. If I accidentally grep for something that spews a lot to the screen, the terminal is just plain hung up for a bit. And for many of us - but not everyone? - control-space in emacs is just kind of broken. There's a long-standing open issue for that one.


Have you turned on the new render engine in the Microsoft Terminal settings? It is marked beta but should be much faster.


I really dislike that they re-used the TERM value from xterm instead of getting their own merged into terminfo. And then not documenting which sequences are actually supported and which aren't.


You would be surprised how much software is built exclusively for TERM=xterm or TERM=xterm-*.


I'm aware and I think it's not okay. It's incorrect software.

And yes, that difference is significant because the F-keys, Ins, End and companions won't work.


You're getting downvoted but you're not wrong. Using TERM=xterm-* should be deprecated, and in this case in particular that likely means stepping on some toes.


It's like the user agent string in browsers. Using that instead of checking for feature availability directly is always going to end up in a mess like this over time. Heck, there wasn't a Windows 9 because of how much software would break, since it had an `if (osversion.startswith("9"))` check to decide if it needed to act in pre-XP mode, long after XP itself was EOL. Most people aren't willing to give up functional workflows on principle alone. "Stepping on some toes" massively understates the impact.


Which is exactly the root cause of terminal weirdness.

Instead of a standard, we have mountains of terminfo entries which are mostly slight xterm variants and ancient hardware


Sounds like the user agent string in web browsers.


They have their own, and it's documented.

https://invisible-island.net/ncurses/terminfo.src.html#tic-m...

Surely you can imagine that it takes years or more to get people to update though, so switching to a new entry right away could make for a bad user experience.


That's not made by them and not supported by upstream, see full thread: https://lists.gnu.org/archive/html/bug-ncurses/2019-08/msg00...


True, but no matter the terminal, you still have to deal with either cmd or PowerShell (and its horrendous startup time), "active code pages", and the fact that you cannot write UTF-8 to stdout without it getting messy - you need WriteConsoleW and/or SetConsoleOutputCP, but that affects the whole environment... The CLI on Windows was and is painful.


You can run WSL or nushell on Windows Terminal


PowerShell 7.4.2 starts in 1s for me.


zsh starts in 0.1s for me. 1000ms is a very noticeable latency to me...


Imagine how much work we would do with those additional ms per day.


Once upon a time, I was tasked to rewrite build scripts for a Windows-based team, from .bat files to bash scripts (via git-bash). People on that team were not exactly unix fans; in fact, they resisted hard almost everything coming from us linux weirdos. But the reduction in build time was so big they asked for this and embraced it with genuine love.


I bet it was a bit more than a few ms per day.


Does this mean I can run the Build Your Own Text Editor[0] tutorial in Windows without WSL? I did it in Linux a few years ago, recently heard of Crossterm[1] and was thinking trying to remake it with that, but maybe it's not necessary?

[0] https://viewsourcecode.org/snaptoken/kilo/

[1] https://github.com/crossterm-rs/crossterm


Crossterm does indeed work well under Windows Terminal. I use it indirectly via the ratatui library[1].

[1] https://github.com/ratatui-org/ratatui


Yes, although thankfully on Windows we have a modern graphical terminal, with a scripting language that took plenty of learnings from non-UNIX CLIs into account.


I really would like to have:

* text folding (https://gitlab.freedesktop.org/terminal-wg/specifications/-/..., https://stackoverflow.com/questions/52812618/ansi-escape-seq...)

* graphics (https://gitlab.freedesktop.org/terminal-wg/specifications/-/...)

I wonder a bit why all those next-gen (often HTML-based) terminal attempts have not been successful (e.g. TermKit: https://news.ycombinator.com/item?id=30517205). Probably because SSH/Mosh/Tmux concepts are difficult to carry over there, and those are somewhat crucial. And text folding or graphics would also have issues w.r.t. those (see also the linked discussions).

DomTerm (https://domterm.org) supports both those features, but it does not work with Mosh+Tmux, which is my main use case.


Accept a subset of HTML as tool output? So if a CLI tool wants to output a list, it could just do:

  printf("<li>item 1");
And legacy tools are obviously supported as raw text since their output doesn’t start with a magic html tag.


This is a nice summary.

It would be good if there was more emphasis on graceful degradation in TUI libraries, such as falling back to ASCII art when terminal graphics are not supported.

> Some new TUI libraries, such as vaxis, are designed specifically to avoid using terminfo at all and exclusively use queries to determine feature capabilities.

I take this approach in euporie [1], having encountered the same issues with using terminfo as discussed. However, there were a few terminal features which I could not find a way to query - iTerm2 graphics support, for example.

[1] https://github.com/joouha/euporie
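
For anyone curious what "use queries" looks like in practice, here's a rough POSIX C sketch (mine, not from euporie or vaxis): send Primary Device Attributes and inspect the reply instead of trusting terminfo. A parameter of 4 in the DA1 reply is how xterm-compatible terminals conventionally advertise sixel support; proprietary protocols like iTerm2 graphics are exactly the kind of feature that, as noted above, has no clean query.

  #include <stdio.h>
  #include <string.h>
  #include <termios.h>
  #include <unistd.h>

  int main(void) {
      struct termios old, raw;
      if (tcgetattr(STDIN_FILENO, &old) != 0) return 1;
      raw = old;
      raw.c_lflag &= ~(ICANON | ECHO);   /* read the reply byte-by-byte, unechoed */
      raw.c_cc[VMIN] = 0;
      raw.c_cc[VTIME] = 5;               /* give the terminal ~0.5s to answer */
      tcsetattr(STDIN_FILENO, TCSANOW, &raw);

      write(STDOUT_FILENO, "\x1b[c", 3); /* DA1: "what are you?" */

      char buf[128] = {0};
      ssize_t n = read(STDIN_FILENO, buf, sizeof buf - 1);

      tcsetattr(STDIN_FILENO, TCSANOW, &old);

      /* Reply looks like ESC [ ? 64 ; 1 ; 2 ; 4 ; ... c. This is a crude check;
         real code would parse the parameter list properly. */
      if (n > 0 && strstr(buf, ";4"))
          puts("terminal advertises sixel support");
      else
          puts("no sixel advertisement (or no reply)");
      return 0;
  }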


> Modifier keys like Ctrl and Alt complicate this situation.

This is such an understatement. Alacritty required you to manually bind keys to control characters in the config, and finding the control characters' escape sequences was such a nightmare that I gave up and switched to foot.


The wonkiness of modifier keys is a constant barrier whenever I try to do anything to make a terminal environment more ergonomic. In the end I always go back to 100% IDEs and GUI document editors.


Looks like VTE finally fixed their bug regarding Ctrl+arrows etc. I have few issues now.


It would be nice if more terminals other than xterm would support Tektronix vector graphics.

https://github.com/bennetyee/TekGraphics

Here's an emulator for those specific real terminals: https://github.com/rricharz/Tek4010


I don’t really want vector graphics, but more support for sixels would be massively helpful.


Are there any efforts to radically reimagine the entire "terminal" concept? Seems very limiting to advance further in the 21st century with those same 70's technical assumptions.

[edit] I was thinking more in terms of being unconstrained by low-res displays, inability to display anything other than text, etc.


There have been many unsuccessful such efforts. I'm only half joking if I say "the web" has been the most successful one to date. Or maybe "the browser window" is more precise, here.

The terminal is an odd, but historically understandable, mix of an API and a user interface, with the added UNIX flavor of "if everything communicates through text streams, then everything can talk to everything else" - which includes both humans and other programs. The clean way to go these days is of course to separate the two out (though, as with the terminal, arguably a REST endpoint is also a kind of user interface).

Powershell is a candidate for the "terminal reimagined" award in the sense that you're passing around structured objects rather than text streams internally, and so avoiding a lot of escaping/injection badness, even though the UI is a terminal window (but with a lot more autocomplete etc. features). (Personally, I hate its syntax, but that's a matter of style.)

The moment you want to go down the "UI reimagined" route while keeping as much cross-platform compatibility as possible, you end up with HTML/CSS and the layer of your choice on top of JS.


You might be interested in Arcan desktop engine ( https://arcan-fe.com ), which has a tui api for clients. It has been used to build an interesting experimental shell: https://arcan-fe.com/2022/10/15/whipping-up-a-new-shell-lash...


Yes, back at ETHZ, Xerox, TI, Genera,....

The terminal is a full graphics REPL, working in tandem with the mouse, and the languages have full access across the whole OS stack.

Even if its syntax is not loved by everyone, Powershell is the closest we have to it.


Plan 9 dropped the concept and went graphics-first, using a 2D engine called devdraw which you load assets into and issue draw commands against. Text is a first-class primitive, so drawing text is pretty simple. It has a shell, but it lacks cursor control, so stdout/stderr is written to the window's /dev/text - think of a window as a dumb textbox. If you need vt100 emulation you run the command in vt(1), which is a graphics program that emulates a tty terminal.

When you run a graphical program from the shell it opens in the same window, as it should, because the window is more like a canvas. The window manager rio(1) just multiplexes the /dev/draw device, so you can run any graphical program from the initial graphical terminal - a separate window manager isn't needed. You can also run rio inside rio as much as you want - it's devdraw all the way down. If you really wanted to preserve a TUI program you could run it in vt(1), or I think there is a curses library that does the draw stuff on the back end, skipping the vt emulation.

In this day and age I see no reason to perpetuate the tty terminal outside of Unix as it's an obsolete concept.


That sounds like a replacement for X11/Wayland, not for the terminal.

And this of course sounds fine and nice, until you need to interop with other machines / OSes - in which case you open vt and ssh to whatever location. Or you want your application to be callable by another machine (or via ssh, or from a script or cron...) - in which case you eschew all the draw commands and do good old print. And now we are back to where Unix is.


There have been many. But none of them really successful, in the sense that they were more proof-of-concept, never fully feature-complete, or just not adopted by the community.

See the list here: https://github.com/hoeck/schirm

I think the most popular was TermKit (https://github.com/unconed/TermKit, https://news.ycombinator.com/item?id=30517205).

Maybe the most successful such attempt is DomTerm (https://domterm.org).


What assumptions specifically?

Terminals, at the protocol level, assume nothing. It’s just streams of bytes which are processed then rendered.

The protocol makes no technical assumptions, which is why it has both lived so long and is such a mess


One assumption is that it's one stream of bytes (okay, two if you count stderr), and so control and metadata have to be interleaved with the content to be displayed.


I think that it’s a feature of the protocol that it’s stream based. It allows for a fairly responsive interface even in rough network conditions.


No reason the transmission protocol and the application/ui protocol need to be coupled in their level of abstraction. You could send the same data plus just enough bytes to know which data goes with which stream, and then a terminal app could see a content stream and a control stream. I'm not spec-ing out a concrete protocol design here, just pointing out that this is a fundamental assumption that leads to much of the (lack of) ergonomics in the current approach - namely the interleaving of control characters and display characters. Any additional functionality we want has to be squeezed down into that representation, which makes adding it cumbersome.
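
Purely as a strawman for what "just enough bytes" could mean (nothing like this exists in today's terminal protocol), a framed variant might prefix each chunk with a stream id and a length, so control data never has to hide inside the displayed text:

  #include <stdint.h>
  #include <stdio.h>
  #include <string.h>

  enum { STREAM_CONTENT = 0, STREAM_CONTROL = 1 };

  /* 3-byte header: stream id + big-endian payload length, then the payload. */
  static void send_frame(FILE *out, uint8_t stream, const void *data, uint16_t len) {
      uint8_t hdr[3] = { stream, (uint8_t)(len >> 8), (uint8_t)(len & 0xff) };
      fwrite(hdr, 1, sizeof hdr, out);
      fwrite(data, 1, len, out);
  }

  int main(void) {
      const char *text = "hello, world\n";
      const char *ctl  = "{\"op\":\"set-title\",\"value\":\"demo\"}"; /* hypothetical control message */
      send_frame(stdout, STREAM_CONTENT, text, (uint16_t)strlen(text));
      send_frame(stdout, STREAM_CONTROL, ctl,  (uint16_t)strlen(ctl));
      return 0;
  }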


> No reason the transmission protocol and the application/ui protocol need to be coupled in their level of abstraction

They aren't, as there isn't any application protocol. Ncurses is the closest you'll get. But several TUI frameworks exist (like ncurses and the charm.sh packages) which abstract the low-level protocol (let's call it a wire protocol).


I know there isn't any. The topic is what innovations are possible. One possible innovation is to create such a protocol.


Sure, but I’m arguing that replacing the very minimal protocol with a heavier one would not be an innovation nor would it be useful at all as you would always need some kind of low level streaming protocol, unless you’re looking to replace pretty much all of Unix land, but that’s a whole different discussion.


I don't see why the mere existence of lower level transport details must preclude discussion of higher level developer ergonomics.


A few have kicked the tires but the fact you’d have to “boil the oceans” and rewrite every program and script in existence makes it a huge amount of work.

You’d most likely have to maintain backwards compatibility also, so you’d just create N+1 complexity, rather than simplifying things.


What do you suggest?


Different person here, but I wish there was a terminal where mouse clicks moved the text cursor (without holding any hotkey), mouse drags selected text and typing replaced the selected text, ctrl-A selected the entire current command (all the text after $ ), holding left/right arrow went WAY faster…

Basically, if you know how to competently edit an email, you shouldn't struggle in frustration at editing a big terminal command. I believe that some day there will be a terminal app you can sit a kid down in front of, and they'd be able to fix a typo without any frustration or special knowledge. That day hasn't yet come.

I get the historical traditions for why terminal is the way it is, and I get why people who already know the secrets don't care to change it, but IMO it's time to move on.


> Basically, if you know how to competently edit an email, you shouldn't struggle in frustration at editing a big terminal command. I believe that some day there will be a terminal app you can sit a kid down in front of, and they'd be able to fix a typo without any frustration or special knowledge. That day hasn't yet come.

In Bash, set $EDITOR to whatever you want - even a GUI editor - and then hit (by default) ctrl-x ctrl-e; it'll open the current line in your editor, and when you save+close the editor the command will run.

https://unix.stackexchange.com/questions/85391/where-is-the-... seems like a decent discussion of the feature, and its footguns (it does execute whatever was in the buffer when you close the editor without confirmation), and even some talk of how to improve it a bit.


> without holding any hotkey


It's up to individual applications how they implement user input. All of this is entirely possible in modern terminal emulators - look at the micro text editor, for example.

I guess most shells (bash, zsh, etc.) keep things "traditional" for backwards compatibility reasons.
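
To make that concrete, here's a rough POSIX C sketch (standard xterm mouse-reporting sequences; mine, not anything from micro): an application opts in to receiving mouse events, reads a few, and turns reporting back off. Whether your shell does this is an application decision, not a terminal limitation.

  #include <stdio.h>
  #include <termios.h>
  #include <unistd.h>

  int main(void) {
      struct termios old, raw;
      if (tcgetattr(STDIN_FILENO, &old) != 0) return 1;
      raw = old;
      raw.c_lflag &= ~(ICANON | ECHO);
      raw.c_cc[VMIN] = 1;                /* block until at least one byte arrives */
      raw.c_cc[VTIME] = 0;
      tcsetattr(STDIN_FILENO, TCSANOW, &raw);

      /* 1000h: report mouse button events; 1006h: SGR encoding, e.g. "\e[<0;12;3M" */
      write(STDOUT_FILENO, "\x1b[?1000h\x1b[?1006h", 16);
      printf("click in the window a few times...\n");

      char buf[64];
      for (int i = 0; i < 3; i++) {
          ssize_t n = read(STDIN_FILENO, buf, sizeof buf - 1);
          if (n <= 0) break;
          buf[n] = '\0';
          printf("event %d: %zd bytes: %s\n", i + 1, n, buf + 1); /* skip the ESC for display */
      }

      write(STDOUT_FILENO, "\x1b[?1000l\x1b[?1006l", 16);   /* turn reporting off */
      tcsetattr(STDIN_FILENO, TCSANOW, &old);
      return 0;
  }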


>Different person here, but I wish there was a terminal where mouse clicks moved the text cursor (without holding any hotkey)

Umm, this is a `feature` in Konsole that I ___HATE___. I'd have a file open in vim that I'm editing. I'd also have a browser window open, and I'm trying to copy text over to my terminal.

Well, wherever you click in the terminal window to 'activate' it for pasting, the cursor jumps to that place regardless of where it was before. IT'S HORRIBLE!!

I'd much rather click on the window, then ctrl-v to paste my clipboard. Now I have to be extra careful and click only on the window decoration/header or frame. This has caused me dozens of missed pastes.


For vim it’s something like :set nomouse to disable this. I go out of my way to disable mouse control in the terminal too


Some terminals support mouse input without any special modifier keys. I’ve used vim in iterm doing exactly that.


Maybe some kind of subscription service?


I think you're on to something, maybe let's put AI in it too?

Oh, someone beat us to the punch... https://www.warp.dev/



I'd be all over that if it didn't phone home for some reason.


I love the abstract idea of the command line, but not its textual implementation. The best currently available old-school shell is Nushell, IMHO.

What I really want is a GUI that works like a command-line, but with rich media and mouse support. Instead of 100% key input I would sometimes like to click and add/remove/filter data.

The terminal is so useful; still, it's strange that we tolerate all its shortcomings to this day.


Just a shoutout to how beautiful and powerful terminal editors can be if you invest some time in learning the keybindings. There are some very powerful configurations, and you don't need to invest that much time in configuration if you don't want to.

Currently, if you like visuals and want a super fast experience, one easy setup (my favorite) is to install neovim and nvchad.com (and, as an extra, neovide.dev).


"This has a multitude of problems, however. The terminfo database is part of the ncurses library, and different operating systems and distributions package different versions of ncurses. This was a problem for tmux users on macOS for many years because the version of ncurses packaged with macOS was so old that it did not even include the tmux-256color terminfo entry at all!"

I build static Linux userlands from scratch using only busybox/toybox and a statically-linked GCC.

When compiling programs, e.g., tmux, I have been using netbsd-curses instead of ncurses.

After I compile tmux statically, the terminfo entries needed by tmux, i.e., needed by the curses libraries compiled into it, are contained by default in the directory $HOME/.terminfo. If nothing is found there, tmux then checks /usr/share/terminfo (or whatever path was used when compiling, e.g., /usr/local/share/terminfo).

This directory can of course exist apart from a curses library installed in a directory. In the userlands I make, there is no curses library installed in a directory after I have finished compiling all the programs.

For example, I can make a tarball of the terminfo directory from any suitable up-to-date distribution and untar it on the target computer.

Or I can make a hashed database of only the entries I need.

For me, the terminfo database is not "part of the curses library"; curses libraries provide an example terminfo database and a utility to compile terminfo entries. If the entries were truly part of the library, then one might argue they would be compiled into programs along with the library when compiling statically.

I keep a personalised copy of a terminfo database, i.e., a tarball of a terminfo directory, separate and apart from a curses library.

To be used with whatever curses library I choose. Not to suggest there is a great variety of curses libraries to choose from.
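
As a small illustration of that separation (my sketch, assuming ncurses or netbsd-curses headers; link with -lncurses or -lterminfo): even a statically linked binary usually resolves its entry at runtime from the filesystem, typically consulting $TERMINFO and ~/.terminfo before the compiled-in system path.

  #include <stdio.h>
  #include <stdlib.h>
  #include <curses.h>
  #include <term.h>

  int main(void) {
      const char *term = getenv("TERM") ? getenv("TERM") : "(unset)";
      int err = 0;
      if (setupterm(NULL, 1, &err) != OK) {     /* NULL means: look up $TERM */
          fprintf(stderr, "no usable terminfo entry for TERM=%s (err=%d)\n", term, err);
          return 1;
      }
      const char *smcup = tigetstr("smcup");    /* "enter alternate screen" capability */
      printf("TERM=%s  colors=%d  smcup=%s\n",
             term, tigetnum("colors"),
             (smcup && smcup != (char *)-1) ? "present" : "absent");
      return 0;
  }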


NetBSD curses actually does compile a few terminal descriptions into the library

https://ftp.netbsd.org/pub/NetBSD/NetBSD-current/src/lib/lib...

ansi, dumb, vt100, vt220, wsvt25, xterm

Unfortunately, netbsd-curses for Linux removed vt220 while almost tripling the size of the list

I prefer vt220 and am not interested in 256 colors.

https://raw.githubusercontent.com/sabotage-linux/netbsd-curs...

xterm, linux, xterm-256color, ansi, dumb, vt100, screen, screen-256color, tmux-256color, rxvt-unicode, rxvt-unicode-256color, st-256color, dvtm, dvtm-256color, fbpad-256


Retro-plug: here's a Teletype ASR33 simulator with sound https://www.youtube.com/watch?v=jd5oomwEBb0


Exciting to see the state of the terminal evolving! While there's a lot of legacy to contend with, efforts like Microsoft's new terminal are definitely steps in the right direction. It's a fascinating area to watch, and I'm curious to see how tools like the alternative shell and terminal emulator mentioned will shape the future of text interfaces.


I use dvorak keyboard layout and the "." is a prime real estate location on the keyboard. Unfortunately, ctrl+. does not work in emacs over terminal.


I have transitioned to a category theoretical terminal interface (CTTUI). At this terminal, whenever I have an idea and I want to see it in practice, instead I think about some nice happy functors dancing in some nice happy 2-category of functors and natural transformations. Sometimes I will draw a string diagram proof that looks like a bowl of spaghetti, which makes me hungry. Very soon the desire to use a computer passes. And that is how 50 years of awful terminal tech debt is universally avoided, just by tapping the power of categories!


Do any terminals support overprinting? This could radically change what is possible in TUIs


I don't know about overprinting per se, but you can already use whatever foreground+background colors you want; how much benefit would overprinting be?


This would allow more complex shapes to be drawn by combining multiple glyphs, which would be useful for complex TUI applications (there are only a limited number of box-drawing characters available). Also, this could allow drawing more than two colours per terminal cell.


That's an interesting idea. The closest thing I can think of is using an extended box-drawing font: https://hpjansson.org/blag/2024/01/18/chafa-1-14-all-singing....

That doesn't solve the two-colors-per-cell challenge. But it makes for some very realistic terminal images.

For higher fidelity rendering, I think the next best option is supporting image protocols like sixels or Kitty inline raster images. I'm not sure how overprinting would compare to existing options.


I've just found that Mintty supports this

https://github.com/mintty/mintty/wiki/CtrlSeqs#overstrike


Oh, like making arrows by overlaying - with > or ^ with |. I could see that.


That's why I appreciate MS's efforts to improve PowerShell and Terminal.


As expected, a bunch of autists arguing in here.


Where have you seen arguing?

Unless moderators purged a bunch of comments, seems like people are just discussing the history of terminals and how they could be improved going forward.



