In some of my experimental terminal emulators I've decoupled these and it's not so hard -- there's a piece that processes command output as fast as it can, and then there's another piece that updates the screen as fast as it can, and as long as those two pieces are decoupled then even if your graphics hardware is 5fps it's still fast to cat a file.
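The decoupling described above can be sketched in a few lines (a minimal illustration under my own assumptions, not how any particular emulator is structured; all names are made up):

```python
import threading

class Screen:
    """Shared screen model: the parser writes as fast as input arrives,
    and the renderer samples it at its own cadence."""
    def __init__(self):
        self._lock = threading.Lock()
        self._lines = []

    def append(self, line):
        with self._lock:
            self._lines.append(line)

    def line_count(self):
        with self._lock:
            return len(self._lines)

def parse(screen, source):
    # Consumes "command output" at full speed; never waits on the renderer.
    for line in source:
        screen.append(line)

def render_once(screen):
    # Draws whatever the latest state is. A slow frame just means the next
    # frame shows a bigger delta; it never back-pressures the parser.
    return screen.line_count()
```

The key point is that a slow `render_once` only lowers the frame rate; `parse` keeps chewing through output at full speed.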
In iTerm2's case it's because of bad decisions about a decade ago. The data model is read & written on the main thread. The main thread is solely responsible for drawing operations if you use the standard drawRect: interface on NSView. That means that a slow draw blocks it from processing more data. Other work, like reading from the socket and parsing bytes into tokens was offloaded onto other threads years ago. Unwinding all the assumptions about single-threadedness is nearly impossible. The only way to reduce the main thread's workload is to use a GPU renderer because building the data structures the GPU needs can be done off the main thread after a quick copy of the necessary state.
Apple's sorry hardware situation recently pushed me to move to Linux on a ThinkPad for most of my daily use, so I don't use iTerm2 as much now as I used to. (And, naturally, just after the return period on my ThinkPad expired, Apple rolled out an MBP with 32GB RAM X()
Your software made my life better every day for years. It was the first program I downloaded and installed after LittleSnitch when I set up a new Mac. Thank you.
BTW, I'd love to help to add overline support. Mind if I open an issue and ask for some pointers?
If only this were true. There are plenty of games where the frame rate is capped at 60 Hz because the physics engine is coupled to it. If you force a framerate increase via .ini tweaks or something, you get really weird physics; I think Skyrim had this problem. There are also examples where changing the network tick rate can affect the physics engine.
To me, that's bad software design. 40 years ago, developers used to sync their logic to the refresh cycles of the target machine's display. That's amazing, but also something that should not happen in 2018 (or 2010 or 1998).
For the younglings: it was a physical button to underclock the CPU on old PCs.
Can't a virtual world be built with the same principles?
The only place it might cause issues is if you need it to be deterministic, e.g. for multiplayer.
Obviously for most models it is simple to adapt to variable simulation periods without instability, either by microstepping (which commonly occurs anyway because the natural simulation freq is much faster than 60Hz), or because the process is just solving a stable DE or doing basic physics over a different period.
For debugging it is often much easier to run physics at a fixed rate. You do get different results at different rates due to numerical precision issues and error accumulation.
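The usual way to keep physics at a fixed rate while the renderer runs at whatever rate it can is the accumulator pattern ("fix your timestep"). A minimal sketch, with arbitrary constants; a power-of-two step keeps this demo numerically exact:

```python
DT = 1.0 / 128.0  # fixed simulation step, independent of display refresh

def advance(ticks, frame_time, accumulator):
    """Run zero or more fixed-size simulation ticks to cover frame_time.

    `ticks` stands in for the whole simulation state; the point is that
    the number of ticks depends only on elapsed time, not on how that
    time was sliced into frames.
    """
    accumulator += frame_time
    while accumulator >= DT:
        ticks += 1        # a real engine would step the physics here
        accumulator -= DT
    return ticks, accumulator
```

Two runs covering the same wall-clock time take the same number of simulation steps whether the renderer delivered two long frames or four short ones.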
Most games can simply "drop frames" when the renderer isn't keeping up with the engine. The result is a minor loss of visual fidelity, which usually isn't too bad unless the FPS is horribly low. Things are trickier for terminals, because 1) they need to maintain scrollback history and 2) their output is more blocky, so "dropping frames" leads to horrible-looking ASCII animations.
Some terminals do start dropping output when the renderer can't keep up. Others (for example, xterm) refuse to do that by default, because they assume the underlying program may do a better job of handling a low framerate than a general-purpose terminal emulator can.
Either way, if you want fluid, visually pleasing rendering, your renderer should be as fast as possible.
Unless you're already using Vulkan/Metal, your render thread will almost certainly be in lockstep with the GPU, and that will block your CPU throughput.
You haven't described CPU/GPU independence at all. You're just talking about sacrificing the FPS for CPU perf.
You could add a buffer but if you can't drop frames that will only smooth out vsync wait.
It’s not trivial to implement in an existing codebase. Quoting from iTerm2’s Subpixel Antialiasing document:
> iTerm2’s design burdens the main thread with handling input from the TTY. A better design would perform only drawing and other UI activity on the main thread, but it’s too hard to change as there are thousands of implicit assumptions baked in to the code.
I really wish iterm was cross platform. I’ve realized when switching between Mac and Linux how important the terminal is to my development and quality of life.
And if you push the settings right, you can use it to convince the procurement department that you need a new monitor.
Cool, it even works over SSH! Wait, what? This lets a remote server take over your workstation.
However, the feature I prize the most is quick responsiveness when I type. For that, I still find xterm extremely good.
For me, one of the takeaways from this is that time taken to cat a large file is a poor benchmark of a terminal emulator.
This article also includes a nice demonstration of yet another application of Rust - as an easily reproducible and highly reliable way to load your machine so hard it interferes with your terminal.
We’re not talking about wattage but energy consumption: a single core working at its max for 5 seconds can use more energy than three cores working for one second. It’s a question of efficiency, not of how many cores are working at once.
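The distinction is just energy = power × time; with made-up numbers (the wattages here are hypothetical, purely to show the units):

```python
# Hypothetical numbers: energy (J) = power (W) * time (s)
single_core_energy = 10.0 * 5.0       # one core drawing 10 W for 5 s   -> 50 J
three_core_energy = (3 * 8.0) * 1.0   # three cores at 8 W each for 1 s -> 24 J

# Higher instantaneous wattage, but less total energy: "race to sleep".
assert three_core_energy < single_core_energy
```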
Here is an example of GPU rendering with support for Acrylic (Windows) / Vibrancy (macOS), which is so popular in terminal implementations:
There's also a lot of preprocessing to do before sending work to the GPU. For example, computing the color of each cell is a very complex algorithm because foreground and background colors interact with each other and also determine the font. For example, you can enable thinner strokes only when it's light text on a dark background. Work can't be sent to the GPU until the text is rendered into a texture, which means colors must be known. There's room for optimization here for common cases, but I haven't taken that on yet. My goal for 3.2 is for it to be an improvement over the legacy renderer and a solid foundation for iteration in the future.
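As a rough illustration of that kind of per-cell decision, the "thinner strokes only for light text on a dark background" rule could look something like a brightness comparison. This is a sketch under my own assumptions (the luma formula and threshold are my guesses, not iTerm2's actual code):

```python
def perceived_brightness(rgb):
    """Rec. 601 luma of an (r, g, b) tuple with components in 0..1."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def use_thin_strokes(fg, bg, threshold=0.5):
    """Enable thinner strokes only when light text sits on a dark background."""
    return (perceived_brightness(fg) > threshold
            and perceived_brightness(bg) < threshold)
```

The point is that the decision needs both colors resolved first, which is why color computation has to finish before glyphs can be rendered into a texture.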
In Terminal, I can cat a 1.5M-line file in about 5 seconds; it takes about twice as long in the new iTerm beta.
But iTerm is certainly updating the screen more often than Terminal (it looks smoother at least), so maybe that's because Terminal is doing a better job of allowing the screen buffer to update independently of the drawing thread.
It's a minor thing, but may be something to consider.
Occasionally, though less frequently, you can run into similar symptoms just running build scripts.
"Important: It is highly recommended that you run the make.sh script in either a very fast terminal such as xterm (the GNOME terminal and the OS X terminal are too slow) or that you run it in a detached GNU screen session (use C-a d to detach the session and screen -r to resume it). The SBCL compile produces lots of output and your system's terminal program will likely slow down the compile in a drastic manner."
I've had this handle since around 1998, but other people started grabbing it in other forums after that. I'm not the twitter ninkendo nor the steam ninkendo either.
I only have this handle without trailing numbers on here, Ars Technica's forums, and gmail. (Formerly reddit too, but I deleted my account there when the site became a bit too toxic.)
I’ve found the performance of iTerm in particular to be far inferior to builtin, especially when the buffer grows large. iTerm even has a fixed buffer size by default and still has worse performance.
Alacritty is supposed to be fast, but then doesn’t support any scrolling, has a poor window resizing flow, and doesn’t have tabs. The suggestion is to use tmux for all that, but then Alacritty+tmux is slower than builtin.
- “performance” above means smoothness of scrolling and general “perceived performance” type things. Haven’t actually tried any real tests.
* tmux integration. Whether I'm in a remote server or my own machine, just use -CC and I can now use tmux with the same keybindings I would use without tmux.
* Watch for the completion of a command. Especially useful if you happen to be using a language with a slow compiler like Haskell or Rust. Plays a nice ding when the command is done. (This may require shell integration).
* Even if you forgot to run your command with time, iTerm can still tell you the wall clock time of your command. (Shell integration too).
And then just small things that the builtin Terminal doesn't have:
* focus follows mouse
* smart selection (for example it could detect a URL and you can select the whole URL, then command-click to open)
* copy on select
defaults write com.apple.Terminal FocusFollowsMouse -bool true
I use it mostly when working through SSH on the go and with slow connections.
CMD+O brings up the profile switcher, type the tag I want and hit enter.
Profiles let you configure cwd, commands to run (virtualenv activation), theme colours, and a bunch of other things.
My “prod” profile ssh’s into a management box, sets the background to pink, and puts a “PROD” badge in the top right corner.
For me it's simply that I couldn't get custom colors (e.g., the Solarized Dark theme) to display correctly with built-in Terminal. I haven't tried in 5+ years though -- maybe it's fixed now, and/or maybe I was doing something wrong at the time. But iTerm "just worked", and I never noticed performance issues so once I switched I never looked back. shrug
Alacritty was just an experiment at the time.
For example, on double click the Terminal.app will select `path.txt` from `some/long/path.txt`. iTerm will select the whole thing and copy it to the clipboard.
It was nice, though.
Heck, I about dumped OS X when Apple changed the default application opening method. It used to be that when an application was loading and you switched to another window, the application opened behind what you were now working on. Or if you were working on something and another application in the background opened a window, you only got the jumping icon to let you know.
Today, applications can hijack focus pretty much whenever they want, aping one of Windows' most annoying UI features.
I use tmux coupled with that, btw.
I usually enable the readline-style keybindings system-wide (move character forward/backward, move word forward/backward) using a custom DefaultKeyBinding.dict, but then I typically have to figure out workarounds for programmer-type applications that try to resupply these shortcuts, expecting the keys to be bound to random Unicode characters rather than already rebound to the actions. As I recall, with my customization the keybindings don't work by default in iTerm2 and I had to go in and change something to get the correct behavior (fortunately there is an option). Many programmery applications have the same issue (fortunately it's mostly possible to customize them all).
So it's still possible to get the emacs keybindings at all text insertion points system-wide, but it's a hassle. In the glory days of OS X, pre 10.5, none of this was necessary: option-f, option-b, etc. were just bound to the right thing by default rather than printing pointless Unicode characters. I'm still annoyed by that change, and by the fact that a bunch of complicated configuration is required to get the right behavior nowadays.
I don't really miss any iTerm features. Have to look up what's new!
That being said, iTerm2 is plenty fast enough for my purposes as it is today.
However, I find that zsh + oh-my-zsh works just as well. After typing the beginning of a command, you can use the arrow keys to cycle through all commands in the history with the same beginning.
A third method for efficient history search is ctrl+r: just press ctrl+r and start typing part of the command, then press enter to execute or ctrl+r again to cycle through the history.
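For what it's worth, the arrow-key prefix search doesn't actually need oh-my-zsh; stock zsh ships the widgets for it. A minimal ~/.zshrc fragment (the escape sequences assume a typical terminal's arrow keys, so check yours with `cat -v` if they differ):

```shell
# ~/.zshrc -- prefix-matching history on the arrow keys, plain zsh
autoload -U up-line-or-beginning-search down-line-or-beginning-search
zle -N up-line-or-beginning-search
zle -N down-line-or-beginning-search
bindkey '^[[A' up-line-or-beginning-search    # Up arrow
bindkey '^[[B' down-line-or-beginning-search  # Down arrow
```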
But after that, I'd suggest giving fzf (https://github.com/junegunn/fzf) a try.
M-/ (opt/alt + /)
If you're nervous about running shell commands here they're the same as the installation on the site if you care to check:
sh -c "$(curl -fsSL https://raw.githubusercontent.com/robbyrussell/oh-my-zsh/master/tools/install.sh)"
git clone https://github.com/zsh-users/zsh-autosuggestions $ZSH_CUSTOM/plugins/zsh-autosuggestions
Combine this with control+p to cycle through previous commands and you can become a wizard at the command line quickly
1 - https://github.com/robbyrussell/oh-my-zsh
2 - https://github.com/zsh-users/zsh-autosuggestions
I use fish for scripting but run zsh in the terminal. (fish had a couple of oddities in day-to-day terminal use, and zsh could do pretty much everything fish could with zplug.)
While bashisms may give additional features, readability and portability suffer. I much prefer scripts using a POSIX-ish #!/bin/sh and a strict subset of shell syntax. Something like pdfx.
The FPS meter only works with the Metal renderer, so I can't directly compare it and the CPU renderer, but just from a subjective perspective, I notice slightly worse performance from the CPU renderer while scrolling, but nothing major.
Beyond that, though, how often are you scrolling, and how buttery smooth does that have to be to be a good experience? While it does seem smoother side-by-side, I'm not blown away by the difference.
I have set triggers that color the text / give out osx notifications if there is a string somewhere like Error/Exception.
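A trigger of that kind is essentially a regex applied to each output line. A toy sketch of the idea (the pattern and color are arbitrary; iTerm2's real triggers are configured in its preferences UI, not in code):

```python
import re

# Hypothetical pattern; a real setup would match whatever your tools emit.
ERROR_RE = re.compile(r"\b(Error|Exception)\b")
RED, RESET = "\x1b[31m", "\x1b[0m"

def highlight(line):
    """Wrap any Error/Exception match in red ANSI color codes."""
    return ERROR_RE.sub(lambda m: RED + m.group(0) + RESET, line)
```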
It basically brings terminal vim on a par with MacVim in rendering speed.
3.2.x will be focused on rendering improvements. The full change log is here: https://iterm2.com/appcasts/testing_changes3.txt
3.3, which is currently under development, is adding a Python scripting API (https://iterm2.com/python-api/) and fixing tab and window titles to make sense, plus adding a status bar.
Feature requests that are candidates for 3.4 are here: https://gitlab.com/gnachman/iterm2/milestones/8
If you have something in mind file an issue. Y'all are my PMs.
There is already support for ligatures (prefs>profiles>text>enable ligatures). The Metal renderer does not support them because that's an extra level of complexity I can't handle in the first release. It would require constructing CTLineRefs which is already the slowest part of drawing. Subpixel antialiasing also makes this really difficult, but since that's going away in Mojave I'll revisit it.
Future performance enhancements that I have in mind are:
- Faster tokenization (including revisiting SIMD)
- Mojave-specific optimizations (removing subpixel AA and its knock-on consequences)
- Move per-frame setup work into a compute shader. This is tricky because some people have CPUs that are relatively faster than their GPUs (thanks Intel!)
- Reuse the preceding frame's rendered texture when updating less than a full frame
- Move work off the main thread where possible
Looking forward to Mojave dropping support for subpixel-antialiasing and maybe iTerm supporting Ligatures with the new drawing engine someday.
Not about setting ligatures for print.
So the whole argument is kind of moot. Having ligatures in console vi is not what made the Mac "historically be far ahead of other platform". In fact for the first 15 years of its life the Mac didn't even have a console.
Even in the desktop GUI program world, where ligatures in programming IDEs and editors are a possibility, they're still an extremely fringe endeavor.
That said there may? be languages other than English where ligatures are more than decorative, and would be more useful to have in a terminal.
They could start by not mimicking 50+ year old hardware terminals down to arcane (and performance killing) details, supporting better and more powerful autocomplete and similar interactions, inline images, full colors, and so on...
And I might be wrong, but in that world, ligatures and transparent terminals are not really as important as rendering speed, latency, and/or less CPU load for a terminal emulator.
In fact, even where available, ligatures are a fringe option, adopted by very few people.
(I'm always talking about programming editors. Though even in print, their main domain, they're hardly mainstream these days).
Why is the thing I don't notice objectively more important than my convenience and happiness?
1. make the app full-screen;
2. make the app constantly vary the screen output as often as it can;
3. capture the HDMI output of your computer with a hardware video capture box;
4. use some scriptable video software to calculate, for the captured video, how many distinct frames (i.e. changes in content from the previous frame) it contains per second, on average.
You could also substitute a high-speed camera pointed at your monitor (necessary when the limiting factor is the monitor's refresh rate).
Conveniently, if the capture box emits video in a format with inter-frame compression, you can do the new-frame counting at a low level, without decompressing the video: parse through the stream, discard the "no change" delta-frame chunks, and count the remaining frame chunks grouped by the start-timecode field cast to seconds. Which is, I assume, what OS file-indexing services do to calculate a video's FPS, for those that do. This answer is slightly high if the encoder inserts keyframes even when nothing changed, but it should average out to essentially the same number in any video longer than a few seconds.
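Step 4 boils down to hashing each captured frame and counting changes. A toy version over raw frame buffers (capture I/O and video parsing omitted; this just shows the counting):

```python
import hashlib

def distinct_fps(frames, capture_fps):
    """Average number of frames per second that differ from their predecessor.

    `frames` is an iterable of raw frame byte buffers captured at a fixed
    `capture_fps`; consecutive duplicates (no screen change) don't count.
    """
    last = None
    changes = 0
    total = 0
    for frame in frames:
        digest = hashlib.sha256(frame).digest()
        if digest != last:
            changes += 1
            last = digest
        total += 1
    seconds = total / capture_fps
    return changes / seconds if seconds else 0.0
```

Hashing rather than comparing raw buffers keeps only one frame's worth of comparison state around, which matters at high capture rates.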
- killer tmux integration (tmux sessions on a remote machine behave like normal tabbed panes, no idiotic prefix-key, broken scrolling and selection etc. etc.)
- the ability to rebind just left or right command keys to meta (and leave the other one working for "normal" copy and paste etc.)
- instant replay and generally very sophisticated output and output history handling (e.g. it's trivial to copy the last command's output or autocomplete from output history, all via keyboard shortcuts).
- command click to open e.g. files output by ls (which requires keeping track of cwd at output time)
- input broadcasting
- simultaneous search across all terminals
- mature and polished (renders unicode, including line drawing characters reliably, deals with large output history and many tabs etc. just fine)
- image output
Alacritty (whilst interesting) has precisely zero of these features. In fact, I'd already be delighted to find a single Linux terminal that has tmux integration.
On linux, few terminals support this, and the only one that seemed really usable was roxterm, which was increasingly difficult to install as of Ubuntu 17.10 when I last tried, due to the ROX collection being abandoned. It looks like there's new activity in a github repo this year, but I haven't tried it.
I built a very minimal side-tabs plugin for Zeit Hyper, but updates broke it so regularly that I gave up (the plugin was never great, anyway, and avoiding collisions with other tab-affecting plugins was difficult).
There are several other terminals on linux that claim to support side-tabs, but they all have some issue or another (my "favorite" is when they "support" side-tabs by literally rotating the tab bar so that the text of the bar is sideways, distorted, and still shrinks as more tabs are used, which defeats the point of using side-tabs in the first place!).
The developer was working on the fix, but I have not seen a fix announced in the release notes yet.
> When your device is not plugged into a charger. This is controlled by Prefs > Advanced > Disable Metal renderer when not connected to power.
Why is the GPU renderer, by default, disabled unless plugged in? I’d assume the GPU renderer uses less power...
The Dynamic Profiles feature is fantastic -- being able to version control a subset of the configuration document is exactly what I wanted to be able to do. But I'd like to understand the settings interface/model/schema better.
Especially when I would want to test it right now, sitting here on my bed. :)
But the old annoying bug where an accidental Ctrl-Enter keystroke blocks the terminal indefinitely is still there, and has been for about a year. The regexp detection is also pretty unstable and can block the window, in either the Semantic History or the URL detection.
Seems like they are still using GitHub but use GitLab for the wiki?
VSCode / Hyper 2 are GPU-accelerated and are really fast once loaded. Because of their use of Electron, they are not exactly fast to load, though.
Actually, try it anyway: when I first ran into this fix, I didn't think I had a mouse connected, but some non-mouse USB device was reporting to macOS that it also had a mouse on board for some reason.
I may try this iTerm beta just to see if it fixed it.