You can get any parameter of the previous command by typing "Esc, <number>, Esc, .". If you press "Esc, ." repeatedly, you get the n-th parameter from the command before the previous one, and so on.
It's actually Meta, emulated by an ESC character by the terminal. OSX tip: check "use Option as Meta" in Terminal's preferences, and you can keep Option pressed while you mash ".".
I use `!:p` to the same effect, which has the benefit that you can insert a history number between the bang and the colon. (Set PS1 to include the command number if you do this a lot.)
I use this all the time. In fact it was one of the first aliases I made when I switched to a UNIX system. Though I named mine `greph` (for grep history). Absolutely invaluable.
It always feels more logical to me to just type "Ctrl-P Ctrl-A sudo" (previous history line, beginning of line, add missing sudo) than typing "sudo !!" or "sudo !!<tab>" if you need to edit the substitution.
Same thing for the last argument, it's easily edited if you know the right shortcuts: Ctrl-P Alt-B Ctrl-U. (Works for bash, for zsh you need http://stackoverflow.com/q/3483604/414272). That said, I didn't know about Alt-.
Bash substitutions are done prior to execution so your history stays intact. Older open source projects are just amazing for how complete their feature set is.
Back in the late 90s when I first moved to Debian from Windows, growing up with a DOS CLI made the transition far easier to me (and having some FTP exposure didn't hurt either).
There's no good way to refer to the entire DOS/Windows command shell lineage without confusion, but that's what I meant. I don't technically use DOS at all anymore. I think "that system" is easier to understand in its entirety and find your way around than Unix, although it has been getting progressively more complex for years. I actually use OS X most of the time now (plus Linux on servers and Windows for 3D stuff).
> I think [DOS] is easier to understand in its entirety and find your way around than Unix
Yikes, I recently had to do a bit of development on a windows box and I found the command-line tools (not to mention CMD itself, which seems to have stopped development in 1993) to be absolutely awful. I could barely survive without Cygwin, git bash, etc giving me some semblance of a functional shell setup. I guess it's different strokes for different folks.
I agree: awful tools, and the batch language is horrible (and largely non-portable between versions). Getting complicated stuff to work can be tricky due to the lack of essential commands. For instance, there's no way to reliably get an ISO-formatted datestamp without a third-party utility (although I think PowerShell supports this). Lots of odd little things like that simply don't work.
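For contrast, an ISO 8601 timestamp is a one-liner in any Unix-ish shell (both GNU and BSD `date` support these format specifiers):

```shell
# Print the current time as an ISO 8601 UTC timestamp,
# e.g. 2015-01-07T12:34:56Z
date -u +"%Y-%m-%dT%H:%M:%SZ"
```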
But the filesystem layout and set of built-in commands is pretty easy to understand fully. It's generally pretty easy to find things. In that sense Unix is more complex (but also more sensible, usually).
> But the filesystem layout and set of built-in commands is pretty easy to understand fully. It's generally pretty easy to find things.
On the face of it, the basic directories are straightforward (Programs -> \Progra~1, User files -> \Docume~1, System files -> \Window~1, Very System files -> \Window~1\System32), but I don't recall having an easy time finding things (speaking from memories of my XP days).
Where would you look for a config file of some application? Maybe it'll be in \Progra~1. Or $user\LocalSettings. Or ApplicationData. Maybe in the Registry? (Don't get me started about the registry!)
Where would you look for log files? Does the Event Viewer show everything nowadays, or do you still have to hunt for logfiles in the same fashion as config files?
On *nix, I know that my configs are under /etc/ (global) and ~/.{program name}/ (user-specific), and my logs are under /var/log/. I don't know what every single directory is, but I know where to look for things when I need to.
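A quick sanity check of those conventions (they are just conventions, and the details vary a little by distro):

```shell
# Global configs and logs live in well-known places:
ls -d /etc /var/log
# User-specific configs are dotfiles and dot-directories in $HOME:
ls -d "$HOME"/.* 2>/dev/null | head -n 5
```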
What really gets me is that MS almost fixed that with PowerShell -- and then made the default policies so restrictive that you have to jump through hoops to actually be able to use the thing. I do get that it's "really powerful, and let's not create another wave of Visual Basic script viruses" -- but the UI is just horrible. Maybe it's better in 8, but last I checked you had to go into the policy editor and choose between "safe - no PowerShell for you", "safe-ish - only signed PowerShell (still no PowerShell for you)", and "open season on shooting your feet off: PowerShell always, everywhere, all the time".
Don't get me wrong, I still prefer living with bash and a full GNU system -- it's just sad what MS did (or didn't do) with their command-line tools from 2003 or so onwards.
"Contains locally installed software. Originated in System V, which has a package manager that installs software to this directory (one subdirectory per package)."
Solaris made extensive use of /opt, that flowed into some Linux packagers. It then fell out of favor as everyone put everything in /usr
You know what I hate? My history isn't aggregated in real time across all my Mac OS X terminal windows.
You know what else I hate? Typing in long commands in the Mac OS X terminal and then them wrapping weirdly. Especially when I hit the up arrow to go back in my terminal history.
    history -a  # append history lines from this session to the history file
    # the history file may contain history from other terminals not in this one, so:
    history -c  # clear the in-memory history list, deleting all of the entries
    history -r  # read the history file and append its contents to the history list

I've heard that -n can be problematic, which is why -c then -r is used.
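The usual way to wire those together is PROMPT_COMMAND; a sketch for ~/.bashrc (assumes bash -- histappend is optional, but stops sessions clobbering each other's history on exit):

```shell
# Merge history across terminals at every prompt:
shopt -s histappend    # append to the history file rather than overwrite it
export PROMPT_COMMAND='history -a; history -c; history -r'
```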
> You know what else I hate? Typing in long commands in the Mac OS X terminal and then them wrapping weirdly.
Yeah, Ctrl-x Ctrl-e will open the current line in $EDITOR; when you save and quit, the edited command is executed. Really a killer feature.
In Linux with bash (and probably other shells that are bash-compatible), you can use vi to edit any of the commands in your command history, and then execute the edited version.
Do this at the command prompt, or once in your ~/.bash_profile to make it permanent:
set -o vi
After that, you can search for any of the commands in your history, edit it, and then execute the edited command, by doing this:
At the prompt, type Esc once to get into vi command mode. Then you can press the k key repeatedly to scroll up through the command history, or (often easier) use the ? (search backward) vi command to search for a pattern to find a specific command. Once found, press v to edit it in a temp file. Then when you save and quit, the edited command gets executed.
The same technique works with emacs as the editor instead of vi, if you don't give the 'set -o vi' command, because the default editor for command line history is emacs. Also, if you have run 'set -o vi', you can switch the editor for commands back to emacs with 'set -o emacs'.
The 'set -o <editor>' bit sets the readline editing environment to be similar to vi. It can be set to emacs.
C-x C-e (edit-and-execute-command) invokes the editor specified by $VISUAL, $EDITOR, or emacs, in that order. You could set it to Scrivener if you wanted to (though I'm not sure that would necessarily work on exit).
I've tested with VISUAL set to nedit, from which I then changed it to uptime. Now I get loadavg when I want to edit my command line ;-)
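The VISUAL/EDITOR fallback can be sketched with plain parameter expansion (a toy illustration of the lookup order, not readline's actual code):

```shell
# readline's edit-and-execute falls back VISUAL -> EDITOR -> emacs:
editor="${VISUAL:-${EDITOR:-emacs}}"
echo "would edit the command line with: $editor"
```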
Yes, that is also a solution. However, it isn't full vim, so if you're used to vim it might feel limited without certain commands. You're also missing any plugins.
    function fancyPrompt {
        local bgBlue="\[\033[48;5;31m\]"
        local fgBlue="\[\033[38;5;31m\]"
        local fgWhite="\[\033[38;5;231m\]"
        local bgDarkBlue="\[\033[48;5;24m\]"
        local fgDarkBlue="\[\033[38;5;24m\]"
        local bgDarkGray="\[\033[48;5;237m\]"
        local bgLightGray="\[\033[48;5;245m\]"
        local fgLightGray="\[\033[38;5;245m\]"
        local colorClear="\[\033[0m\]"   # note the closing \] -- without it bash miscounts the prompt width
        local branch
        local branch_symbol="\[\] "
        if branch=$( { git symbolic-ref --quiet HEAD || git rev-parse --short HEAD; } 2>/dev/null ); then
            branch=${branch##*/}
            export PS1="${bgBlue}${fgWhite}\h${colorClear}${fgBlue}${bgDarkBlue}\[\] ${fgWhite}\w${bgLightGray}${fgDarkBlue}\[\] ${fgWhite}${branch_symbol}${branch}${fgLightGray}${bgDarkGray}\[\] ${colorClear}"
        else
            export PS1="${bgBlue}${fgWhite}\h${colorClear}${fgBlue}${bgDarkBlue}\[\] ${fgWhite}\w${bgDarkGray}${fgDarkBlue}\[\] ${colorClear}"
        fi
    }
That's quite pretty. How do you get the triangles?
I've been loving a prompt which color-codes git branches (which, if I understand right, would be built in if I used zsh instead of bash) [0], though I have to edit the last line in order to get my history appending working as well.
> Typing in long commands in the Mac OS X terminal and then them wrapping weirdly.
For all the people who don't use Mac OS X:
This happens because your PS1 is set wrongly and bash can't correctly calculate how much of the line is left. Try it out by going back to the default with no colors and crap, and see how long it goes.
For Mac OS X users: the above won't help, don't even try it.
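The usual culprit is color escapes that aren't marked as zero-width. A minimal sketch (the green color code here is just an example):

```shell
# BAD: bash counts the escape bytes as visible characters,
# so long lines wrap in the wrong column:
PS1="\e[32m\u@\h\e[0m \$ "
# GOOD: wrap non-printing sequences in \[ ... \] so readline
# knows they take up no width:
PS1="\[\e[32m\]\u@\h\[\e[0m\] \$ "
```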
I prefer to keep mine separate, but I always use specific terminals / screen sessions for (mostly) the same type of work (admin, specific projects, web dev, etc.) so each have their own history, as well as keeping a per-directory record.
So far over 130000 command lines (with timestamp & cwd) for past 2.5 years, just on my laptop.
> You know what I hate? My history isn't aggregated in real time across all my Mac OS X terminal windows.
I used to feel this way. Then one day I figured out how to enable this in Bash. Resulted in a confusing mess. Turns out you most likely want terminal sessions to be distinct until you end them.
sh, the Bourne shell mode of pdksh? Isn't that a little too limited? pdksh (/bin/ksh) though is great, yes.
I'm in general a big fan of the OpenBSD userland: straightforward and very, very well-documented. Real manpages, no religious "The full documentation for xyz is maintained as a TeXinfo manual" crap.
A well-conceived featureset and good docs are key to properly learning a platform. OpenBSD taught me a lot about Unix. If you later realize that Bash has some feature that you actually need, you can still install it.
Also worth noting that the native Bourne shell was always slightly slower than the Korn shell. It's been a long time since I checked that, but I did on a few systems and found it to be so. Then again, csh was also around then -- a distant memory for many now.
Also, anything that runs script-wise on Bourne will run as-is under Korn.
Though you may well find that which shell a user picks is historical. People who started or worked a lot with Sun systems will be csh fans. Old vets are more inclined to Bourne, though very few remain. As for the Korn shell, that is mostly down to AIX systems, as it is the default on them. Then for Linux you will find a bias towards bash.
At least that is what I have observed.
Though history does give us some interesting trends, and the awk/perl/python transition and preference will also mostly come down to when somebody got into Unix as a whole. Again: old school, awk. Old, perl. And not so long ago, the python brigade.
But that is just a rule of thumb, and more helpful in explaining why there is a solo perl script when everything else is done in xyzzy.
As for OpenBSD, good choice. I prefer it due to the filesystem layout, more akin to old-school Unix, unlike Linux, which is more often than not `creative` in its choices, so it feels less like home.
Still, back in the early days we had the AT&T and Berkeley BSD flavours, and with that the ps command. Oh, the fun and games.
But at least things are a little more common across flavours than before, and yet each still has its own quirks.
Yes that's the one. It's perfectly fine. I found that if you have to push it past the limit it's probably the wrong tool for the job and it's time to use Python or Perl or pick something off the shelf and think about more structure to whatever you're doing.
Shell scripts really don't scale development-wise over time.
Definitely agree with the rest of the points though.
No! That's the emacs keybinding: bash uses emacs mode by default, and in it Ctrl-r searches your command history. All the other basic emacs commands work too: Ctrl-p, Ctrl-n, Ctrl-f etc...
If you are a vim user, and especially if you love venting how much you hate emacs, then please add this to your bashrc:

    set -o vi
The -n flag tells ssh to only make a connection, without ever running a shell. This means it even works when you don't have shell access (say, with a command= entry in .authorized_keys).
Also, I cannot recommend envoy enough in lieu of ssh-agent, if you're not using a GUI ssh agent already (e.g OSX keychain or GNOME keyring)
Last, I guarantee you don't want "$@" in those aliases, but either "$*" or $@ (certainly the latter).
I think you meant -N, not -n. From the man page[1]:
     -N      Do not execute a remote command.  This is useful for just
             forwarding ports (protocol version 2 only).

     -n      Redirects stdin from /dev/null (actually, prevents reading
             from stdin).  This must be used when ssh is run in the
             background.
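For the plain port-forwarding case, that usually looks something like this (the host and ports here are made up):

```shell
# Forward local port 8080 to port 80 on the remote side,
# running no remote command (-N) and not reading stdin (-n):
ssh -n -N -L 8080:localhost:80 user@example.com
```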
I usually visit these threads knowing that I will pick up something useful to add to the toolbox... thank you for 'envoy' - I think it will streamline my ssh key management.
I wonder if I can convince the author of this to put it up in a Git repo so everyone can contribute. It probably needs the tiny URLs removed (IME, they're more likely to break than the original site)
I love fish and have happily been using it for years. Every time it comes up here, though, someone inevitably will complain about compatibility - and I will admit that RVM, for instance, has definitely caused me problems with fish in the past. I guess it depends quite a lot on your particular usage and requirements.
I never got RVM working with fish; unsurprising, seeing as it’s 20k lines of bash. rbenv works well though (with one additional conf line), and chruby was working on support last time I checked.
I started using RVM when it was the only (possibly well-known?) game in town, and never switched until I started using fish and found that it didn’t work.
"- Compile your own version of 'screen' from the git sources. Most versions have a slow scrolling on a vertical split or even no vertical split at all"
I suggest bashmarks http://www.huyng.com/projects/bashmarks/ to bookmark directories and jump there with a quick command.
I also like to alias ..="cd .." and alias ...="cd ../.." to quickly navigate up.
It doesn't mention either that you can search to the other direction by pressing ctrl-s (usually it also sends the stop signal which should be removed by "stty -ixon"). Very useful if you go past the entry you are looking for.
I use C-r constantly. But sometimes I need to find a command, and then edit it before running it. What is the proper way to exit "search mode" and go into the normal "edit" mode?
ISTR pressing one of those keys would obliterate the part of my line where I was writing, putting in ^B characters.
I can't replicate it now on my test box; it works exactly the way you say it should. It may be a terminal issue; I'm currently on Windows using cygwin to ssh to Linux.
Ctrl-p in emacs takes you to the previous line. Since readline uses emacs-style bindings by default, Ctrl-p takes you to the previous command in history (and Ctrl-n to the next). I don't know about zsh though.
Also, other emacs keys work like back: ctrl-b, meta-b, forward: ctrl-f, meta-f, start of line: ctrl-a, end of line: ctrl-e etc.
A really handy one a colleague showed me yesterday was the /dev/fd/0|1|2 files, which are stdin/out/err respectively.
Means you can use that file for utils that expect a file only.
E.g. echo "This would be contents of file" | someCommand /dev/fd/0
You can even access the file descriptors of other processes with /proc/${pid}/fd/; which is handy for (say) un-deleting a file that a process still has open.
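A toy demonstration of that recovery trick (Linux-only, since it relies on /proc; fd 7 is arbitrary). $$ refers to the shell holding the fd, so run this as a script or paste it directly into a shell, not inside a command substitution:

```shell
# Hold a deleted file open and read it back through /proc --
# the data survives as long as some process keeps the fd open.
tmp=$(mktemp)
echo "precious data" > "$tmp"
exec 7< "$tmp"        # keep fd 7 open on the file
rm "$tmp"             # unlinked, but still reachable via the fd
cat /proc/$$/fd/7     # prints: precious data
exec 7<&-             # close it; now the file is really gone
```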
Another handy one is "process substitution" which lets you do the same thing with multiple streams by creating new file descriptors, and automatically turning them into /dev/fd/* paths:
someCommand <(echo "contents of file 1") <(echo "contents of file 2")
which is somewhat explained by:
$ echo someCommand <(echo "contents of file 1") <(echo "contents of file 2")
someCommand /dev/fd/63 /dev/fd/62
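A concrete, runnable illustration (bash; diff stands in for someCommand -- it just sees two /dev/fd/* paths):

```shell
# Compare the output of two commands without any temp files;
# diff exits non-zero here because the inputs differ, as expected.
diff <(printf 'a\nb\n') <(printf 'a\nc\n')
```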
"process substitution" is one of the new useful things I learned this year (switched to bash from csh over 20 years ago).
/proc/${pid}/fd/ is very useful for forensic purposes when you find some malware running that deleted itself and/or config files but still has a handle open to them.
Use dcfldd instead of dd if you want to see how far your dd operation is progressing...
dcfldd if=[infile] of=[outfile] sizeprobe=if
So. Much. Better.
dcfldd also has many other useful options - read the man pages - and note that you'll have to install it from your linux distro's package repo beforehand.
You can also send SIGUSR1 to dd for progress info. From the manpage:
Sending a USR1 signal to a running 'dd' process makes it print I/O statistics to standard error and then resume copying.
$ dd if=/dev/zero of=/dev/null& pid=$!
$ kill -USR1 $pid; sleep 1; kill $pid
    18335302+0 records in
    18335302+0 records out
    9387674624 bytes (9.4 GB) copied, 34.6279 seconds, 271 MB/s
>Don't know where to start? SMB is usually better than NFS for most cases.
Not sure why he feels SMB is better, because in my experience, NFS usually works more smoothly and is easier to setup than SMB. But then, I don't usually use windows.
One that I needed to use a lot in the old days, and still occasionally need:
If your font is all messed up, echo C-v C-o. C-v puts you into a literal mode, and C-o (shift-in) gets you back into the normal character set. (You probably got there by emitting a C-n, shift-out, at some point.)
The only problem with -W is that it is not available everywhere yet: RHEL6 defaults to OpenSSH 5.3, while -W was introduced in 5.4. Thankfully EL7 comes with OpenSSH 6.4 (and a kernel newer than 2.6.32!).
Speaking of man pages, in earlier versions of Unix and Linux, I used to want to redirect the output of many man commands to files for later reading, e.g. if working on C, say I wanted to read 'man ioctl', 'man stdio', 'man signal', etc. But the man output had {n|t}roff formatting characters in it, for print output, which used to mess up vi. So I used to use this script I wrote, called m:
    # Put this script in a directory in your PATH.
    # m: pipe man output through col and save to file(s).
    mkdir -p ~/man   # only creates the dir the first time, including parent dirs
    for name         # in "$@" is implied
    do
        man "$name" | col -bx > ~/man/"$name".m
    done
Do a "chmod u+x m" to be able to run it.
Then run it like this:
m ioctl stdio signal # or any other commands
Then:
pushd ~/man; view ioctl.m stdio.m signal.m; popd
to read those pages, now stripped of formatting characters.
zsh has a simple builtin `r` (which is just an alias for `fc -e -`). It lets you edit the previous command with search and replace. I use it for brew and git a lot.
Very handy if you do something silly like cat a binary file without piping it into strings -a
This restores the terminal settings and your sanity. Note you will often find yourself typing without seeing anything, at least until you hit Enter and execute it.
Also advised usage: hit Ctrl-C a couple of times to clear anything queued up in the input buffer, then:

    stty sane

All good from here onwards, and a good time to play with od and remember the strings -a pipe next time.
In bash, "ESC" then "." fetches the last parameter of the previous command. It's invaluable (same as typing "!$" but you get to see and edit it)