Simply, like many, I have bash set up to record commands in .bash_history. I have it flush all the time, so .bash_history is always current.
Next, I have a simple cron job running every minute. Then, I have a script called "findcmd" that simply greps my .unique_bash file.
#!/usr/bin/env bash
# findcmd: filter ~/.unique_bash through one grep per argument
cmd=""
for var in "$@"
do
    cmd="| grep \"$var\" $cmd"
done
cmd="cat ~/.unique_bash $cmd"
eval "$cmd"
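The cron job itself isn't shown; a minimal sketch of what it might run every minute, assuming its purpose is to dedup .bash_history into .unique_bash while preserving order (the filename is the author's, the command is my guess):

# crontab entry: keep only the first occurrence of each command
* * * * * awk '!seen[$0]++' "$HOME/.bash_history" > "$HOME/.unique_bash"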
In the end, when I need to figure something out, I head to the internet. Those commands are then captured for posterity in the .unique_bash file. If I want to know how to post a JSON file to an endpoint using curl, then:
$ findcmd curl json
And all those curl commands show up.
It won't let me do something I don't know, but it makes my memory much, much longer.
And, yeah, I admit there have been instances where I've completely forgotten the command and had to head back to the web. But when I've done that, I can hit my history and see how I used it.
If a "bad command" sometimes lingers, a command done wrong or too many copies of the same thing, I can go edit it out. But most of the time I don't bother.
You might enjoy the fzf (fuzzy finder) tool configured for your shell's history search keybinding (most likely Ctrl+R). I can't elaborate at the moment, but essentially you type Ctrl+R, start typing roughly what you want and fzf searches your shell history based on what you typed. It works extremely well for me. Based on what you described, fzf Ctrl+R should be your workflow on steroids.
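A minimal sketch of enabling it, assuming a recent fzf (0.48+), which supports the --bash flag; older installs source the key-bindings.bash file shipped with the package instead:

# in ~/.bashrc: wire fzf into Ctrl+R history search and other keybindings
eval "$(fzf --bash)"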
Nice work! I wanted to add how I deal with this.
First, my .bashrc is set up to append to the history file, and history is set to 100,000 lines.
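A minimal sketch of what those .bashrc lines look like (only the 100,000 figure is from the comment; the rest is the standard way to get this behavior):

shopt -s histappend      # append to .bash_history rather than overwrite it
HISTSIZE=100000          # commands kept in memory
HISTFILESIZE=100000      # lines kept in .bash_history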
Most importantly, I have an alias set up for the letter h that greps .bash_history case-insensitively (grep -i $1 .bash_history).
So typing at the CLI:
# h awk
gives me all my awk commands.
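Since bash aliases don't take positional parameters like $1, this really has to be a shell function; a minimal sketch:

# "alias" h as a function so it can take an argument
h() { grep -i "$1" ~/.bash_history; }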
And second, as a global history search across all my VMs/instances: I already make extensive use of Splunk and its universal forwarder log-forwarding app/tool, so every VM's forwarder service is set to send any updates to .bash_history to my central Splunk server. That way I can search the bash history of any VM, going back forever (via the Splunk web GUI, I mean).
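A minimal sketch of the universal forwarder stanza this implies (the inputs.conf path, index, and sourcetype names are assumptions):

[monitor:///home/*/.bash_history]
index = bash_history
sourcetype = bash_history
disabled = false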
All works great!
Thank you! You just improved my workflow, and that does not happen every day. A spin I think I might add is to write every command that exits without error to a separate history file, .bash_history_no_error or something like that.
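A minimal sketch of one way to do that, using the exit status as a proxy for "no error" (a command can still print to stderr and exit 0); the function name and file are hypothetical:

# runs before each prompt; logs the previous command if it exited 0
log_if_ok() {
  local status=$? last
  last=$(history 1 | sed 's/^ *[0-9]* *//')
  # skip repeats so an empty Enter doesn't re-log the same command
  if [ "$status" -eq 0 ] && [ "$last" != "$__last_logged" ]; then
    printf '%s\n' "$last" >> ~/.bash_history_no_error
    __last_logged=$last
  fi
}
PROMPT_COMMAND=log_if_ok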
(Unrelated to original post) That only helps if you know the keywords curl and json. My actual problem in the past has repeatedly been that I a) forget what the graphical file manager on my system is called because I use it a couple of times per year (dolphin, konqueror, thunar, whatever...) and b) I use i3 or xmonad and don't have a "start menu" to just browse.
I think I should just put a symlink called explorer.exe in my ~/bin/ and then I know where to look ;)
Anytime I sign into a system I have not signed into for a long time, the first thing I do is save the history file before I start working. Many times it has saved the day when I get stuck: where did I store that config? What was the command to restart the server?
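A minimal sketch of that kind of snapshot (the naming scheme is hypothetical):

# snapshot the history before this session can truncate or rotate it
cp ~/.bash_history ~/.bash_history.$(date +%Y%m%d)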