I frequently set up automatic backups of network appliance configurations, route tables, and traceroute results. This is useful for diagnosing sudden problems, as well as taking before and after snapshots when making changes.
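A minimal sketch of that kind of snapshot job (the hostname, login, show commands, and paths are placeholders; the exact commands depend on the appliance vendor):

    #!/bin/sh
    # Nightly snapshot of appliance state: config, route table, traceroutes.
    # Run from cron, e.g.: 0 2 * * * /usr/local/bin/net-snapshot.sh
    DEST=/var/backups/net/$(date +%Y-%m-%d)
    mkdir -p "$DEST"

    # Pull state from a hypothetical appliance over SSH.
    ssh backup@router1 'show running-config' > "$DEST/router1-config.txt"
    ssh backup@router1 'show ip route'       > "$DEST/router1-routes.txt"

    # Traceroutes to a few landmarks, handy for before/after comparisons.
    for host in 192.0.2.1 example.com; do
        traceroute "$host" > "$DEST/traceroute-$host.txt" 2>&1
    done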
I collect interesting data, such as kernel version and versions of various services, from hosted web servers my clients run.
I collect data at regular intervals when investigating a sporadic, chronic trouble event to help me identify correlations and test hypotheses.
I collect BGP and traceroute data at regular intervals from route servers to identify important changes and trends in Internet routing for specific prefixes.
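The periodic-collection part can be a plain timestamped cron job; a rough sketch, where the route server name, prefix, and login handling are placeholders (public route servers differ in how you log in and which show commands they accept):

    #!/bin/sh
    # Hourly routing snapshot for a prefix of interest; diff the files later
    # to spot changes and trends.
    PREFIX=203.0.113.0/24
    OUT=/var/log/routing/$(date +%Y%m%dT%H%M)
    mkdir -p "$OUT"

    # Crude non-interactive telnet session against a route server.
    { sleep 2; printf 'show ip bgp %s\n' "$PREFIX"; sleep 5; printf 'exit\n'; } \
        | telnet route-server.example.net > "$OUT/bgp.txt" 2>&1

    traceroute 203.0.113.1 > "$OUT/traceroute.txt" 2>&1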
I used to collect IDS signatures in a git repo to track updates, but I eventually moved this to GitLab.
I regularly run a watchdog script to check for, alert on, and attempt to mitigate low disk space and other such conditions.
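A stripped-down version of such a watchdog, assuming a 90% threshold on the root filesystem and local mail for alerting (the cleanup step is just one example of a mitigation):

    #!/bin/sh
    # Disk-space watchdog: alert, then attempt a best-effort cleanup.
    THRESHOLD=90
    USAGE=$(df -P / | awk 'NR==2 { gsub(/%/, ""); print $5 }')

    if [ "$USAGE" -ge "$THRESHOLD" ]; then
        printf 'Disk usage on %s is %s%%\n' "$(hostname)" "$USAGE" \
            | mail -s 'Low disk space' ops@example.com
        # Mitigation example: drop compressed logs older than a week.
        find /var/log -name '*.gz' -mtime +7 -delete
    fi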
Hope this helps!
2. I make a run.sh script in every project that I add to my git exclude, so whether it's a Go app, Python app, Node app, makefiles, Docker, ..., no matter what, I can go into any project and run this script to start it. Then I add an alias in my bash profile so I just type "r" to run ./run.sh. For more complicated apps, I generalize this to have a build.sh, deploy.sh, etc.
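The moving parts here are tiny; something like this, where the run.sh body is just an example for a Go project:

    # ~/.bash_profile
    alias r='./run.sh'

    # run.sh in a Go project, kept out of the repo via .git/info/exclude:
    go build -o app . && ./app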
I wonder if you could take this further by having your command aliases figure out what sort of project you're in and either run the nearest 'generic' way of performing that command for that type of project, or even generate a skeleton file and open $EDITOR so you can adapt it to the project you're in.
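A sketch of what that could look like as a shell function; the per-project commands are just plausible defaults, and the skeleton-generating fallback is the speculative part:

    # Hypothetical "r" function: pick a sensible default per project type,
    # falling back to generating a skeleton run.sh and opening $EDITOR.
    r() {
        if [ -x ./run.sh ]; then
            ./run.sh
        elif [ -f Makefile ]; then
            make
        elif [ -f package.json ]; then
            npm start
        elif [ -f go.mod ]; then
            go run .
        elif [ -f Cargo.toml ]; then
            cargo run
        else
            printf '#!/bin/sh\n# TODO: start this project\n' > run.sh
            chmod +x run.sh
            "${EDITOR:-vi}" run.sh
        fi
    }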
make -f Makefile2 build ...
So I wrote two shell scripts (one for each CMS) that could be easily customized for each site. I ran it by hand whenever we did a deployment, and also set it to run every hour in crontab. That way even if someone changed the permissions away from safe (by accident or maliciously), they would revert to safe pretty quickly.
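A typical permission-reset script of this kind is little more than the following; the docroot, owner, and modes are placeholders, and which directories must stay writable depends on the CMS:

    #!/bin/sh
    # Reset a CMS docroot to safe permissions; run hourly from cron:
    # 0 * * * * /usr/local/bin/fix-perms.sh
    DOCROOT=/var/www/example.com
    chown -R www-data:www-data "$DOCROOT"
    find "$DOCROOT" -type d -exec chmod 755 {} +
    find "$DOCROOT" -type f -exec chmod 644 {} +
    # Some directories (e.g. an uploads folder) may need to stay writable
    # by the web server; loosen those explicitly after the reset.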
I've since moved all those sites to WP Engine and Pantheon, so now the directory permissions are someone else's problem. :-)
The relay is attached to a 500 W heater and four old computer fans blowing the hot air at my feet.
/usr/bin/gcalcli --calendar '[my calendar]' --detail_location remind 10 '/path/to/alertscript.sh "%s"'
where alertscript.sh just calls kdialog with the message describing the meeting.
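The alert script itself can be a one-liner; a sketch of what's described, assuming gcalcli substitutes the event description for the "%s" in the command above and passes it as the first argument:

    #!/bin/sh
    # alertscript.sh - pop up a KDE dialog with the meeting description in $1.
    kdialog --title 'Meeting in 10 minutes' --msgbox "$1"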
bank csv to sqlite: stores all of my banking transactions in sqlite.
high-tide-calendar-beach: Uses XTide to create a csv of high/low tide times for a local beach and then converts it to an ics calendar that gets copied to a server where my calendar apps read it. (The local beach is inaccessible at high tide.)
XTide is a pretty cool concept. So, I'm linking it here.
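The original script isn't shown, but the CSV-to-calendar step can be done with plain awk; a rough sketch assuming XTide's output has been massaged into lines like 2024-06-01,04:12,High (the CSV layout, file names, and server path are all assumptions):

    #!/bin/sh
    # Convert tide predictions (date,time,type per line) into a minimal .ics
    # file and push it to the web server the calendar apps read from.
    IN=tides.csv
    OUT=tides.ics

    {
      printf 'BEGIN:VCALENDAR\r\nVERSION:2.0\r\nPRODID:-//tides//EN\r\n'
      awk -F, '{
          d = $1; t = $2
          gsub(/-/, "", d); gsub(/:/, "", t)
          printf "BEGIN:VEVENT\r\nUID:tide-%d@example\r\nDTSTART:%sT%s00\r\nSUMMARY:%s tide\r\nEND:VEVENT\r\n", NR, d, t, $3
      }' "$IN"
      printf 'END:VCALENDAR\r\n'
    } > "$OUT"

    scp "$OUT" calendar-host:/var/www/calendars/beach.ics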
One example I deal with often is interacting with third-party systems. I had scripts to simulate them for ages, but I had to copy values from the UI/database to generate the correct files. Now the auto-complete scripts look up the values in the database directly, so I can usually just use tab completion.
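The scripts themselves aren't shown, but a completion function that shells out to the database looks roughly like this; the command name, table, and psql invocation are all invented for illustration:

    # Tab completion for a hypothetical "simulate-partner" script whose first
    # argument is an account ID; the candidates come straight from the database.
    _simulate_partner() {
        local cur=${COMP_WORDS[COMP_CWORD]}
        local ids
        ids=$(psql -At -c 'SELECT account_id FROM partner_accounts' mydb 2>/dev/null)
        COMPREPLY=( $(compgen -W "$ids" -- "$cur") )
    }
    complete -F _simulate_partner simulate-partner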
- Wrote a script that takes automatic backups from the server to an AWS S3 bucket and added it to cron (see the sketch below).
- Wrote an all-in-one script to install/configure/set up WordPress on a bare-bones VPS using a LEMP stack, along with a Let's Encrypt SSL certificate. Fully automated, but you need a domain pointing at the server for Let's Encrypt. Once you have a basic Ubuntu server provisioned, this script takes care of everything else.
- Playing with a simple script that sends basic server stats to an API endpoint (I know I could use monitoring tools but where is the fun in that :))
PM me for the code.
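Not the script being offered above, but the first item usually boils down to something like this sketch, with the bucket, paths, and schedule as placeholders:

    #!/bin/sh
    # Nightly backup of a directory to S3.
    # Cron entry: 0 3 * * * /usr/local/bin/backup-to-s3.sh
    TODAY=$(date +%Y-%m-%d)
    tar czf "/tmp/site-$TODAY.tar.gz" /var/www/html
    aws s3 cp "/tmp/site-$TODAY.tar.gz" "s3://my-backup-bucket/site/$TODAY.tar.gz"
    rm -f "/tmp/site-$TODAY.tar.gz"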
Here is the gist https://gist.github.com/christophstrasen/65461f921b5e8a8be23...
I received a new laptop for work this spring and was determined to make my downloads folder strictly temp storage, but loathed putting any sort of manual work into it periodically.
I have a small PowerShell script which moves all files in Downloads older than X days into a 'dumpster' folder (not straight to the Recycle Bin), and then moves all dumpster files older than Y days into the Recycle Bin.
Scheduled in Task Scheduler to run daily. And for the first time, I have a tame downloads folder.
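The PowerShell itself isn't shown; the same two-stage policy looks roughly like this on a Unix system (day counts and folder names are placeholders, and this version deletes outright instead of using a recycle bin):

    #!/bin/sh
    # Downloads older than X days go to a "dumpster"; dumpster entries older
    # than Y days are removed. Run daily from cron.
    X=14
    Y=30
    DOWNLOADS=$HOME/Downloads
    DUMPSTER=$HOME/.dumpster
    mkdir -p "$DUMPSTER"

    find "$DOWNLOADS" -mindepth 1 -maxdepth 1 -mtime +"$X" -exec mv {} "$DUMPSTER"/ \;
    find "$DUMPSTER" -mindepth 1 -maxdepth 1 -mtime +"$Y" -exec rm -rf {} +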
browser script for work (web stuff): auto fill for creating test accounts, indicating which server I'm on, injecting little shortcut links here and there.
With the help of the Windows utility ShareX, I can paste Markdown via a shortcut and get a shareable link to the HTML on my server back in the clipboard.
Not mine, but great: I saw someone keeping READMEs and TODOs in their project folders and then aggregating them.
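That aggregation step can be a few lines of shell; a sketch assuming projects live under ~/projects and carry TODO.md / README.md files:

    #!/bin/sh
    # Collect per-project notes into one overview file.
    OUT=$HOME/notes/overview.md
    : > "$OUT"
    for f in "$HOME"/projects/*/TODO.md "$HOME"/projects/*/README.md; do
        [ -f "$f" ] || continue
        printf '\n## %s\n\n' "$f" >> "$OUT"
        cat "$f" >> "$OUT"
    done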
This method put spaced repetition on autopilot and helped me learn more.
 - https://github.com/dnote/cli
- Script/bot to download my favourite songs, searching multiple sources given a song name as input.