It seems a bit silly for nearly all of the scripts to first ping Google before doing anything: each one checks whether the internet is available (by GETing google.com) before doing anything else. The check is performed with nc, even though the scripts already depend on wget, curl, or fetch for the actual HTTP requests.
I'd just drop the ping to Google entirely. If wget/curl/fetch fail, so be it. No need to introduce additional slowness for the small chance internet isn't working.
In addition, using "nc" to ping Google is a bad idea: it breaks the scripts in every environment that requires an HTTP proxy. You already have wget/curl, which honor the HTTP_PROXY environment variable. Just use that ...
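Something along these lines would respect a proxy (a sketch, not the repo's actual code; the function name is just illustrative):

    # Sketch: reuse curl for the connectivity check instead of a raw nc probe,
    # so HTTP_PROXY/HTTPS_PROXY are honored.
    check_internet() {
        curl --silent --head --fail --max-time 5 "https://www.google.com" > /dev/null
    }

    if ! check_internet; then
        echo "Error: no internet connection (or misconfigured proxy)" >&2
        exit 1
    fi

(Or better, as said above, skip the check entirely and just let the real request fail.)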
Wow, qrify is pretty clever! Aren't there multiple encodings? I wonder how easily this could be used to generate the densest encoding?
EDIT: ...oh, it sends your data to a public service, which is not what I would prefer. But the cleverness of the service (rendering the pixels via ASCII) seems like it would make a great local utility.
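If you have qrencode installed, it can already do that locally; its terminal output types draw the code with block characters (assuming a reasonably recent qrencode build):

    # Render a QR code directly in the terminal, no external service needed.
    qrencode -t ANSIUTF8 "https://example.com"
    # Or write a PNG instead:
    qrencode -o example.png "https://example.com"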
I never really thought about it. I had to put up with c-shells and k-shells and then bash. Anything in particular that made it better? What superpower or feature would you lose if you went back to bash from this z-shell you speak of?
It's a million little things. Tab completion is better in every way. It's not that bash doesn't have the feature; it's just better in zsh.
But the real magic is zsh + oh-my-zsh. oh-my-zsh is a set of curated aliases and other scripts for improving your experience with a lot of different tools.
For example, I use all these extensions:
plugins=(cargo docker git heroku mvn rust)
The git aliases in particular are really nice. For example, gpsup is:
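    alias gpsup='git push --set-upstream origin $(git_current_branch)'

(That's roughly how oh-my-zsh's git plugin defines it; git_current_branch is a helper the plugin provides.)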
Which is really handy. On top of that, I really like the support for customizable prompts (PS1). I use powerlevel9k, customized to show me how long the last command took to run, the current git status, the current time, etc.
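The configuration is just a couple of arrays in .zshrc; something along these lines (segment names from powerlevel9k, adjust to taste):

    # Left side: current directory and git status; right side: exit status,
    # how long the last command took, and the clock.
    POWERLEVEL9K_LEFT_PROMPT_ELEMENTS=(dir vcs)
    POWERLEVEL9K_RIGHT_PROMPT_ELEMENTS=(status command_execution_time time)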
Metaquestion -- I'm interested in how the poster came across this. I've been seeing things like this pop up in my Google News feed for the past month or so -- including this specific article more than once.
Almost certainly yes. If it's not, then this person is very unlucky. It's listed on Urban Dictionary and used in http://www.adequacy.org/stories/2001.12.2.42056.2147.html, where it says Linux is a virus. (Ironically, a translation of that article was my first exposure to Linux ever, back when I was a gullible teen. I clearly remember the 'LILO' part, since by the time I was starting to get into it Linux was already using GRUB.)
(Note: it's mostly an educational tool; for example, you could enable an option to make climate print the actual command before executing it, so that you can learn your way around the shell effectively.)
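Something like this is all it takes (purely a sketch of the idea, not climate's actual code; the variable name is hypothetical):

    # Hypothetical wrapper: print the underlying command before running it
    # when an environment variable is set, so users can learn the real syntax.
    run() {
        [ -n "$SHOW_COMMAND" ] && printf '+ %s\n' "$*"
        "$@"
    }

    # e.g. a "disk usage" subcommand might call:
    # run du -sh .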
If there are some things that are really useful, you can quickly call them out in the readme, but even if you don't, people will browse and figure out how to use them anyway if they want to. There are plenty of projects with poor or entirely absent documentation that still see use.
In some ways, I could see how going to the effort of writing documentation might just make things worse, because then people might expect you to keep working on it or respond to filed issues. Anyway, I thought the more common excuses were "eh, this isn't that useful, no one will look at it, let alone use it", or "I'd have to go through everything and make sure I didn't leave my password in a comment or something stupid", or maybe the most common, "there's a lot of hacky code, I don't want random people / a potential future employer to see this and think I suck at coding..."
I once built a kind of framework to manage my Bash functions as packages (cf. http://github.com/lolive/shinyshell). Even the documentation of functions and packages was tool-assisted (à la Javadoc). But documenting is a chicken-and-egg problem: it requires an outsider's point of view, which is difficult to have while you are in the development phase and everything is still in your head.
Nevertheless, I agree that documentation is a key part of a project's success.
Some tools are for people who live in the terminal but want to do things outside of computers: weather, currency, stocks, et al. They may not be relevant to everyone.
But there are some computing-related tools that I'd recommend everyone try at least once:
1) cheat. Total lifesaver. I used to back up my shell histories, but I no longer do so since most commands I use have an example there (see the snippet after this list).
2) qrify looks good; not sure how often I'll use it.
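For example (assuming cheat here is the Bash-Snippets wrapper around cheat.sh; you can also query the service directly with curl):

    cheat tar           # common tar invocations, with examples
    curl cheat.sh/tar   # the same data, straight from the service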
About crypt: I'd suggest installing openssl and using the tools it provides. Crypto is hard to get right. That's not meant to dissuade anyone from trying to build their own, but as an end user, always use something widely used.
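For instance, symmetric file encryption with openssl's enc command looks roughly like this (the -pbkdf2 flag needs OpenSSL 1.1.1 or newer):

    # Encrypt with a password-derived AES-256 key, then decrypt again.
    openssl enc -aes-256-cbc -salt -pbkdf2 -in secrets.txt -out secrets.txt.enc
    openssl enc -d -aes-256-cbc -pbkdf2 -in secrets.txt.enc -out secrets.txt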