Also, don't use bash for running scripts. You shouldn't anyway: in a sane environment /bin/sh is not bash. In Debian/Ubuntu it is dash, which is not vulnerable. Unfortunately, the Red Hat-derived distros do use bash as the default /bin/sh. In the BSDs it is a standards-compliant POSIX sh too. bash is for users, not scripts.
First, it encourages people to use bash-specific features that are not POSIX. Second, it is a huge, bloated piece of code; that's fine as a user interface, but scripts should use something more minimal, precisely to avoid this sort of issue.
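To illustrate the first point, here is a sketch of a common bashism next to a portable POSIX equivalent that works under dash, ash, and other minimal shells (the variable and pattern are made up for the example):

```shell
#!/bin/sh
# Bash-specific pattern test -- fails with a syntax error under dash/ash:
#   if [[ $name == foo* ]]; then echo "match"; fi
# Portable POSIX equivalent using case:
name="foobar"
case "$name" in
  foo*) echo "match" ;;      # prints "match"
  *)    echo "no match" ;;
esac
```

The case construct is specified by POSIX, so the script behaves the same regardless of which shell /bin/sh points at.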
/bin/sh is the shell invoked by system(3) and used by portable scripts bundled with packages. Those can't rely on anything beyond standard sh anyway, so having them run bash is overkill. "Don't use bash as your /bin/sh" is not the same as "don't use bash as your interactive shell" or "don't write bash scripts".
> In the BSDs it is a standards compliant posix sh too.
In FreeBSD, /bin/sh is an Almquist shell (ash) descendant, not bash. Bash behaves similarly when invoked as sh[0]: it's POSIX compliant with extensions. All of the shell's code is still there; some of it is just switched off by default, or has its behaviour modified, to be compliant.
I guess I'll answer one part. If you run this in your terminal:

  $ env x='() { :;}; echo vulnerable' bash -c "echo test"

and it prints "vulnerable", you certainly appear to be vulnerable.
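Relatedly, you can aim the same probe at whatever /bin/sh resolves to on your system (a sketch; what it prints depends on which shell sh actually is):

```shell
# Same crafted environment variable, but run through sh instead of bash.
# dash/ash ignore the trailing command; a pre-patch bash /bin/sh would
# print "vulnerable" before "test".
env x='() { :;}; echo vulnerable' sh -c "echo test"
```

On Debian/Ubuntu (dash) or a patched bash, only "test" appears.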
But I'm not knowledgeable about all the default scripts that launch things on the Mac. It's unclear to me whether any standard processes on OS X invoke bash.
By the time AcceptEnv has any effect, the user has already logged in and can run whatever they want anyway. If you're allowing untrusted users to authenticate over ssh then, sure, it matters, but you'd be in a niche situation (and you'd know it).
It can potentially be exploited via anything that shells out to bash with an environment containing variables whose values ultimately come from an untrusted source.
mod_cgi is just one of the most obvious attack vectors.
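A minimal sketch of that CGI-style vector (the header and payload here are hypothetical, and the actual injection only fires on a pre-patch bash): the web server copies attacker-controlled request headers into environment variables before shelling out to a handler.

```shell
#!/bin/sh
# A CGI server exports the attacker-controlled User-Agent header as
# HTTP_USER_AGENT before running the handler.  If the handler -- or
# anything it shells out to -- is a vulnerable bash, the function body's
# trailing command runs at shell startup, before the handler itself.
HTTP_USER_AGENT='() { :;}; echo pwned'
export HTTP_USER_AGENT
sh -c 'echo handling request'   # a vulnerable bash would print "pwned" first
```

The key point is that the attacker never needs shell access: any code path that inherits the environment and then starts bash is enough.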
Tschinkel dug for termites in one or two circles and returned in 2007 to investigate, and hopefully prove, his hypothesis. “It took us about three days to establish, without a doubt, that termites were absolutely nothing to do with this,” Tschinkel says.
Not too much, just one thing: if we could simulate the universe, then of course there would be a simulation inside the simulation, and so on. And maybe there is a fundamental, yet undiscovered, 'law of computation' that a sufficiently complex simulation can embed itself only on a slower timescale, so even if everything is deterministic, we theoretically cannot compute fast enough to calculate the future.
Embedding itself is one thing, but what about embedding a small portion of itself? Suppose the area to be simulated was merely the Solar System (with the rest faked), and the size of the computer far larger...