Kaspersky never specifically claims a recent or Stuxnet infection of the ISS, just of a power plant, so he's probably referring to the above as an example of air gap not being enough to prevent attacks.
The commercial laptops (originally running Win98, then NT, then XP, now moving to Debian 6) are used for normal computing and for interfacing with segment control. The segment computers (60+) which actually run the station are heterogeneous, bespoke, aerospace-heritage hardware and software. The Russian side and the American side in particular are very different and almost entirely independent. Pretty darn sure there are no Siemens systems running up there, so Stuxnet wouldn't be a problem.
The embedded systems are probably not all that hardened, but they are not widely distributed (to say the least) and are hard to get to, so it would be awfully hard to target them. Possible, but mostly through attacking the developers on the ground, I should think.
The corollary to that prediction: wherever one finds software, one will also find bugs and malware.
So, bugs and malware everywhere -- in our phones, TVs, ovens, vehicles, factories... and space stations.
Perhaps some sort of a similar homeostasis will also be reached by software.
1) A virus provides some additional benefit. For example, if a virus's first stage acts as a sort of installer and pulls in some additional libraries for its own use, then at some level of public infection there's a pretty good chance those libraries are available to anyone programming for those systems. This becomes more feasible if AV cleaned out the infectious and malicious portions while leaving the libraries behind. Imagine: every system has libpcap on it...
2) Viruses start opening previously locked-down portions of the OS to achieve their goals, and developers start taking advantage of these openings to extract more value from what they provide (becoming, in fact, somewhat malicious themselves). This is already the case to some small degree with browser exploits. Keygen utilities that install malware seem to fit this somewhat.
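The library side of point 1 can be sketched as software that probes at runtime for a library something else (legitimate or not) happened to leave behind, and opportunistically uses it. A toy Python sketch, with a placeholder module name standing in for the hypothetical leftover library:

```python
import importlib
import importlib.util

def optional_import(name):
    """Return the named module if it is importable on this system
    (however it got there), else None."""
    if importlib.util.find_spec(name) is None:
        return None
    return importlib.import_module(name)

# Hypothetical usage: depend on a capture library only if some earlier
# install, legitimate or otherwise, left it on the system.
pcap = optional_import("pcap")  # "pcap" is a placeholder module name
if pcap is None:
    pass  # fall back to a reduced-feature code path
```

Plenty of real software already does this kind of optional-dependency probing; the scenario above just imagines the dependency having arrived by less savory means.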
This all presupposes a much more computer-integrated and yet less computer-aware society, which is where I think we are heading. More and more items have integrated computers and operating systems, yet we are much less aware of when this is the case (of course certain groups, such as HN regulars, are more likely to be aware of these items).
Edit: Here's a sample scenario for you: You turn on your new computer and attempt to use it to watch videos on your favorite site, but it doesn't work. You try again tomorrow and it does. What happened in between is that a virus infected your system, installed a hacked DivX codec, and made some changes in your browser so it redirects you to certain specific sites more often. The author gets information about your system far beyond what is normally allowed, and you get access to videos as you normally did (possibly illegally).
I work as a malware and enterprise security analyst, and I can certainly see no current or future benefits to malware. I think the "cons" of them harvesting your email, banking, and social networking credentials, abusing your system resources to spam and DoS, and siphoning important information from your hard drive to a remote server certainly outweigh any incidental benefits that may occur.
Edit: s/Stephens/Stephenson/, + classic
Additionally, for point 2, locked down OSes usually have a locked down app store. This means those app store tester machines would also need to be compromised, as well as the app store testing suite. Otherwise the testing suite would notice a dependency on not-allowed functionality.
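The check described in that last sentence can be sketched as a toy static scan over submitted source code. The deny-list here is hypothetical, and a real app-store review pipeline is of course far more involved (binary analysis, dynamic testing, human review), but the principle is the same:

```python
import ast

# Hypothetical deny-list of modules the store does not allow.
FORBIDDEN = {"socket", "ctypes"}

def forbidden_imports(source: str) -> set:
    """Return the set of forbidden modules a submitted source file imports."""
    hits = set()
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.Import):
            hits |= {alias.name for alias in node.names if alias.name in FORBIDDEN}
        elif isinstance(node, ast.ImportFrom) and node.module in FORBIDDEN:
            hits.add(node.module)
    return hits
```

A compromised testing suite, as the comment notes, would simply stop reporting these hits.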
As for OS lockdown, while I can see why you went towards that interpretation, and on re-reading it does make sense, I was actually thinking of something much more general (so the OS is a bit of a red herring): any hole that allows root/administrator access (so, on any multi-user secured OS), or even something that loosens browser security to make more, probably less safe, access available to web pages.
You made me curious to know if we would literally die without gut bacteria.
The answer is no: "humans can live without gut flora".
(Note: no criticism is intended. I understand the larger point you're making.)
In nature, a virus or parasite can't spread too fast or hurt its host too much or its reproductive cycle will not be able to sustain itself. We see this in software too: a virus that is too nasty will attract too much attention and everyone will end up installing tools to detect and remove it.
So, while you may find the occasional symbiosis in nature between host and parasite, there is little reason for that to happen in the software world. It's far easier, safer, and potentially more lucrative to fly under the radar. For example, once your software infects a computer and you get access to the owner's bank account, you only take $7.69 out once every couple of months or so instead of draining it in one go.
Moreover, a virus could be deliberately designed to mutate so as to evade anti-virus software. In fact, hasn't this already been done for ages?
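That signature-mutation idea (the classic polymorphic-virus trick) can be sketched in a few lines of harmless Python: re-encode a fixed payload under a different key each generation, so the stored bytes never look the same twice even though the decoded result is identical. This is purely a toy illustration of the encoding step, not real malware:

```python
import os

def mutate(payload, key=None):
    """Re-encode the payload under a single-byte XOR key, prepending the
    key so a tiny decoder stub could recover the original bytes."""
    if key is None:
        key = os.urandom(1)[0] or 1  # avoid key 0, which changes nothing
    return bytes([key]) + bytes(b ^ key for b in payload)

def decode(blob):
    key = blob[0]
    return bytes(b ^ key for b in blob[1:])

payload = b"do_something_unwanted()"
gen1 = mutate(payload, 0x41)
gen2 = mutate(payload, 0x7F)
# gen1 and gen2 have different byte patterns (different "signatures"),
# but both decode to the identical payload.
```

Real polymorphic engines also mutate the decoder stub itself, which is why AV vendors moved from pure signature matching toward heuristics and emulation.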
This link auto-plays a video with sound enabled for me (about Snowden, in the upper right corner, totally unrelated to the fine article about the ISS).
This doesn't do either.
Edit: This part seems to have been removed.
It is very difficult not to read a strong implication into that line. Otherwise the line is useless and states nothing at all.
FTR: I'm a Windows user.
And while, yes, in-house software would be safe against most existing, general-purpose malware, the ISS would not be safe from direct attacks. Those are the real threat anyway...
While direct attacks are a real threat, I don't see anywhere saying this was a direct attack. Not using an off-the-shelf OS could have mitigated the attack if it wasn't a direct attack.
> Kaspersky told the Press Club that creating malware like Stuxnet, Gauss, Flame and Red October is a highly complex process which would cost up to $10 million to develop.
Really, $10 million to disable one strategic facility (or maybe N facilities) is expensive? That's probably the cost of a dozen smart bombs, and you can use the digital counterpart much more stealthily.
Sounds like a bargain.
> Kaspersky said he had been told that from time to time there were "virus epidemics" on the station.
Given the total lack of supporting evidence here, I'm going to stick a big ol' [citation needed] sticker on this.
It's even been on HN before.
EDIT: Sorry! This is really grumpy.
> "the Stuxnet worm incorporates several sophisticated means of propagation with the goal of eventually reaching and infecting STEP 7 project files used to program the PLC devices. For initial propagation purposes, the worm targets computers running the Windows operating systems. However, the PLC itself is not a Windows-based system but rather a proprietary machine-language device."