Linux To Become Exclusive OS Of The International Space Station (redorbit.com)
321 points by moo on May 26, 2013 | 55 comments


Finally! This story has been submitted so many times over the past few weeks; it's nice to see it get some traction at last.

Other sources may have different levels of detail, or different takes on the story, so in case anyone is interested in reading the story from other sources, here are some of the other submissions:

https://news.ycombinator.com/item?id=5668312 (zd.net)

https://news.ycombinator.com/item?id=5669927 (zdnet.com)

https://news.ycombinator.com/item?id=5677045 (linuxfoundation.org)

https://news.ycombinator.com/item?id=5680490 (extremetech.com) (4 comments)

https://news.ycombinator.com/item?id=5686586 (gizmodo.com)

https://news.ycombinator.com/item?id=5687720 (readwrite.com)

https://news.ycombinator.com/item?id=5689142 (venturebeat.com)

https://news.ycombinator.com/item?id=5695447 (telegraph.co.uk)

https://news.ycombinator.com/item?id=5711378 (vice.com)

Interestingly, very few comments or upvotes on any of them.


"Back in 2008, a Russian cosmonaut managed to take a laptop to the ISS that spread the W32.Gammima.AG worm to all the other laptops aboard the station."

Holy crap.


It is probably a wise move for them to switch to Linux, given that many of their other systems run Linux too.

But Windows security is a poor argument, imo. If they had upgraded to Vista in 2008 (which was working quite well by then), or had managed those laptops better, that incident would not have happened. Now, five years later, they still have not upgraded to a modern Windows version. After so many years, anything is going to be more secure than XP.

Nowadays, you have to patch and update to have a secure system. Freezing the software to guarantee stability makes it inherently insecure, Windows or Linux. There is no difference other than the larger base of existing viruses for Windows.

Also, I wonder what this means for power usage, which is not unimportant in space. Windows still gets longer use out of most laptop batteries than Linux.


> Nowadays, you have to patch and update to have a secure system. Freezing the software to guarantee stability makes it inherently insecure, Windows or Linux. There is no difference other than the larger base of existing viruses for Windows.

The extent of the effort needed to keep an updated system running is directly related to the size of the installed system. If all you've got is ~30 carefully picked packages to create a fully custom ISS installation, it's not that hard to go through every single line of the patches that are needed. This is a huge difference from Windows Vista/7/8, where such a review is not even possible and the number of "packages" is astronomically larger.
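
A minimal sketch of that kind of review gate, assuming a Debian-style apt system; the vetted package names are made-up placeholders:

    # Flag pending upgrades against a small, vetted package set.
    # Assumes apt-get is available; the whitelist below is hypothetical.
    import subprocess

    VETTED = {"linux-image-amd64", "openssh-server", "ntp"}  # placeholder whitelist

    def pending_upgrades():
        # "-s" simulates the upgrade; nothing is actually installed.
        out = subprocess.run(["apt-get", "-s", "upgrade"],
                             capture_output=True, text=True, check=True).stdout
        # Simulation lines look like: "Inst openssh-server [old-ver] (new-ver ...)"
        return [line.split()[1] for line in out.splitlines() if line.startswith("Inst ")]

    for pkg in pending_upgrades():
        status = "review the patch" if pkg in VETTED else "REJECT: not on the vetted list"
        print(pkg, "->", status)

With only ~30 packages, the whole review queue stays short enough that a human can actually read every changelog.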

> Also, I wonder what this means for power usage, which is not unimportant in space. Windows still gets longer use out of most laptop batteries than Linux.

I would almost call FUD on that, but I will assume good faith. Back in the Vista era, Linux systems tended to get several times more battery life than Windows. After Windows 7 was released, and while Linux had a power management bug, Ubuntu got a reputation for worse power management than Windows.

Now, since the Linux 3.x kernels, Linux machines again tend to have much better power management than Windows 8. There is some variation depending on which distro you install and what additional fluff is running in the background. Also, UEFI firmware is known for slightly worse performance with Linux than with Windows.


> I would almost call FUD on that, [...]

This depends on the combination of hardware and distro. Windows simply has the advantage that every hardware manufacturer out there develops for Windows. So yes, you can get power efficiency with Linux similar to Windows, and you can get better efficiency than stock Windows if you are willing to put in a bit of work. However, if you compare a stock $DISTRO install with a stock Windows install, chances are you get less efficiency, because the manufacturer optimized for Windows.

In the case of the ISS, I do not have the slightest idea whether this matters. But I would not be too surprised if the ISS guys are quite happy to now operate a 22W Ivy Bridge in a place designed for a 60W P4.


If all you've got is ~30 carefully picked packages to create a fully custom ISS installation

With 30 packages, what you've got is a toy, perhaps a single-purpose embedded installation, not a useful OS; and IMO the knowledge required to curate such minimalism has a high cost. I spend far more time updating a ~40G Linux installation than I do Windows - Ubuntu's package manager accosts me at least once a week. At least the pestering for reboots is a lot easier to ignore - on Windows I keep a script running in a loop to dismiss the prompt, after once losing work to an ill-timed reboot prompt stealing focus.
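
For the curious, the loop is nothing fancy - something along these lines (the dialog title is a guess and varies by Windows version):

    # Periodically find and dismiss the Windows Update restart prompt
    # before it can steal focus. Windows-only; the window title below
    # is an assumption, not the exact string on every Windows version.
    import ctypes, time

    user32 = ctypes.windll.user32
    WM_CLOSE = 0x0010
    TITLE = "Restart your computer to finish installing important updates"  # guess

    while True:
        hwnd = user32.FindWindowW(None, TITLE)
        if hwnd:
            user32.PostMessageW(hwnd, WM_CLOSE, 0, 0)  # ask the dialog to close
        time.sleep(60)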

UEFI firmware is known for slightly worse performance with Linux than with Windows

No shit. I wasted a whole day - last Friday, to be exact - figuring out how to get Ubuntu 12.04.2 LTS working on a modern UEFI machine with full disk encryption and an Nvidia graphics card. UEFI doesn't supply the old BIOS text routines, and Ubuntu's GRUB2 defaults don't work, such that when you switch from nouveau's hilariously unstable drivers (hard X.Org crash if you move windows too quickly) to nvidia-current you don't get a prompt at boot-up to enter your password - just a black screen of apparent death. You have to configure a framebuffer that works with the Nvidia card and live with NVRM's kernel log warnings about only supporting VGA text consoles.
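
For anyone hitting the same wall: the usual starting point (not necessarily the exact fix I landed on) is telling GRUB to keep its framebuffer for the kernel so the passphrase prompt is actually visible - in /etc/default/grub, followed by update-grub:

    # /etc/default/grub - hand GRUB's framebuffer on to the kernel;
    # the mode value is a guess, pick one your card supports.
    GRUB_GFXMODE=1024x768
    GRUB_GFXPAYLOAD_LINUX=keep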

Most of all, I really resent Ubuntu forcing me to learn how all these pieces are put together just so I could fix it. Using it as a desktop OS daily, I remain more convinced than ever that it is best put to work via an ssh session from a different OS. The command line is the only UI on it that reliably works.


The ISS will be running Debian[1], not Ubuntu.

[1] http://www.eteknix.com/international-space-station-starts-us...


> Most of all, I really resent Ubuntu forcing me to learn how all these pieces are put together just so I could fix it. Using it as a desktop OS daily, I remain more convinced than ever that it is best put to work via an ssh session from a different OS. The command line is the only UI on it that reliably works.

Why is it Canonical's fault that Microsoft paid for (not exclusively, I know Intel was also involved) and pushed UEFI onto all modern motherboards? BIOS worked fine with Linux for a decade; UEFI has been terrible because the efforts behind its adoption targeted only Windows, and some boards obfuscate or just outright don't support other OSes.


Your experience doesn't really apply to how production systems are managed. And 30 packages is enough, and is useful. Production systems don't update themselves just because there's a new package in the repo; that's what regular users do.

In production systems, updates are rolled out only if there is a must-have benefit or a serious security patch.

I don't really get your point. If you don't like Linux, then don't use it; it's pretty easy to manage.


Anybody know if they run 2.6 or some newer kernel?


> If they had upgraded to Vista in 2008 ....

Keeping your Windows up-to-date might require updating (some of) your hardware too. And hardware updates are expensive if they have to be delivered to low earth orbit.


Yes. We have lab PCs here still running Windows NT because the upgrades for the hardware are either unavailable or would be prohibitively expensive (> $20K), and still would not implement all the functionality. We tend to hide these from our well-protected corporate network behind a boat-anchor PC running XP with all the service packs and antivirus. We will run these systems until they can no longer be repaired, because management does not want to spend the money to replace them.


> If they had upgraded to Vista in 2008

For that, they'd probably have to certify that all their software and flight-certified hardware functions correctly with Vista. They'd have had to start certifying against Longhorn back in 2003.


I don't think that is so different from a move to Linux. Certainly, if you do a Linux kernel upgrade, you will want to recertify. And Linux kernels may not have a longer shelf life, either:

- https://www.kernel.org/category/releases.html lists 2.6.32 as the oldest kernel with long term support. It was released in December 2009, not yet 4 years ago.

- Red Hat supports its releases for 10+3 years. That is in the same league as XP.

- Ubuntu's long term support is 5 years.

Also, I think part of the problem is that astronauts will bring their own laptops. It may be hard to force them to run these long-term-support kernels (yes, allowing that sounds stupid for a facility where access is so tightly controlled, but space organizations have to take the psychological health of their personnel into account, too).


> if you do a Linux kernel upgrade, you will want to recertify.

The difference is that, if you automate the certification process, you can run it against the bleeding-edge branches all the time, well before the packages are incorporated into the official release. Any sign of trouble will show up months, if not years, before it's time to upgrade.

You can have a cozy enough relationship with Microsoft, but I doubt they'll let you put their sources on your Jenkins (or BuildBot) servers so that you can run your certification tests continuously. To say nothing of making the Windows build process CI-friendly. That in itself is an undertaking bound to take longer than the shelf life of any operating system.
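
A minimal sketch of the kind of loop I mean - the repo URL, branch, and certification suite are all stand-ins, not anyone's real pipeline:

    # Continuously re-certify against the bleeding edge: fetch the branch,
    # build it, run the test suite. The cert suite script is hypothetical.
    import subprocess

    REPO = "git://git.kernel.org/pub/scm/linux/kernel/git/torvalds/linux.git"
    BRANCH = "master"

    def sh(*cmd, cwd=None):
        subprocess.run(cmd, cwd=cwd, check=True)  # raise on any failure

    sh("git", "clone", "--depth=1", "--branch", BRANCH, REPO, "linux")
    sh("make", "defconfig", cwd="linux")
    sh("make", "-j8", cwd="linux")          # build the candidate kernel
    sh("./run-certification-suite.sh",      # hypothetical flight-cert tests
       "--kernel", "linux/arch/x86/boot/bzImage")

Wire that into Jenkins on a timer and trouble surfaces as a red build, months before anyone has to commit to the upgrade.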


Why is it always the Russians or the Chinese? Why isn't there an American or Western European scientist/cosmonaut committing similar mistakes/blunders?


Different cultures, different attitudes towards these things.

I was recently in Aschaffenburg (Germany) doing interoperability tests with some of our networking equipment and a large SCADA vendor. About two hours into the tests, we ran into some problems, which we quickly identified as being caused by a very restrictive firewall rule (it wouldn't allow ping) on one of the test laptops in their lab. We had to wait for the better part of two hours while they got approval to adjust the firewall rules to allow ping from another device in the same lab. Note that this was a lab laptop, and these were technically sophisticated people - they understood the implications of allowing ping - it's just that their policies/procedures with regard to security were very, very strict.

I can guarantee you that in any interop tests I've done in Silicon Valley, there would be zero friction (even with the utility companies, enterprise software vendors, SCADA vendors) around opening up ICMP on a lab laptop's firewall.

I cite this as an example of Americans (at least those in NorCal) being more lax regarding security than another culture (in this case, whatever is prevalent in the part of Germany where Aschaffenburg is located).


You can guarantee that? I don't think so. I'm not a sysop, but if I were, you would need to explain to me why ICMP needs to be open.

Whether I'm in NorCal or not, most sane organizations are going to turn off ping. It's not a cultural thing, and it doesn't seem like you faced much friction, since you got it opened in two hours.

I've been working here in NorCal my whole life, and I don't see a culture of lax security in shops that have actual security requirements. It's not the country; it's the quality of the network administration team.


I've been a network engineer for 15+ years, and I'm saying that 95% of the employees I deal with in the Bay Area would not think twice about opening up a firewall rule and allowing you to ping a lab machine, particularly when the purpose of your engagement is to verify end-to-end network connectivity. They would make the call themselves, on the spot.

The reason I bring this up is that I've been told by a number of my colleagues that German culture, in particular, tends to be more rule-structured than that of the United States; so, if there was a security policy in place, it had a greater chance of being adhered to in Germany than in NorCal.

And the entire point of my little anecdote was to make clear that there is a broad diversity of attitudes towards rules and policy - and that the United States doesn't have a lock on following them.


  Most sane organizations are going to turn off ping
Why? That never made sense in my world. Now, I'm not a network ops guy, but... why would you ever do that?

And that ignores related moves (like blocking all of ICMP, while we're at it).


I don't know about Russia, but when I lived in China, I discovered that the Chinese expect PCs to have malware. From my tailor to our factories, every machine I used was infected. Opening a new web page on someone else's PC always started a game of whack-a-mole to close all the popups. Even the pirated XP CDs they'd sell everywhere came with malware pre-installed.


Because you are reading American/Western European news.


Because people from poorer countries like these use their employers' computers to download movies, games, and music from P2P networks. Westerners have more disposable income, so they a) use their own computers for entertainment and b) buy stuff from Amazon and iTunes instead of searching for warez.

So it's about income and culture. Company cars are also viewed as a perk, and it's not unusual to take a company car for a picnic.

Employers are usually fine with this - it improves morale and is less costly than actually paying more.


An interesting idea.

s|movies, games, music|games/software| and it might be more plausible. (.mkv, .mp3 or even the .rmvb that's weirdly popular in China are hardly common attack vectors.)

However, if we're talking about China, and the demographic of employees who get sent abroad on international matters, then we're definitely not talking about people who can't afford their own computers. I would posit this holds for nearly any country (except North Korea?).

Taking your argument as just "a culture of grabbing untrustworthy software from anywhere is the reason", it becomes more persuasive to me.


Before assuming they are the ones "always committing mistakes/blunders", please provide a source or at least some examples.

You are making use of the logical fallacy "Complex Question Fallacy" - http://www.logicalfallacies.info/presumption/complex-questio... - and I do not approve.


I don't think winter_blue meant what you think he/she meant. The question is why we hear these stories more often than American-made blunders - which I am not even very sure we do.


That you hear of anyway.


You read the book just the way the publisher published it and, in the end, the way the press guy printed it.


A good and suitable use of Linux-based systems. They can now build custom installations from individually vetted packages. They also have the source code, so there is no limit on how carefully reviews and continuous testing can be done. During crunch time, they can even crowdsource.

The malware threat will also be lower, mostly thanks to not installing every single default program and service that Windows comes pre-installed with. As long as they are not completely lazy and don't just install a massive DVD-sized distro like Ubuntu, the attack surface and the number of packages that need vetting should be quite manageable.


(slightly) more detailed article: http://www.extremetech.com/extreme/155392-international-spac...

In related news, Debian 6 apparently ships with support for the International Space Station, but still can't manage to sleep/resume peacefully.


It slept/resumed quite wonderfully here, and it continues to do so after the upgrade to Wheezy/Debian 7.

Did you break something?


More likely a bug in the ACPI support of the specific motherboard model.


If they still favor ThinkPads on the ISS they can expect a pleasant Linux experience.


Is anyone else horrified that they were running Windows?


On personal laptops? On machines that just do dietary stuff? No, it's okay.

It's a bit scary that this isn't the first time malware has made it into space, or even onto the Space Station.

Considering how intrusive the procedures for astronauts must be, I'm genuinely surprised that intensive malware scanning isn't a requirement for anything that goes into space.


When I read that bit, it dawned on me just how plausible all those worst-case alien-movie scenarios might turn out to be... Oh sure, it starts innocently enough: someone shows up for work feeling slightly "under the weather," one thing leads to another, and the next thing you know you're transporting off-world egg sacks back in your chest cavity.


"Malware scanning" is a pretty dumb process. All it does is to check for certain known signatures. Scanning for malware doesn't fix the problem.


Yes, I agree that malware scanning is dumb.

But in this case it would have prevented malware from being taken into space.


Which distro?



Now that is what I would call a "success story". How many distros can claim to be used in space? :)

I'm glad to see Debian being recognized for its rock-solid stability.


On a similar theme, the developers of matplotlib once got a polite enquiry about a bug report which mentioned that "the feature is needed for the Phoenix project, and their arrival at Mars will be in March sometime."

Needless to say, the bug got fixed. A team from JPL is listed in matplotlib's credits for various improvements.


NetBSD: https://en.wikipedia.org/wiki/NetBSD#Examples_of_use

Agreed. The Debian developers should feel proud of the accomplishment.


It's not "The universal operating system" for nothing.


Completely n00b question: how does Debian compare with Ubuntu?


>> It's not "The universal operating system" for nothing.

+1

>> completely n00b question : how does Debian compare with Ubuntu ?

http://www.ubuntu.com/about/about-ubuntu/ubuntu-and-debian


I'm not claiming to be an expert, but:

Ubuntu is built on top of Debian with some different parts and added stuff.

Ubuntu is more bleeding edge than Debian (unless you use Debian Sid).

Ubuntu is supposedly more user-friendly and geared toward the desktop. Debian is supposedly geared toward servers, but I've used Debian as both a desktop and a server. All the magical stuff Ubuntu has, you can install onto Debian.


Finally it gets the respect it deserves. I hope the project will receive more respect and openness from those damn hardware manufacturers.

Keeping hardware artificially closed needs to end.


Wow... so the $150 billion ISS was using Windows XP until now?!


Great news.


How is it not already the exclusive OS?


It seems the Debian guys are just going to drag us kicking and screaming to that destination.


What do you mean?


Ah, you meant the exclusive OS for the space station. I was thinking far more broadly. :)


Laptops? On laps? Weightlessly?



