Hacker News
Mars becomes the 2nd planet that has more computers running Linux than Windows (twitter.com/mikko)
1185 points by fireball_blaze on Feb 19, 2021 | 283 comments

Here [0] is the paper describing the hardware of Ingenuity in more detail

Things that stand out to me: It uses mostly off-the-shelf electronic components that are only automotive/industrial grade!

- 2.26 GHz Quad-core Snapdragon 801

- Texas Instruments TMS570LC43x (2x for tolerance)

- Sony 18650 LiIon batteries

- ZigBee to communicate with the rover

The only part that is somewhat special is the radiation tolerant FPGA ProASIC3 that ties everything together and takes care of power cycling other components when they lock up.
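
The power-cycling that the FPGA does for locked-up components is essentially a hardware watchdog. Here's a minimal software sketch of the same heartbeat pattern (all names hypothetical; the real supervisor lives in the ProASIC3 fabric, not in Python):

```python
import time

class HeartbeatWatchdog:
    """Power-cycle a component if it stops sending heartbeats in time.

    `power_cycle` is a hypothetical callback standing in for whatever
    actually toggles the component's supply rail.
    """

    def __init__(self, timeout_s, power_cycle):
        self.timeout_s = timeout_s
        self.power_cycle = power_cycle
        self.last_beat = time.monotonic()

    def heartbeat(self):
        # Called by the monitored component while it is healthy.
        self.last_beat = time.monotonic()

    def check(self):
        # Called periodically by the supervisor; returns True if it
        # had to power-cycle the component.
        if time.monotonic() - self.last_beat > self.timeout_s:
            self.power_cycle()
            self.last_beat = time.monotonic()  # restart the grace period
            return True
        return False
```

The important property is that the supervisor is simpler (and here, rad-tolerant) compared to the components it resets, so a lockup in the fancy parts never becomes fatal.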

Too bad they will probably only fly it a few times, since the rover has to move on and it's just a tech demo. I so wish it would follow Perseverance on her mission; that would be so awesome to see. It's certainly capable of doing that!

0: https://trs.jpl.nasa.gov/bitstream/handle/2014/46229/CL%2317...

You made my day! I worked on Snapdragon 801 many years ago.

It doesn't look like Ingenuity is using its GPU for anything, but knowing that a tiny bit of the work we did is now on another planet fills me with joy.

This must be such a cool feeling.

Congratulations. And thank you.

You know.. thank you.

Considering that Perseverance is running your standard rad hardened PPC from the late 90's, I think that speaks a bit to the expected lifespan and importance of Ingenuity. If the POC goes well, I expect we will see hardened components in the next revision. Then again, if Ingenuity is doing anything too modern, I doubt a PPC 750 will be able to keep up.

How are they doing machine learning with PPC from the late 90s?

It doesn't do machine learning. It does stereo reconstruction, hazard analysis, map generation, and path planning using stuff like D* and Gestalt.

It also physically moves very slowly, which gives the ancient processor time to keep up.
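
For a flavor of what that planning involves, here's a toy grid planner around hazard cells. It uses plain A*, not the incremental D* the rovers actually run (D* repairs its plan as new hazard data arrives); purely illustrative:

```python
import heapq

def plan_path(grid, start, goal):
    """A* on a 4-connected grid; 1 = hazard cell, 0 = traversable.

    A toy stand-in for the real planners (D*, Gestalt): those replan
    incrementally as sensing updates the hazard map, which plain A*
    does not.
    """
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]
    came_from = {start: None}
    cost = {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nxt = (nr, nc)
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                new_cost = cost[cur] + 1
                if new_cost < cost.get(nxt, float("inf")):
                    cost[nxt] = new_cost
                    # Manhattan-distance heuristic toward the goal
                    prio = new_cost + abs(nr - goal[0]) + abs(nc - goal[1])
                    heapq.heappush(frontier, (prio, nxt))
                    came_from[nxt] = cur
    return None  # no hazard-free path exists
```

Even this toy version makes the "slow rover, slow CPU" point: on a coarse local map the search is tiny, so a 90s-era processor has plenty of time per drive step.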

It is the main processor, tying together a number of hardened FPGAs. They’re also old, but the point is they can be reprogrammed from Earth to do whatever NASA needs.

I wonder if the ZigBee is simply being used as a serial bridge between a UART on the rover and the Snapdragon 801 on the helicopter?

(Reading the PDF: yes, it is.) I imagine there is not much concern for an urban noise floor or choosing a clean channel, as there might be with city use of ZigBee here on Earth. In some places the ISM 900 band can be completely obliterated by smart meters and baby monitors and stuff.
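
To illustrate the "serial bridge" idea: a transparent link like this typically just frames raw UART bytes so the receiver can resynchronize and detect corruption. A hypothetical minimal framing with a sync byte, length, and checksum (the actual rover/helicopter protocol is not public in this detail):

```python
def frame(payload: bytes) -> bytes:
    """Wrap a payload for a lossy serial link: sync byte, length byte,
    payload, one-byte additive checksum. Purely illustrative."""
    body = bytes([len(payload)]) + payload
    checksum = sum(body) & 0xFF
    return b"\x7e" + body + bytes([checksum])

def deframe(packet: bytes):
    """Return the payload, or None if the frame is malformed or corrupt."""
    if len(packet) < 3 or packet[0] != 0x7E:
        return None
    body, checksum = packet[1:-1], packet[-1]
    if (sum(body) & 0xFF) != checksum or body[0] != len(body) - 1:
        return None
    return body[1:]
```

With a bridge like this, neither end needs to know it's talking over radio at all; the ZigBee module is just a wire with occasional bit errors.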

I'm shocked! They used the non-ROHS ProAsic3L(ead)! How dare they?

j/k lol ;-)

You kid, but space applications are one of the specific exemptions to RoHS, basically because tin whiskers are extra bad in space... lead solder is still specified here.

Nobody tell him how they power this thing.

Nuclear-powered Linux computers on Mars...

Only until they undock it from the rover; then it will have to survive on its solar panel! It's also probably the 'coolest' Linux ever, as temps can go down to -100°C on Mars at night and they are only heating the batteries.

I love Mark Rober’s explanation of this. See the lemon battery video at 9m30s https://youtu.be/a1D-fZP8qJk

The flight software and embedded systems framework for the Ingenuity helicopter is called F' (pronounced F Prime) and is open source. Find it here:


> pronounced F Prime

Some people really hate Amazon!

or they are paying respecs

or they are just taking the first derivative of ƒ. who knows, really!

which is likely a common operation in flight control software ;-)

or really dislike composite numbers


f' the father of f#

C++ and Python.

I always thought NASA used languages that were easier to verify (functional languages?). Good to see that the common boring languages are good enough!

I have not looked at any of it, but would think they'd back it up with some form of math proof for at least some modules, and then applying those algorithms is just like any other C++ or Python code.

People keep saying functional languages are easier to verify but most practical verification tools (model checkers and the like) are built on top of imperative languages.

Ingenuity is a tech demo, so they don't go all out: to conserve limited development resources it uses mostly off-the-shelf hardware and software.

That's the framework, but is the actual flight software open source? I highly doubt it.

I don't see why it wouldn't be...

It's not like NASA is worried someone else is going to steal their software and send their own rover to mars using it...

More likely, they may be worried someone is going to find a vulnerability and take control of their multi-jillion-dollar piece of equipment.

At least in this case physical access is an extremely improbable attack vector...

If a third party can break into the satellite and communications firmware of two different spacefaring nations (you have to hijack an orbiter too) then there are bigger problems to worry about. The number of countries (counting the ESA as one big country) with objects in orbit around Mars can be counted on one hand.

Well, I asked some people who are world-class competitive security players and had success in Hack-A-Sat how the security of satellites and such holds up, and they said it ain't good.

On the other hand, I’m not sure satellites launched years ago have auto-upgrades enabled and up-to-date security policies (and certificates!), especially upgrade policies for the embedded chipsets. That must be an interesting problem to solve.

True. I'm not saying they should be worried about that at all, but it seems like something somebody might be worried about anyway.

That occurred to me as well, but I imagine this is separated from the communications.

Maybe it's not controlled at all... it does say autonomous, after all.

I imagine all commands have to get signed for the rover to accept them, but even if their keys were compromised and some vulnerability was found in the software, wouldn't you also need a network of really big dishes (see DSN) to actually send the commands to the rover?
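
The "commands have to get signed" idea, in its simplest generic form, is an authenticated-command check like the following. This is just the textbook HMAC pattern, not NASA's actual uplink protection (which isn't described in this thread); the key and command names are made up:

```python
import hmac
import hashlib

SECRET = b"ground-station-key"  # hypothetical shared secret

def sign_command(command: bytes, key: bytes = SECRET) -> bytes:
    """Append an HMAC-SHA256 tag so the receiver can reject forgeries."""
    return command + hmac.new(key, command, hashlib.sha256).digest()

def accept_command(blob: bytes, key: bytes = SECRET):
    """Return the command if its tag verifies, else None."""
    command, tag = blob[:-32], blob[-32:]
    expected = hmac.new(key, command, hashlib.sha256).digest()
    # compare_digest avoids leaking the mismatch position via timing
    return command if hmac.compare_digest(tag, expected) else None
```

Note the attacker still needs both the key and a transmitter the rover can actually hear, which is where the DSN-sized-dish objection comes in.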

> Wouldn’t you also need a big network of dishes

Nah, you can subscribe to big-dishes-on-demand using AWS Ground Station.

Wait, they did exactly that: https://aws.amazon.com/ground-station/

While some innovate, once again AWS reaps the margins of the whole space conquest by discreetly providing the infrastructure that everyone needs. Clever!

I honestly chuckled because of your AWS parody.

Then I clicked the link...

Same. Now I'm wondering what they're not offering.

A lot of speculation here, but the real reason is likely a mix of export control (ITAR) and proprietary components. Essentially, anything that could potentially be used in a missile is very tightly controlled technology and is difficult to release to the public.

Generally these things don't get open sourced. They really should be in the long run but these things take time.

Even journals don't insist on code yet, unfortunately.

They might have licensed closed source components from third party vendors that prevent them from open sourcing the whole thing.

Maybe, but I doubt it. I'm currently working on another NASA rover project (going to the Moon, not Mars) and we don't use any third party code. Our code is based off of another open source NASA project (Core Flight System), but we don't open-source our exact flight software. Mostly this is simply because we have to go through a lengthy vetting process to ensure we don't inappropriately release anything that falls under ITAR or EAR, and we just don't have the extra time or budget for it.

Typically at NASA it's the more open-ended research projects that have time and support for open sourcing their code. There's less benefit in releasing a specific rover's flight software than there is in releasing the general framework that said software is based on.

Excellent info, thank you!

Prime as in Optimus lineage? Rover! Transform and roll out!

Tangentially related, but Optimus Prime has a drone - Roller.


The GPL will soon infect the whole galaxy.

Big surprise waiting for aliens that plan to steal the tech and modify it for its own purposes... without publishing their changes on Earth :)

> There’s no point in acting surprised about it. All the source code has been on display at your local galactic version control system in Alpha Centauri for 50 of your Earth years...

Yes, in a maze, with broken bulb, stolen staircase, and put in a cupboard with a sign "danger! jaguar inside".

Not really, but I know which bit you mean: "It was on display in the bottom of a locked filing cabinet stuck in a disused lavatory with a sign on the door saying “Beware of The Leopard“"

Okay, I admit reading it in another language many years ago.

Honestly, that's relatively mild compared to some other "begrudgingly releasing our GPL code" implementations.

I suddenly want to read a SciFi novel about a lawyer dispatched to Alpha Centauri as first contact to establish that the radio signals emanating from their planet infringe on Got to Give it Up by Marvin Gaye. Hey, if Snowpiercer can get a Netflix series then this is a slam-dunk for green lighting - let's serialize this while we're at it!

Year Zero: A Novel by Rob Reid is one where the entire galaxy got hooked on listening to our pop music that's been transmitting this whole time and how Earth is now owed insane amounts of money due to copyright.

Wow! This is astonishingly similar subject matter!

Just checked the ebook out from my library to read this weekend. Thanks!

Only tangentially related, but I quite enjoyed this short film on the topic of interstellar law: https://www.youtube.com/watch?v=rv8kOzRZK8g

A TV series based around the bureaucrats from A Hitchhiker's Guide to the Galaxy. Count me in.

They only have to publish the source code to people who receive a copy of a binary.

Only if their legal system recognizes earth copyrights. They have no reason to do that.

We steal genes from fungi without rewarding them for their work, and do you know what they do to silk worms? Why should advanced aliens treat us any better?

The IMF will offer them big loans with the stipulation that they “harmonize” their legal codes to include IP restrictions.

It's alright - the Trans-Galactic Partnership Treaty has been confirmed and all the signatories agreed to unilaterally include all unsigned lifeforms in the Galaxy. Honestly if these folks wanted a better deal they should've come to the table.

The problem is the IMF can't get them anything. At the speed of light they can't even ask us to send anything, they and us are at the mercy of what the other chooses to send.

Hmm, what about systems with just a single star-type object, or more than two? Sure, there might be more binary star systems in the galaxy than the others, but it still seems like a significant unaddressed edge case.

That's why we sent GPL software. It will destroy the alien civilization from within!

The GPL doesn't "infect" things. It merely requires software that uses it to be free software. For instance, you can extend GPL software, and dual-license your extension under MIT and GPL, though the GPL applies as long as your software is using the GPL component.

In this case, the software you've written to extend the GPL component is not "infected" by it.

It does not REQUIRE you to use only GPL, it requires you to make it AVAILABLE under GPL.

It only requires relicensing to GPL if the non-GPL software is "derived from" GPL component. Otherwise GPL doesn't apply except for the explicitly GPL licensed components.

At least with GPLv2

For a second I read it as GNU Propulsion Laboratory.

I am reminded of the types of OSs in Vernor Vinge's "A Deepness in the Sky" universe.

Maybe that is how the economy in the Star Trek universe came about.

Somewhere in a galaxy far, far away: https://xkcd.com/344/

At first I read this as JPL, about which probably the same thing can be said.

Please stop using the word "infect" when talking about GPL. Infect is a negatively biased term. It's not balanced regarding the benefits GPL software brought to many people.

`Infect` accurately describes a particular view of GPL. This is akin to saying "please stop holding that opinion; it doesn't match my own opinion, therefore you shouldn't hold it either".

A term can be technically accurate, but biased because of its connotations. "Infect" has incredibly negative connotations. Why not say cross pollinate?

Recently in the UK we have pundits talking about migrants as "infection vectors". This dehumanisation of humans coincides with the government illegally and immorally banning asylum seekers.

The bias is the point. It's expressing an opinion. This isn't a government scientific report where the language should be as devoid of emotion as possible, this is an internet chatboard. Expressing opinions is the whole point.

This is progress: previously you were pretending the term "infect" is neutral.

People are entitled to opinions, but we're entitled to call out negative ones.

> previously you were pretending the term "infect" is neutral.


I said "`Infect` accurately describes a particular view of GPL."

This is true, and it says nothing about that point of view being neutral. In fact, quite the opposite.

> This is true, and it says nothing about that point of view being neutral. In fact, quite the opposite.

Maybe I should have said 'legitimate'.

Basically, I disagree with your prior indignation.

> Expressing opinions is the whole point.

To think I believed it was about actual arguments the whole time

Would you need an argument to support it if it wasn't opinion?

... yes? That's how debate works

Debate is literally all about contrasting opinions.

>I feel X is the solution to Y!

>No! Z is clearly the solution to Y.

Those are not arguments, and that's not a proper debate (which is something with quite codified rules, thanks to 2000+ years of people studying what does and doesn't constitute a correct argument - see this for what doesn't: https://iep.utm.edu/fallacy/#H6).

Do you really not study that in school? This is scary

Do you really think hn should be exclusively formal arguments presented in an emotionless manner at the expense of personal flair, enforced by limitations on the diction permissible when discussing particular topics? That's scary.

... pretty much, yes? That's definitely what I'm here for. If I want emotion I read a novel. Here I just want raw information in order to optimize my work & life, or to discuss the projects that are of interest to me, to make sure there is enough public interest in them that they keep being maintained until I don't need them anymore.

But then, I long for the end of individuality, which may not be your opinion. :-)

If that were true you wouldn't even read this thread, let alone take the time to respond to it.

I think there's a real difference with this usage compared to "infection vectors" as used by pundits - everyone in the UK is an infection vector and migrants aren't particularly more likely to be infected than any other travelers[1] so their usage is a clear mis-attribution intended to slander a class of people. I also, honestly, will tend to give a lot more benefit of the doubt to slander when it comes to living breathing humans compared to software licenses but I'm trying to suppress that in this line of discussion.

1. As far as what I've seen reported.

I periodically say things with a positive connotation, then flip to saying them with a negative connotation in the same paragraph. More often than not, it is to illustrate how the way we say things can be intended to shut down conversation.

Claiming that the GPL infects software is very much one of those cases. Worse yet, it is an inaccurate statement, since the GPL does not infect software. The most you can say is that some people choose not to modify, extend, or link to GPL software since they do not want to apply the terms of the GPL to their own work. That is a legitimate concern. Implying that the use of GPL software alongside proprietary software somehow infects the proprietary software is intended to set up a negative connotation that has no basis in reality.

It is like the difference between "freedom fighter" and "terrorist": depending on your bias, the same people/things may be called differently.

It's an opinion of what the GPL is and by definition that's okay whether negative to you or not. I personally think it's kind of funny even though I strongly support both GPL and MIT as valid methods of opening up software. My opinion of "infect" is also valid as is yours. That's the beauty of free speech.

I think the usage of the term tells you something about the managers that use(d) it: For them it was really like an infection. Their developers came in contact with that free software and all of a sudden their big enterprise corp had legal obligations. It must have come as a real surprise that enforceable licenses are not a one-way road from enterprises to consumers.

Maybe we can say that GPL is contagious instead?

Infect does have a negative connotation but I can't really think of a non-negative term with the same viral connotations and that side of the connotation is rather accurate. The GPL aggressively applies itself to full code bases that adopt it - it might actually be a bit more accurate to call it cancerous I guess?

At any rate it is a rather negative word but I don't think it's fair to say that "infect" is a mis-categorization of the behavior of the GPL. And all this from someone who does personally appreciate MIT licenses more but is quite pro-GPL licensing on code.

Your whole understanding of the GPL is a mis-categorization of its behavior.

The GPL does not "aggressively apply itself to full code bases that adopt it"; that sentence is completely meaningless. I can't even find the beginning of what you wanted to convey, regardless of which side I try to take it from. If a "code base adopts the GPL", then yes, OK, "it" adopts it - is there a problem? Why would anything need to "aggressively" apply the GPL licence to a code base that adopted the GPL? If you rather mean that the GPL aggressively applies itself until code bases that adopt it are fully converted, that still makes no sense, because 1. a licence is not sentient and won't attempt to aggressively do things by itself, and 2. code under other licences exists in code bases and systems containing GPL-licensed code -- some under licences compatible with the GPL, others under licences not even compatible with the GPL.

To finish with that characterization, please provide the many examples that should surely exist of projects that had to switch to the GPL against their "will" because of its supposed "infectious" or "cancerous" nature (not just dual-licensing or using some compatible licence - switching). And then please provide, from that list, the projects where it happened for a reason other than lack of due diligence by the copyright holders of the pieces of code supposedly "infected" or tainted with "cancer" against their will.

A final note: there are at least 2 more ways out (that are not open-sourcing) if somebody made a mistake and based a work on GPL-licensed code when the conditions of the GPL are not appropriate for it: rebase the project on non-GPL-licensed software, or stop it - or at least stop its distribution. Nobody will ever manage to "retroactively" 100% force you to open-source proprietary software. And nobody can force you to change open-source-licensed code compatible with the GPL to the GPL itself; I don't even see what the point of asking for that change would be.

I work primarily developing private commercial software so I need to be very careful about the libraries I pull in and, when dealing with recursive dependency management programs (stuff like NPM) making sure that the package manager either blocks GPL licensed code or else verifying licensing on each update. If GPL code makes it into our codebase we need to dedicate additional time specifically to one of the remediations you described - they are doable but they take time, sometimes you'll be able to find a non-GPL licensed equivalent, sometimes you'll need to reinvent the wheel but either way expect some logic to subtly change and re-evaluate all your assumptions.

I would agree to the general point that using GPL'd code is entirely voluntary and using AGPL code is generally not an issue for most software projects but I disagree about it being a miscategorization. If I directly use 10 NPM packages and those pull in an additional 60 and one of those happens to pull in lpad which just happens to be updated from MIT to GPL licensing then I need to realize that and then either hackily re-wire the vendored code to not use lpad, possibly hardwire the packages file to force the pre-GPL version, or else drop the package that depends on it and the package that depends on that one and replace the whole component - this could hit multiple packages as well and all because of a single GPL license change.

I do dislike the connotations of infect but I think it's a fair term for a quality where one of a thing can cause hundreds of other things to suddenly become unusable, the GPL license can cause an exponential growth of packages unsuitable for general commercial use.

Not the parent, but here's an example of the converse: VS Code does not play any videos because it does not bundle ffmpeg because doing so would "infect"/burden it with heaps of GPL requirements.

https://github.com/microsoft/vscode/issues/100599, https://ffmpeg.org/legal.html

The G in GNU shall stand for "Galactic"!

Please continue to use the word "infect" when talking about GPL. It exhausts the energy of those who police usage of words by perceived connotation, an activity that distracts from and interrupts fruitful discussion of more meaningful topics. Exhausting their energy is beneficial to us all.

It's also the 2nd planet that has more machines than people :)

The only planet completely populated by robots.

Venus is completely populated by dead robots

Is it though? Are the robots even recognizable as robots, or have they totally melted into slag?

Dead things remain dead things even when they are unrecognizable - if I state that oil is partially composed of dead dinosaurs most folks won't find that statement disagreeable.

I think it's pretty fair to describe the remains as dead robots similar to folks talking about their coke-bottle decks (recycled plastic) or even a sword forged from the remains of dead robots - I'd still consider that to be made of dead robots even though very little remains outside of the mineral composition.

> if I state that oil is partially composed of dead dinosaurs most folks won't find that statement disagreeable.

I mean, aside from most oil being composed of dead plants rather than dinosaurs (if I remember right). Unless you're counting the bits of dinosaur absorbed by those plants before they turned into oil, but I feel like at some point they stopped being dinosaur and started being plant.

We are star stuff, I guess fits here.

Venus atmosphere is hot and corrosive

They're mostly steel, which does not melt at those temperatures (400-500°C; and the acidic parts of the atmosphere are at much higher altitude). So yeah. Recognizable.

Talking like a true survivor of Venus atmosphere adventure...

PoC or GTFO.

Is a robot that becomes a melted slag of metal not a dead robot?

It certainly isn't a live robot!

IIRC, surface temps on Venus are hot enough to melt lead, but not hot enough to melt whatever sort of alloys were likely used.

It does have ongoing volcanism, though.


The other machine adventures are not even on planets :)

That we know of.

This makes me imagine some secret alien civilization living in the depths of Venus, using its atmosphere as cover

More electric vehicles than ICEs, more EVs than bicycles too!

Wasn't it the first?

I have a few dozen machines just in my apartment.

Pretty sure we're outnumbered.

Data from 2015: https://www.washingtonpost.com/graphics/business/world-ip-ad...

IP Addresses per person.

I think that by now (2015 → 2021), with smartphone data packages getting cheaper, we each have a couple of IP addresses. Of course, for anyone in an office or behind a home router it may look like there is a single IP address, while that person may have a smartphone, a laptop, 1-2 tablets, and a smart TV.

We are definitely outnumbered.


I doubt the machines still live in any form...

I just hope it's the first planet with intelligent life.

I wonder why they used Linux? I'm not that familiar with engineering at NASA or JPL, but I thought computers used in exploration were running real-time operating systems. Is Linux capable of this?

When I worked on military autonomous vehicles (which I'd expect to be similar to these) we always had at least two systems on board: 1) a real-time flight controller, 2) one or more Linux computers, networked via an on-board LAN, which performed all the other tasks.

I worked with some people from that particular JPL software team to set up a system similar to the Mars helicopter's for an Earth UAV.

Can confirm that multiple real time systems are used. They are controlled by a non real time Linux system.

VxWorks has been battletested in NASA space probes for over 30 years.


I recall an early bug in the 2004 MER duo. They were the first to use flash memory and its new drivers. The file freelist got corrupted and the OS kept rebooting. Fortunately a fix was uploaded from Earth.


Fun read, tyvm! I loved this anecdote from the second link:

> On sol 20, the command team sent it the command SHUTDWN_DMT_TIL ("Shutdown Dammit Until") to try to cause it to suspend itself until a given time.


> It seemingly ignored the command.

The fix was basically: there were too many files in flash, causing it to reboot endlessly because the boot process could not handle that many files. The solution was to delete some files.

Mo' files, Mo' problems.

the pathfinder also had a priority inversion bug resulting in deadline trips. same story, they patched it...

Real-time Linux is a thing, although it was maintained separately, outside mainline Linux, for a long time. About 4-ish years ago, the project got some decent funding and is now part of Linux Foundation:


I worked at NASA Armstrong for a summer. They have an entire team there that, at least a few years ago, worked full time on real-time Linux. IDK if that was used in this case or not.

The helicopter uses a Qualcomm Snapdragon 801, which doesn't appear to be radiation hardened. I'm guessing their usual BAE RAD750 PowerPC rad hardened CPU was too heavy to put on the helicopter.

another thread noted that the hardened CPU was not fast enough to do the required sensor fusion and responses. In the case of a severe fault, the 801 can be restarted fast enough (while in mid flight!) to not crash the vehicle.

I think the primary reason is that Ingenuity is not considered essential part of the main mission, so they could use non-qualified COTS parts like Snapdragon and Linux.

The Helicopter is also not operated in space, so radiation might be less of an issue. It's intended only as a flight demo anyway.

The Martian atmosphere and magnetosphere are both substantially worse at blocking radiation than those of Earth, so I'd imagine radiation would, at best, only be marginally less of an issue than, say, in interplanetary space.

If memory serves, the radiation environment of the Martian surface is not too much worse than LEO. If so, radiation would probably not be that much an issue.

More likely just too power hungry

apparently list price for the RAD750 is $200,000! could you not externally harden the snapdragon for less?

That's not how hardening chips works.

You need a specialized fab manufacturing, etching and packaging process for this and it's not like regular fabs are cheap to begin with. Plus you're working from the start with much larger nodes with dedicated cell libraries so everything has to be designed from scratch to fit that node which means you can't reuse consumer off the shelf designs very easily.

Then there's the lack of economies of scale in building such custom parts in small numbers. I imagine if Apple would only order 100 5nm chips per year from TSMC, the unit price would be equally eye watering.

I've only read this doc quickly [1]. I would have guessed a lead case of a particular thickness would do the trick. But I don't know the thickness needed, maybe too much?

[1] https://nepp.nasa.gov/files/25295/MRS04_LaBel.pdf

In theory you could just cover everything in lead and call it a day - IF lead weren't also one of the heaviest metals in the world, which kinda goes against the mantra of space travel.

Actually it doesn’t work that way (solely based on mass). Radiation doesn’t just get stopped by lead, it generates secondary particles. Lead happens to be effective against x-rays, which is why we tend to think of “lead shielding” but it’s not effective against all types of radiation. So it would help but you couldn’t just call it a day, and other materials are more mass efficient.

yeah, but it'd only be 3/8 as heavy on Mars!

seriously, that's something I haven't thought much about. When designing rovers, do they calculate solely on the weight it will have at the destination, or limit it to the weight limits of escaping Earth's gravity well?

Rover designer here. The weight at the destination is important to characterize performance but it’s not a big driver of design. For example, once you consider losses due to friction inside gearboxes, there’s no big difference to size a motor. You want to be able to test on Earth easily anyway, so you wouldn’t design without a margin to allow that. I suppose if you designed a rover for Uranus you might take a different approach.

Launch mass is important as a constraint, but the launch environment is the main design driver for mass. The rover must be much stiffer than the rocket so as not to “couple” its response. The first vibration mode (think tuning fork) of a rocket is about 20 Hz, so the spacecraft inside needs a first vibration mode higher than 40 Hz. Something inside the spacecraft similarly needs a first vibration mode higher than 60 Hz or 80 Hz, although you can make exceptions based on analysis.

But that’s not all, the sustained load on components during launch can easily be 10 to 30 times gravity (30G) and the instantaneous load can be 100G. You can’t just add material for these kinds of loads, it would be an endless feedback loop because adding mass decreases stiffness. Look closely and you’ll see that every deployable or movable part of the rover is locked down by a mechanism until it has landed.
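
That mass-vs-stiffness feedback loop follows directly from the spring-mass idealization f = sqrt(k/m) / (2π): adding mass lowers the first mode, so you must add stiffness (usually more structure, hence more mass) to get back above the limit. A quick sketch of the scaling (single-degree-of-freedom model only; numbers illustrative, not from any real rover):

```python
import math

def first_mode_hz(stiffness_n_per_m, mass_kg):
    """Natural frequency of the simplest spring-mass idealization,
    f = sqrt(k/m) / (2*pi). Real spacecraft use finite-element models,
    but the scaling is the same: more mass, lower first mode."""
    return math.sqrt(stiffness_n_per_m / mass_kg) / (2 * math.pi)
```

For example, a 100 kg assembly needs roughly k = (2π·40)² · 100 ≈ 6.3 MN/m of effective stiffness to clear a 40 Hz requirement, and doubling the mass at fixed stiffness drops the mode by a factor of √2.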

It's not just the mechanical weight at the destination and the work required to escape Earth's gravity (which is enormous). Mass of the rover also means more fuel to speed up and slow down if there are any delta-V changes en route, more heat energy that needs to be dissipated during entry, and higher forces on the parachute and lander as well. So really, that mass penalty gets paid over and over... and not often linearly.

Weight on board the rocket is the most important consideration, as payload weight is the limiting factor.

Lead coating a critical and embedded component like a CPU, or even a SOC, has gotta be barely significant for the weight of the payload.

I'm guessing the faster CPU is just not necessary for the core rover, so via KISS, use the proven chips.

Ha, if you think that's a lot, I've got some news for you. That might be the price for just the CPU, but the price for the avionics package (basically a RAD750 with the necessary boards to manage power and IO) is a whole lot more than that. (I'm probably not allowed to name the exact number, but you're going to need to add a zero for sure.)

No way I'd be surprised by it. When you only sell 4-5 a year, it's going to be pricey!

Is it known what flavor of Linux they're running? E.g. are they using real time modifications to the kernel or something?

The description "real-time" varies in meaning depending on the application and acceptable average and maximum latency.

The uninformed usually think "real-time" means "hard real-time", but that's seldom necessary, so one can save on additional expense and effort by using a regular Linux distro and removing most of the daemons and file location indexing.

I've done real-time development (for Space Shuttle, rocket and balloon projects), and largely all I care about is if a circular buffer can be emptied in time before it fills. That's one technique for avoiding latency variation issues.
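That check can be sketched roughly like this (the rates and buffer size here are invented for illustration):

```python
def buffer_keeps_up(fill_rate_bps: float, drain_rate_bps: float,
                    buffer_bits: int, max_stall_s: float) -> bool:
    """A circular buffer survives a worst-case consumer stall of
    max_stall_s seconds if the data that accumulates during the stall
    fits in the buffer, and the consumer drains at least as fast as
    the producer fills on average."""
    if drain_rate_bps < fill_rate_bps:
        return False  # buffer eventually overflows no matter its size
    return fill_rate_bps * max_stall_s <= buffer_bits

# Telemetry at 1 Mbit/s, consumer may stall up to 50 ms, 64 kbit buffer:
assert buffer_keeps_up(1e6, 2e6, 64_000, 0.05)       # 50k bits fits
assert not buffer_keeps_up(1e6, 2e6, 64_000, 0.10)   # 100k bits overflows
```

The nice thing about framing real-time this way is that you only need a bound on the worst-case stall, not deterministic latency on every single operation.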

The versions of linux that you would normally encounter aim for music real-time, which is about 10 ms latency. 30 ms is considered to be bad.

Most of the pro Yamaha synths use linux as the embedded OS, and some of the code is downloadable (they attempt to comply with the letter of the GPL.) So you can go down to Guitar Center and do a real-time test anytime yourself. :)

The iPhone is pretty good for music, because it was designed to have low latency when playing.



Looks like Wind River discontinued RTLinux, which was hard real-time (a real shame actually, as it removes one of the few hard real-time options):


As someone that worked at Microsoft on the XAudio 2 for the Xbox 360 & Windows Phone stacks... holy heck that android audio path is painful to read. >.<;;

We went with a 5.33ms quantum on the Xbox 360 (vs. Window's 10ms), and tried to stay out of the way as much as possible to ensure minimal latencies: https://docs.microsoft.com/en-us/windows/win32/xaudio2/xaudi...

The Akai MPC X and its siblings run Linux too. Based on buildroot according to someone that looked into it.


However I don’t remember the GPL being mentioned anywhere when I had my MPC X. I sold it recently because of not having any money. But I hope to own one again in the future. If I ever do have one again I will probably have a closer look at what it says about the GPL, and then try and get a copy of all of the open source portions of the firmware directly from Akai.

Just as an aside, you probably (and hopefully) don't need to have the unit to download the manual—which is where the licenses are likely mentioned. Though this one seems to direct the user to Akai's site.

E.g. https://www.mpc-tutor.com/akai-mpc-manuals/

> Looks like Wind River discontinued RTLinux

Xenomai is still under somewhat active development.

This sounds like an old slashdot headline

From the open-source-alien-robot-overlords department.

Following reports of a new alien war machine in the ancient Fal'leesh river delta, K'breel, speaker for the Council, stressed that again, there was no cause for alarm:

"This is the last, futile gesture of the disease-ridden apes that foul the sinister blue planet third from our star. We will persevere, no matter the risks, no matter the costs. Our gelsacs swell with pride at the thought of the Enemy's inevitable self-immolation augered by their fitful attempts to travel among the stars."

When Junior Reporter #AXI-1138 of the Celestial News Network attempted to ask the Speaker whether there was any truth to reports of the machine's successful landing, activation, and telemetry transmissions, K'breel called it fake news and ritualistically crushed the reporter's gelsacs with the lectern's Bhan'ammer.

Ironically Mr Mars himself, Elon Musk, tried to get PayPal to standardize on Windows NT.

Over 20 years ago. Now SpaceX and Tesla run on Linux.

Only because the Elders of Mars telepathically contacted Elon to tell him that colonists using accursed Windows would find little purchase on the windy plains of Vastitas Borealis, and their little colonies would end in the blue screen of death.

Alright, I'm going to be that guy. What's the other planet with more Linux than Windows?

Earth. With a broad definition of 'computer' you land on cellphones too; Android (or even Tizen) is used on something like 90% of all new phones sold today worldwide.

Edit: To be clear, I was noting the simple fact that many probably don't at first think of an Android cellphone as a computer.

It's not a particularly broad definition of "computer".

32-bit and 64-bit cellphones are made of synchronous VLSI ICs running at gigahertz clock speeds including a few gigabytes of byte-oriented DRAM and superscalar multi-core ARM CPUs with single-user GUIs displayed on an LCD running Linux and software written in C, Java, and JS, plus a GPU running OpenGL, storing their data on Flash, running on a few watts of power and globally networked over Wi-Fi and TCP/IP. They have peripherals connected over USB and the SD card bus, and also CSI.

This 64-bit laptop is made of synchronous VLSI ICs at gigahertz clock speeds including a few gigabytes of byte-oriented DRAM and a superscalar multi-core amd64 CPU with a single-user GUI displayed on an LCD running Linux and software written in C, Java, and JS, plus a GPU running OpenGL, storing its data on Flash and spinning rust, running on a few watts of power and globally networked over Wi-Fi and TCP/IP. It has peripherals connected over USB and the SD card bus, and also SATA.

These are pretty much exactly the same.

The definition of "computer" already has to be a lot broader than that to include both the 24-bit SDS 940 with 192 kibibytes of magnetic cores and 96 megabytes of spinning rust, running an instruction every 5 μs or so, with no GPU and analog video output hardware made out of vacuum tubes and TV cameras, on top of the Berkeley Timesharing System and serving six simultaneous users, on which Engelbart demonstrated The Mother of All Demos in 1968, and this laptop.

It is transparently absurd to suggest that "computer" should include both this laptop and the SDS 940 and its predecessors like the IBM 1401 (vacuum tubes, no transistors, decimal memory, punch card I/O, no operating system, no multitasking, variable-length instruction operands, millions of times slower and less memory), but not cellphones. Compared to the differences between the 1401 and my laptop, the differences between my laptop and the cellphone are totally insignificant.

It is true that the vulgar and ignorant often do not understand that their cellphones are computers. This allows them to be more easily taken advantage of by companies that want to reduce them to consumers instead of participants in creating culture. Instead of aping their errors, we should work to help them understand the true nature of things, because ignorance is not a sin—it's a punishment.

Because sunlight is the best disinfectant.

> reduce them to consumers instead of participants in creating culture

But you miss the fact that the new generation making movies and documentaries with these iPhones is creating culture! Leaving aside the distinction of computer/cellphone, the iPhone is just a very powerful tool to the new generation, and in some ways, they'd argue they can do more with it than with a mere 'computer'! And, in taking down the barriers of entry and making these computer cellphones easier to use to create new content, one could well argue culture has never before flourished as widely as it does today.

I'm very aware of that, and I think the availability of such powerful hand computers is a very important development, one that changes many things and holds enormous potential for improving the human condition. That's one reason I think it's important who's in charge of who gets to use these tools to speak, because that's going to privilege certain voices and suppress others. In addition to directly hurting the people suppressed, suppressing too many voices leads to collectively irrational decisions like the catastrophic mishandling of the covid pandemic in America and Europe.

I don't want hand computers to go away. I just want them to be loyal to their owners, not to their manufacturers.

Without going too deep into specs, most phones nowadays are more powerful than 90s workstations and can run software. If that’s not a pocket computer I don’t know what is.

> most phones nowadays are more powerful than 90s workstations and can run software

my iPhone runs Windows 3.0 (emulated in JavaScript even!) faster than I remember it being when I used it on a 386 in the early 1990s


Mobile phones are way more capable computers in every way but the keyboard than the Commodore 64 I got my start with.

Most phones built in the last decade can accept a Bluetooth or USB keyboard. If you were so inclined, I suspect the exact keyboard from your Commodore 64 could be connected to the phone of your choice with nothing more than off-the-shelf adapters.

I had to look this up: Android is based on the Linux kernel.

Suddenly Android isn't Linux anymore when it comes to discussing malware prevalence on Windows vs on Linux.

pretty sure it's earth by far

Not if you count regular house windows :)

Poor people in poor countries are way more likely to have an Android phone than a Windows PC, or a PC at all for that matter.

Windows, like in the rectangular shaped openings in your wall.

In a world without walls and fences, you have no need for Windows or Gates.

- A Long Time Ago

> And then you dream of a world with only Windows

- Dio (Master of the Moon)

Microsoft is the master of the wait-and-see approach. They're not great at being first, but they are great at competing.

Internet Explorer on mars gogogo

PowerShell for Solar system automation gogogo

The helicopter has a Qualcomm Snapdragon 801. That is probably more powerful than Perseverance's CPU, a 200 MHz BAE RAD750. I wonder if they could offload compute tasks to the helicopter.

It is more powerful. But it has to fly in real time on its own. Rover is on the ground and can do things more slowly without penalty.

They will not. The helicopter is a side project that gets abandoned after 30 days regardless of outcome; the rover drives away (presumably out of range).

> gets abandoned after 30 days regardless of outcomes

That's surprising, since it is a huge pain to get to Mars. I wonder if it could keep up with the rover if it did one or two flights a day?

They are planning 3-5 flights max in the 30 days.

It’s a technology demonstration. Essentially a throwaway experiment. All they care to prove is whether they can fly on mars. Any extra value they get out of it is a bonus

Also it is partially powered by KDE4 :) https://twitter.com/ivan_cukic/status/1362722727560425476

Most computers on the Earth run Android (Linux kernel != Linux distro) or iOS.

I'd just like to interject for a moment. What you're referring to as Linux, is in fact, GNU/Linux, or as I've recently taken to calling it, GNU plus Linux. Linux is not an operating system unto itself, but rather another free component of a fully functioning GNU system made useful by the GNU corelibs, shell utilities and vital system components comprising a full OS as defined by POSIX. Many computer users run a modified version of the GNU system every day, without realizing it. Through a peculiar turn of events, the version of GNU which is widely used today is often called "Linux", and many of its users are not aware that it is basically the GNU system, developed by the GNU Project. There really is a Linux, and these people are using it, but it is just a part of the system they use. Linux is the kernel: the program in the system that allocates the machine's resources to the other programs that you run. The kernel is an essential part of an operating system, but useless by itself; it can only function in the context of a complete operating system. Linux is normally used in combination with the GNU operating system: the whole system is basically GNU with Linux added, or GNU/Linux. All the so-called "Linux" distributions are really distributions of GNU/Linux.

Actually android is just Linux Linux, no GNU here.

So, what would you say are the most impactful/important parts of "the GNU system" running in modern Android phones?


Linux kernel = Linux.

Linux distro = GNU/Linux

Alpine, for instance is a Linux distro, but is not GNU/Linux.

This is more humorous than any sort of meaningful metric. A bit like bringing a Ford F-150 into the Alaskan wilds and saying "Ford has 100% market penetration in all directions for 1,000 miles!"

:) Also, Mars has gone completely solar and nuclear. No petrochemicals at all!

So no excuse for not being the Year of Desktop Linux on Mars.

It sounds like there are Windows computers on Mars?

Not necessarily. Any positive integer is greater than zero.

No one wants to be in situation like this (: https://www.youtube.com/watch?v=eP31lluUDWU&t=30

Where did you get that impression?

There were presumably computers there before on the other rovers. They either had Windows or a third OS or no OS according to the title.

Given the relatively slow CPUs and limited memory on those older rovers I would expect an embedded OS before Windows. Think QNX or VxWorks, not Windows NT.

VxWorks, specifically.

Some cyclic executives and such, too.

Perhaps when Russian or Chinese probes successfully land. They are both notorious for "borrowing" US software.

Was that not already the case with prior missions to mars? Surely no one has ever sent a rover to mars with Windows running on it.

They sent one but it crashed because mission control were still only halfway through reading the EULA when it reached Mars after 8 months of flight.

It's almost as if, without a human using it, the GUI is no longer a primary design goal.

This statement implies that you can't run Windows without a GUI, which is false. Windows can run fine with 256 MB of RAM and no display support.

The point is that Windows' biggest advantage is its familiar GUI. The trouble with Linux on the desktop (and in most small business environments, where the sysadmins only know how to drive Windows Server with a full GUI installed) is that the Linux GUI can kinda suck, and way more people are familiar with the Windows GUI.

Take away the GUI, and Linux becomes a pretty easy choice.

The GUI bit is interesting because my experience is that non technical people can't even navigate windows without extra instruction.

My teenage job was to train hotel / resort call center agents on using oracle tools and even a CLI tool (if you've ever made a Marriott booking over the phone) then the person on the other end has a terminal open and is entirely using keyboard navigation.

Although different workplaces probably give their staff more autonomy which requires their staff to have prerequisite knowledge.

Windows' biggest advantage is its UX, which the GUI is a part of.

"Take away the GUI, and Linux becomes a pretty easy choice." An external machine can become the GUI. Just because your machine can't render doesn't mean it can't have an external GUI.

VSCode SSH is popular because it's able to bring a decent GUI to any linux server.

That’s the party line, however Windows with no GUI is extremely awkward to use, and most people just end up using RSAT to interact with it, which is just running the GUI remotely.

All my Windows boxes have a GUI, but for admin tasks I avoid using the GUI as much as possible.

I do a lot of stuff logged in over SSH from my Mac laptop. I run PowerShell scripts. I also have a custom "remote-admin" Windows service (which I wrote myself) which exports functions over HTTP. (I do that to enable/disable my son's laptop account depending on whether he is allowed to use it.) I edit files over SSH using vim netrw. I will still occasionally access the GUI via RDP, but I find myself doing that less and less often.

The purpose of not having a GUI is to be able to run on lighter hardware. It doesn't mean you have to work without one.

The point of an OS is to run apps. Given a headless machine, which Windows apps would you be interested in running, and why would you choose Windows over Linux? About the only reasons in the past have been IIS and MS SQL Server. With .NET Core and Linux SQL Server, what reasons are left? An affinity for command.com?

There's a lot of enterprise server-side software that, for better or worse (I'd argue worse), was only developed for Windows Server and will probably always only be developed for Windows Server. So at the very least you'll need some Windows VMs to run these (or maybe Wine, but it's highly unlikely to receive any sort of enterprise support that way, which is kinda the point of using enterprise software).

The good news is that a lot of it is written in CLR-based languages (particularly VB.NET and/or C#), so porting to cross-platform versions of .NET (e.g. .NET Core, .NET 5+, Mono, etc.) is at least theoretically feasible if developers can be assed to do the porting. Barring a solid business case for it, however, it's unlikely.

Point one is that "apps" are a new fangled term for the type of program used by shims over Operating Systems.

Point two is that an Operating System is supposed to operate a system, with or without users. Embedded systems frequently operate without any user input.

Win10 has an IoT version (for some reason), so I assume _some_ people find a use case for it.

In summation I cordially disagree with your assertion, with provided reasons.

> Win10 has an IoT version (for some reason), so I assume _some_ people find a use case for it.

A lot of industrial equipment has embedded control computers running Windows CE. Prior to Windows CE, DOS and Windows 3.x were also popular choices. Go back 20-30 years ago, Linux was much newer and so you can understand why a lot of vendors felt more comfortable with Microsoft's solution. Now Linux is more mature and a more viable option, but many vendors are used to Microsoft-based development and are happy to stick with it.

Microsoft is phasing out Windows CE, but Windows IOT is the simplest migration path for those vendors. Porting software from CE to mainstream Windows is generally straightforward, since the CE APIs are largely a subset of the mainstream Windows APIs; a lot simpler than porting to a completely different platform like Linux.

(Windows 10 IOT comes in two versions – Core, which is a stripped down Windows 10 with various components removed; Enterprise, which is basically the same as Windows 10 Enterprise LTSC, but with a different licensing model.)


Not having to skim the web to find out what a certain column in the text output of a simple command means.

Why would you skim the web, when there are manpages? OFC their quality and usefulness varies, depending on UNIXoid platform.

From my point of view this skimming the web for documentation started with/came from the countless Microsoft users. One still don't need to skim the web with good manpages.

Man pages are not as detailed as ms docs.

I don't remember which command it was that displayed MAC addresses and the like, but it lacked detail, and one column had no info at all.

Meanwhile the whole structure was documented on windows.

Although without display support the name "Windows" feels rather odd.

They run Wind River distributions https://en.wikipedia.org/wiki/Wind_River_Systems

Probably because it’s cheaper and less prone to malicious interference.

Are they still using VxWorks for the lander and rover?

VxWorks still beats it. Linux is not that dominant in space. There's even more dSpace/RTLinux than Linux. And Windows is only for toys.

All responses here are just so uplifting. I click on the news about space on HN just to get a feel-good boost from people’s comments

How do they issue updates to the system on mars?

Very slowly, given the bandwidth is about 3Kbps (https://mars.nasa.gov/mars2020/spacecraft/rover/communicatio...)
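Back of the envelope, using that 3 Kbps figure (the patch size here is a pure guess):

```python
def transfer_hours(size_bytes: float, rate_bps: float) -> float:
    """Idealized transfer time: bytes -> bits, divided by link rate,
    ignoring protocol overhead, retransmits, and visibility windows."""
    return size_bytes * 8 / rate_bps / 3600

# A hypothetical 5 MB software patch over the 3 kbit/s direct link:
print(round(transfer_hours(5e6, 3000), 1), "hours")  # ~3.7 hours
```

And that's before accounting for the limited windows when the Deep Space Network antennas and Mars are actually in view of each other.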

I torrented GTA Vice City at a similar speed when I was young. I'm sure NASA has that level of patience.

Is Limewire running on the rover? I hope they don't download an update and it turns out it's a dragon ball amv with a linkin park soundtrack.

They're running Kazaa so there's a risk an update involves horses (yes, LimeWire and Kazaa were both Gnutella consumers)

Where'd you get that?

> The mass- and power-constrained rover can achieve high data rates of up to 2 megabits per second on the relatively short-distance relay link to the orbiters overhead. The orbiters then use their much larger antennas and transmitters to relay that data on the long-distance link back to Earth.

> Transmission Rates Up to 2 megabits per second on the rover-to-orbiter relay link.

And using DSN Now, we can also see that the speed from Earth to those orbiters is also 2Mbps.

The Rover-to-Orbiter is 2Mbps, whereas the X-Band High-Gain Antenna link is 3Kbps:

> 160/500 bits per second or faster to/from the Deep Space Network's 112-foot-diameter (34-meter-diameter) antennas or at 800/3000 bits per second or faster to/from the Deep Space Network's 230-foot-diameter (70 meter-diameter)

I'm a bit surprised the Rover talks directly to Earth for the high speed data transfer. I would have expected a large dish in Martian orbit being used to relay the signal down to the rover's relatively tiny high gain antenna. Even when you account for the fact that the satellite will only be overhead part of the time the link budget calculation would be enormously different.

They do, from the link in the comment you replied to:

> Most often, Mars 2020 uses its ultra-high frequency (UHF) antenna (about 400 megahertz) to communicate with Earth through NASA's orbiters around Mars. Because the rover and orbiter antennas are within close range of each other, they act a little like walky-talkies compared to the long-range telecommunications with Earth provided by the low-gain and high-gain antennas.

Wow! it's a flashback to the dial up days. Still impressive. Look forward to when we are able to advance this technology.

Latency must be crazy too.

1374311 millisecond ping time right now

so roughly Wellington to Reykjavik

sudo apt upgrade --high-gain

Maybe they can first upload something to an orbiter, and then use it as a secondary source of updates for the computers on the planet?

From what I understand they use the Mars Reconnaissance Orbiter to do exactly that when the rover loses direct line of sight to Earth.

Interesting. It makes sense to do something like that.

Orbital Content Delivery Network

With data uplink? Same way as your mobile phone gets system updates (OTA).

Wonder what the speeds and delay/latency are like? Did they get near the theoretical light-speed limit between Earth and Mars, or was it a different transport layer?

Would love to one day communicate with a person on another planet. Maybe this is something 3 or 4 generations from now will be able to do.

The latency is anywhere from ~5 to ~20 minutes depending on the distance between Earth and Mars at the time. As for bandwidth, there's a couple different answers. Perseverance has two X-band transceivers, one with a high-gain antenna and one with a lower gain omni-directional antenna. It's also got a UHF transceiver it can use to relay communications through the MRO and MAVEN orbiters.

The direct X-band transceivers are pretty low bandwidth and are mainly used for rover telemetry. They're low bandwidth because the antennas are relatively small and the radios aren't super high powered. It takes the 34 and 70 meter dishes of the Deep Space Network just to receive and send signals to them. The high gain X-band radio (~8GHz) can downlink to Earth at between 160 and 800 bps (yes, bits per second) and uplink between 500 and 3000 bps. The low gain radio is mostly receive-only and can uplink between 10 and 30 bps. That's enough for densely packed telemetry data and administration commands.

The UHF (~400MHz) transceiver talks to either the MRO or MAVEN orbiters. Because that link is pretty short range (200-300km), it's much higher bandwidth. The rover can uplink to the orbiters at about 2Mbps. MRO (I'm not sure about MAVEN) is able to downlink to Earth between 500Kbps and 6Mbps depending on the distance between Earth and Mars. Typically the rover will send its mission data (images, sensor data, etc.) to an orbiter while it's overhead, which will buffer it and then relay it to Earth when its view of Earth is the clearest. MRO and MAVEN complete multiple orbits per sol (Martian day), so there are several opportunities for the rover to upload its mission data and get it back to Earth.

All the radio signals travel at the speed of light but the distance is what affects the latency. Mars and Earth are many tens of millions of kilometers apart so it just takes a while, even at the speed of light, to cross that distance. Communicating with another person on Mars would be more like sending each other voicemail messages than anywhere close to a real-time conversation.
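The latency numbers above are just distance over the speed of light (distances here are the approximate closest and farthest Earth-Mars separations):

```python
C_KM_S = 299_792.458  # speed of light in km/s

def one_way_delay_min(distance_km: float) -> float:
    """One-way signal delay in minutes for a given Earth-Mars distance."""
    return distance_km / C_KM_S / 60

closest = one_way_delay_min(54.6e6)    # ~3 minutes at closest approach
farthest = one_way_delay_min(401e6)    # ~22 minutes near conjunction
```

Double those for a round trip, which is why driving a rover "live" with a joystick is off the table.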

The bandwidth is pretty good, considering those super high res photos they send back. There's space in between, and Mars has a very thin atmosphere, so you can communicate with radio quite well I'd bet.

Latency is indeed a major issue; it's about 11.5 minutes per direction right now. The newest rover actually has some AI for that reason, letting it drive autonomously instead of always waiting for the next commands from Earth.

Latency might be 11.5 minutes now (I didn't check, but I'll believe it), but it ranges between about 2 minutes and about 20 minutes depending on how the earth/mars orbits line up.

It's radio waves. I'm not sure what other transport layers you are expecting to use between Earth and Mars.

The latency is measured in minutes. Obviously something like TCP won't work. Typically they would use something like DTN.


Edit: Corrected hours to minutes. Brainfart on my part.

First we take Mars, then we take Desktop.

"Lizard Gates is sad" - never a true word said in jest.

Yep, somethings you just can't have Billy-boy.

But I'm curious, how many Intel, AMD, Nvidia, or Apple chips?

Can the rover mine some bitcoin and send it back to earth? Would mars bitcoin be worth more than earth bitcoin? Since it's from mars will it not have more sentimental value?

The rover actually can't mine bitcoin because by the time it manages to create a block and sends it back to Earth, the block chain has already moved on and the block is obsolete. I guess this means that Bitcoin is never going to be an interplanetary currency. Paper money is perhaps not so bad after all.

Ok maybe not BTC but can it mine something with a lower level of difficulty?

What it needs is a cryptocurrency with LONGER time between blocks. Say an hour instead of 10 minutes. Dogecoin uses 1 minute block times, if I recall correctly. The light time to Mars round trip is something like 10 to 45 minutes depending on what part of the 26 month Earth-Mars synodic cycle you’re at.
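Rough arithmetic on why block time matters (the round-trip and block-interval figures are approximate):

```python
def stale_blocks(rtt_min: float, block_interval_min: float) -> float:
    """Rough proxy: how many new blocks the Earth-side network mines
    while a Mars-mined block is still in flight. Anything >= 1 means
    the Mars block is almost certainly orphaned before it arrives."""
    return rtt_min / block_interval_min

# Round-trip light time ranges roughly 10-45 minutes over the synodic cycle:
assert stale_blocks(10, 10) >= 1    # Bitcoin: stale even at closest approach
assert stale_blocks(10, 1) >= 10    # a 1-minute chain like Dogecoin: hopeless
assert stale_blocks(45, 60) < 1     # a 1-hour block time might just survive
```

Even the 1-hour case only means the Mars block isn't *automatically* stale; the Martian miner would still be competing with vastly more Earth-side hash power.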

So if the helicopter crashes or otherwise dies after its 30 days lifespan are we back to a single planet that has more computers running Linux than Windows?

Linux to the Mars before Doge to the moon!

I can't help but imagine the rover crash landing on Mars due to a forced windows system update that wasn't able to be delayed.

What a pointless observation

at first it made me laugh, then it made me react like you, then I thought - actually, why is that? And it's really an incredible testament to the success of Linux. Started by a student, built originally by volunteers contributing their time. I feel like (for all its flaws) Linux is one of mankind's great achievements and its deployment on another planet underlines its success.

That we know of



the 90's is asking for its platform wars back

I bet the Linux desktop GUI still sucks.

We need a new model.

Windows should be banned on Mars. That would be a good fresh start.

That's because it was a helicopter and NASA didn't want it to crash.

Quite a stale joke. I can't remember the last time Windows crashed. Windows is probably more stable than Linux at this point, at least from a kernel panic point of view. They do a load of static analysis of drivers and can even reload GPU drivers if they crash without taking down the whole system. That's decades ahead of Linux.

I agree, I can't remember the last time windows crashed. My home PC is a 2014 low-budget build and I don't recall any blue screens of death or freezes.

I often wonder if people run Windows on a potato to account for all the crashes people seem to have.

Anything NT based was a few orders of magnitude better than the old Win 95 systems that gave windows a reputation for crashing. That is what happens when you put some effort into good design. Things have gotten better because Microsoft has learned. Then again, everyone else has gotten better now.

I can't remember either because I'm trying very hard not to have to use windows.

Sadly, people are still buying crappy motherboards and laptops that have crappy drivers and blaming it on Windows. MSI ethernet drivers sometimes trigger a memory leak that eats up all the RAM in a few seconds. Dell Thunderbolt docks are buggy and trigger a BSOD.

> .. I can't remember the last time Windows crashed ..

Short-term memory loss ;]

It’s funny because of the dual meaning of “crash.” It’s possible that OP was not making a literal statement about the reliability of Windows.

If you're some kid running a cracked, stolen version of Windows, on overclocked AMD hardware, sure it will crash a lot. Sadly, that's how many people here on HN experience it.

For grown-ups who stay on the happy path with well-supported hardware and software, it runs very well, and is my primary development environment. When I need Linux, I run it in a VM on Windows 10.
