And read about how it will use Terrain Relative Navigation to find a safe landing spot: https://www.jpl.nasa.gov/news/a-neil-armstrong-for-mars-land...
Perseverance is phenomenally complex; its Sample Caching System alone contains 3,000+ parts and two robotic arms. So excited for all the sciencing this nuclear-powered, sample-drilling, laser-zapping behemoth can do when it joins its friends on the only planet (known) to be inhabited solely by robots.
Edit: Percy is about to release its two 77 kg Cruise Mass Balance Devices (is this what NASA calls 'weights'?) to set up the right lift-to-drag ratio for entry. Mars InSight will be listening for the 14,000 km/hr impacts of these weights, providing useful calibration data. We wrote about this in this week's issue of our space-related newsletter, Orbital Index - https://orbitalindex.com/archive/2021-02-17-Issue-104/
I'm sort of surprised we don't yet have ML powered "de-accent-ization". His french accent isn't hard to understand at normal speed, but when I set it to 1.5x or 2x speed it becomes hard to decipher in a way native speakers usually are not. If there was just a button (for him or me) to hit to tweak the sounds a bit to reduce the accent, I bet this problem would go away.
Consider responding to the substance of the comment instead.
Notice that, for the case of English, most speakers are not native, by a huge margin! Native English speakers are a biased minority, and with a lot of variation within. Not sure that an "average native" accent is a useful concept at all. I, for one, tend to find most non-native English speakers vastly easier to understand than many native speakers.
Maybe he wants to change the accent to a Texas accent, or the Queen's english, who cares, it's the ML part that's interesting.
I'm well aware, and this does not rebut any of my points.
> I, for one, tend to find most non-native English speakers vastly easier to understand than many native speakers.
That "many" native speakers in a language of hundreds of millions of speakers are hard to understand does not challenge the claim that a non-native accent brought closer to any native accent, much less the mean native accent, will for the large majority of listeners be easier rather than harder to understand.
The way someone speaks is very unique ... and it is actually very, very important how you speak to bring your point across. Or ... to convince people.
A robot voice might present the best arguments, but it will very likely lose to a good speaker who can (literally) tune in to his audience.
Speech is a complex pattern of sound waves, containing much more information than binary-encoded words.
So if there were an ML tool to make people with strong accents more understandable, why not. But you can also mumble without any accent.
And I can enjoy and understand certain people with strong accents much better than natives, because they are just good speakers.
And having subtitles is one thing, but changing their voice would require consent, I believe. (Unless you run the tool for yourself; but I believe the parent's point was that he speaks and then a tool automatically enhances his voice. I would not like that either.)
If the speech recognition / subtitling algorithm can't understand the nuances of the language, that's going to be a problem anyway... accented pronunciation is so multidimensional, you're pretty much going to have to transcribe syllables/phonemes first...
Some of the other videos on this channel are just as in-depth: the ones about the plumes/exhaust of rocket engines as well as star occlusions are incredibly detailed.
They put those in to make the probe seem higher-quality. They got the idea from Beats headphones.
Well no, the Cruise Mass Balance Devices are intended to Balance the Mass of the spaceship during Cruise conditions. That these Devices are single-part and constructed out of a single chunk of metal each should not be construed as merely being 'weights'. :)
Atmospheric drag forces the center of drag and the center of gravity to line up on the same axis, which makes the craft fly slightly sideways if the spacecraft isn't perfectly balanced. Done carefully, this leads to the direction of flight being slightly sideways, which is awkward but basically the same as having lift in that direction. Add roll control thrusters into the mix, and you get a really crude glider, with fixed pitch force, zero yaw control, and barely controllable roll. With JPL-class engineering, such a spacecraft is capable of actively correcting its landing location.
I guess what's surprising is that they needed that much weight (140+kg seems like a lot?) and couldn't redistribute existing componentry; guess the knapsack algorithm wasn't good enough, or that they just couldn't break up enough pieces?
And yes, Cruise Mass Balance Devices sounds like the type of name a tired engineer would come up with to convince upper management...lol
"At 09:00:46 UT Sept. 23, 1999, the orbiter began its Mars orbit insertion burn as planned. The spacecraft was scheduled to re-establish contact after passing behind Mars, but, unfortunately, no signals were received from the spacecraft.
An investigation indicated that the failure resulted from a navigational error due to commands from Earth being sent in English units (in this case, pound-seconds) without being converted into the metric standard (Newton-seconds).
The error caused the orbiter to miss its intended orbit (87 to 93 miles or 140 to 150 kilometers) and to fall into the Martian atmosphere at approximately 35 miles (57 kilometers) in altitude and to disintegrate due to atmospheric stresses."
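The failure mode here is passing bare numbers between systems that assume different units. A minimal sketch (hypothetical function and unit names, not NASA's actual software) of normalizing impulse readings at the boundary:

```python
# One pound-force-second expressed in newton-seconds (1 lbf = 4.4482216152605 N).
LBF_S_TO_N_S = 4.4482216152605

def to_newton_seconds(value, unit):
    """Normalize an impulse value to newton-seconds before any flight math."""
    if unit == "N*s":
        return value
    if unit == "lbf*s":
        return value * LBF_S_TO_N_S
    raise ValueError(f"unknown impulse unit: {unit}")

# Ground software emitting pound-seconds gets converted instead of
# being silently misread as newton-seconds (a factor-of-4.45 error).
print(to_newton_seconds(1.0, "lbf*s"))
```

Tagging every value with its unit at the interface, rather than relying on convention, is exactly the consistency the parent comments are arguing about.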
As noted in other comments, NASA (like the rest of the United States) does use the metric system.
But it doesn’t matter. Nothing about the metric system makes it uniquely suitable to landing on Mars. Or space travel in general. What matters is a consistent standard.
Internally NASA could use Armstrongs. Where 1 is the weight or height of Neil Armstrong at KSC on July 16, 1969 at 13:32:00 UTC. It doesn’t matter. As long as it is consistent.
But with the metric system you only really get cm (too coarse) and mm (too fine) but you don't get something like 9/16 so you can't "work in 16ths" and have everything be whole units again.
Adjusting HVAC in degrees-C is infuriating to my Fahrenheit sensibilities. 20C is cold, 22C is hot. 21C is probably ok but really I want something like 20.5C. The comfortable range for a room is 3-5 whole units of F, but requires a bunch of fractions in C that you may not even have available on your thermostat.
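The resolution point can be made concrete: each Fahrenheit degree is only 5/9 ≈ 0.56 of a Celsius degree, so the comfort band spans more whole F units than whole C units. A quick check:

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# The comfort band from the comment: five whole F steps map onto
# barely two whole C degrees, hence the desire for half-degree C.
for f in (68, 69, 70, 71, 72):
    print(f"{f} F = {f_to_c(f):.2f} C")
```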
Sure, converting between units is easy in the metric system. That doesn't make it the best thing to use all the time. Hell, the idea of thousandths of an inch is used commonly, so even the imperial system is base 1000 in some cases. But I've never seen anyone utilize the fractional scale with metric units, probably because the units are the wrong size for that to be useful.
People who use metric units are perfectly happy rounding to the nearest 0.5cm or 0.25cm if that's what's needed, exactly as people do with inches. Why on earth would you imagine people use mm if something doesn't call for them?
Presumably someone who uses Imperial would say 5/128th inches if they wanted to describe something that's equivalent to 1mm?
~39 mils ≈ 1 mm
Or 1/16in as others have said.
1000 mils / 25.4 mm = 39.3
This simple fact still seems wrong to me somehow.
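The arithmetic does check out: a mil is a thousandth of an inch and an inch is exactly 25.4 mm by definition, so there are just under 40 mils in a millimeter.

```python
MM_PER_INCH = 25.4              # exact, by international definition
mils_per_mm = 1000 / MM_PER_INCH
print(f"{mils_per_mm:.2f} mils per mm")  # 39.37
```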
Maybe metric users do use fractions and I just don’t hear about it. Is that table one and a quarter meters high?
If you're cutting it yourself, a precision of 1mm is finer than your saw blade or pencil line anyway, so it's plenty enough.
When I hear anything past about 1/8th of an inch my brain shuts down, and I give up.
In reality I use both systems all the time. It’s situational.
> When I hear anything past about 1/8th of an inch my brain shuts down, and I give up.
Realistically, same. 32nds don’t get used outside of some specialty wrenches. 16ths are a practical limit where other scales start to make more sense. Probably millimeters.
I use both systems.
I do prefer Fahrenheit for HVAC (and weather) because it’s higher resolution and has reasonable values at human scales. Thermostats that lack half-degrees-c are never quite right IMO.
So you are one of those, lol. There is nothing "less human" about 25 C than, say, 72 F. Nothing; it just happens to be the scale you are used to. Both are arbitrary.
> Fahrenheit for HVAC (and weather) because it’s higher resolution
99.99% of thermostats and thermometers in C have at least 1 decimal place. At usual "human temperatures" the difference in resolution between the scales is less than 2x, so even assuming only integer values, I am willing to bet that you cannot differentiate 68 F vs 69 F in a double-blind test in a statistically significant way.
> I find it easier to say that’s three eighths than 9mm
Just because you are used to it. Fractions are more complicated than integers, as every elementary school program knows.
So to summarize: the problem is not with the magnitude of the units, which is arbitrary (a degree F and inches are not more human, logical, or normal than a degree C or cm); the problem is the convoluted way the imperial system handles multiples and submultiples of the base unit.
I guess 20.5 is nice, 15.5 is cool, 10 is cold, 4.5 is really cold, 26.5 is hot, 32 is really hot and 37.7 is dangerously hot. It’s fine if you are used to it but I don’t really see a benefit.
I was in a hotel room in Japan that only had whole unit adjustments for the A/C. To get 20.5C I had to switch to Fahrenheit. I guess I was unlucky.
I find distances in metric and imperial perfectly usable and use both regularly.
As outlined in detail elsewhere in the thread there are advantages to working in fractions in some situations. Specifically when using a ruler or tape measure with different markings for 1/2, 1/4, 1/8 and 1/16. There’s no reason that has to be unique to inches, it just works out well in some cases.
Or, you know, 20, 15, 5, 30 and 40 instead of the arbitrary decimals you chose to use to prove your point
To the identical 5 degrees range of the Celsius scale ?
If I need to take measurements while boiling water or making ice then I would probably use C.
> How do you divide 7"3/8 by 5?
Same way I divide 4.7625 cm by 5. With a calculator.
25, yes. It's not too hard to do the math.
> Same way I divide 4.7625 cm by 5. With a calculator.
That's roughly 0.95 right by intuition, but (7"3/8) / 5 doesn't come easy to me.
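Python's `fractions` module makes the inch arithmetic explicit, and shows why (7 3/8)/5 feels hard: the exact answer isn't a "nice" tape-measure fraction and has to be snapped to the nearest 32nd.

```python
from fractions import Fraction

length = 7 + Fraction(3, 8)   # 7 3/8 in = 59/8
fifth = length / 5            # exact answer: 59/40 in
print(fifth, "=", float(fifth))

# Snap to the nearest 32nd for an actual tape measure:
nearest_32nd = Fraction(round(fifth * 32), 32)
print(nearest_32nd)           # i.e. 1 15/32 in
```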
When more precision is needed, so easy to go to the 32nd
That's not realistic, obviously, so we just pick one. The units in the system are arbitrary, really.
In reality regardless of the system you choose every calculation is going to end up with fractions of something. You aren't just going to do it in your head.
For example, you could define mars units where the gravitational acceleration on mars is 1. Now your velocity in freefall is just equal to the time you've been freefalling! You don't even have to do a calculation!
(note: Don't actually do this. Gravitational acceleration isn't a constant when you're doing orbital mechanics.)
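The toy "Mars units" above can be sketched, purely for illustration (assuming a constant Mars surface gravity of ~3.72 m/s^2, which, as the note says, doesn't hold for orbital mechanics):

```python
MARS_G = 3.72  # m/s^2, approximate Mars surface gravity (assumed constant)

def freefall_speed(t):
    """Freefall speed after t seconds, in 'Mars units' where g = 1."""
    return t  # v = g*t with g = 1: the number IS the elapsed time

def to_m_per_s(v_mars_units):
    """Convert a Mars-unit speed back to SI."""
    return v_mars_units * MARS_G

print(to_m_per_s(freefall_speed(10)))  # speed after 10 s of freefall, m/s
```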
My experience with U.S. students is that they are having a much harder time making sense of the imperial system (that they are used to) than doing problems in metric, even though they don’t use it in everyday life.
First off, you linked to a list of English measures which are not used in the US. Nobody uses fathoms or barleycorns.
Here is the list of actual US customary units: https://en.m.wikipedia.org/wiki/United_States_customary_unit...
Second, none of that is relevant to landing on Mars.
The only problem space where metric has an advantage is in converting between meters, kilometers and millimeters.
That’s great, and it’s easy to learn. But it doesn’t suddenly make all problems of distance easier to solve.
If I am traveling toward Mars at 47 meters per second it doesn’t help me to know that is also .047km per second. And converting to kilometers per hour involves using base 60 twice anyway because metric time is unwieldy.
In reality none of your measurements are going to be nice round numbers. Mentally converting from meters to km might be nice sometimes but it’s essentially a party trick.
It won’t help the lander make decisions. The hardware doesn’t inherently work in base 10.
Does NASA mix meters and kilometers? Isn’t that the same problem that destroyed the Mars Climate Orbiter?
The fact is the units are irrelevant beyond just being defined and used consistently.
Also, I can’t think of a situation where I need to convert miles to feet. My bike ride is six miles, I’m never going to express that in feet. If I need to describe the size of a thing in a room I will probably use feet, maybe inches if it is small. Probably not feet and inches. I wouldn’t use miles at all. Easy conversion between those units just isn’t a problem that comes up. It’s more important to me to have reasonably sized units and that the person I am communicating with understands them.
How many pounds does a cubic foot of water weigh?
How many BTUs do you need to heat a 10x10x3 ft water pool by 20 degrees F?
How much work in ft-lb is done by gravity when a 10 oz mass drops from 19 yards?
How many HP are needed to raise 2400 lbs 74 inches in 30 sec?
It is obvious you have 0 experience doing back-of-the-envelope calculations for scientific or engineering purposes. It is a no-contest between the metric and the imperial systems.
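Working the first of those questions both ways illustrates the point (constants I'm assuming: water density 1000 kg/m^3, 1 ft = 0.3048 m exactly, 1 kg ≈ 2.20462 lb):

```python
FT_TO_M = 0.3048           # exact, by definition
KG_TO_LB = 2.2046226218

volume_m3 = FT_TO_M ** 3   # one cubic foot in cubic meters
mass_kg = 1000 * volume_m3 # metric: density times volume, one multiplication
mass_lb = mass_kg * KG_TO_LB

print(f"{mass_kg:.2f} kg ≈ {mass_lb:.1f} lb")  # the ~62.4 lb/ft^3 figure
```

In metric the density of water is the round number 1000 kg/m^3; in imperial you have to memorize 62.4 lb/ft^3 (or do the conversion above).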
Yes it does. It means you can immediately sanity-check your numbers even if you don't have a good sense of what meters and kilometers are, because you have that base/kilo relationship.
> My bike ride is six miles, I’m never going to express that in feet.
You can eyeball how fast you're going in feet per second and have a rough idea of how long your ride is going to take. Or rather you could if you had any idea of how long your ride was in feet. There are lots of little everyday things that just become much easier.
I estimate my bike ride progress in landmarks and time. Not feet per second. Did I get to the boat ramp in 20minutes? Better speed up and get to the park by 30.
But you'll have small distances and large distances and pieces from external sources who use measurements on a scale that makes sense to them. You can make your external sources do conversions themselves, but that's just moving the problem around. There will usually end up being a point, or probably several points, where you have to relate a small distance to a large distance, and wherever that happens, a human sanity check is a help.
> I estimate my bike ride progress in landmarks and time. Not feet per second. Did I get to the boat ramp in 20minutes? Better speed up and get to the park by 30.
Precisely - you have no sense of the relation between your speed and how far you can go, because you're using a terrible measurement system, and you don't even notice how that's robbing you of the ability to develop useful intuitions.
Miles per hour is literally a measure of distance over time. If I wanted to use my GPS I could very easily determine how far I can go in a given amount of time. I can do this equally well in the metric or imperial systems, without converting to feet or meters.
The hardware doesn’t think in base 10, but having more than that in imperial makes it better?
Your document lists 12 mass units alone. I rest my case, what could possibly be more logical, convenient, and need less conversion.
If communication was your major goal, then the system that is used by 7.3 billion people on this planet would be your choice.
Great, we agree.
> Metric on the other hand sucks because none of my measurements will ever be nice round numbers.
Depends on the situation. Metric units can be useful.
> The hardware doesn’t think in base 10, but having more than that in imperial makes it better?
No, it means neither system has an advantage so just pick one. Or invent a new one that allows better hardware utilization.
> Your document lists 12 mass units alone. I rest my case, what could possibly be more logical, convenient, and need less conversion.
I don’t convert, I just pick the unit that fits the problem.
> If communication was your major goal, then the system that is used by 7.3 billion people on this planet would be your choice.
Yeah I use the metric system all the time. Just like NASA.
But one of the systems does have an advantage because it stays in base 10, whereas the other doesn't.
>> Your document lists 12 mass units alone. I rest my case, what could possibly be more logical, convenient, and need less conversion.
>I don’t convert, I just pick the unit that fits the problem
But you can't if you just use the `intuitive unit', and that's the whole problem. How would you measure the amount of liquid fuel in, say, the small tank for an attitude control thruster of some probe? How does that add to the overall mass of the whole probe? Or to the force you then need to accelerate it by a certain amount? And now compared to the whole launcher?
In which units do you measure everything going on in a small wind tunnel model, and how do you compare that with the real thing?
Under which conditions do you go from fluid ounces to ounces to cups to pints to quarts to gallons (also note that, again, you not only switch units but bases)?
>Yeah I use the metric system all the time. Just like NASA.
Good for you, it solves all the problems.
That's a benefit to humans, not to hardware, which was the context in which I was speaking.
> But you can't if you just use the `intuitive unit', and that's the whole problem. How would you measure the amount of liquid fuel in, say, the small tank for an attitude control thruster of some probe? How does that add to the overall mass of the whole probe? Or to the force you then need to accelerate it by a certain amount? And now compared to the whole launcher?
Honestly? I'd probably measure it in volts. That's what the hardware is doing after all. That's my point, it doesn't help the computer to convert to base 10 and do calculations that way. Fuel level is measured in volts using binary. For a human something like grams probably makes more sense so sure, display it in those units. But that's a conversion.
> In which units do you measure everything going on in a small wind tunnel model, and how do you compare that with the real thing?
Again, volts on strain sensors. Maybe analog or maybe binary, in newtons. Again, the hardware doesn't think in units humans prefer. There has to be a conversion that doesn't use simple in-your-head math.
> Under which conditions do you go from fluid ounces to ounces to cups to pints to quarts to gallons (also note that, again, you not only switch units but bases)?
Cups, pints, quarts and gallons are all based on the ounce and powers of two. A gallon is 128oz, a half gallon is 64oz, a quart is 1/2 of a half gallon (or a quarter gallon) or 32oz (also, approximately a liter). A pint is half a quart or 1/8th of a gallon or 16oz, a cup is half a pint or 1/16th of a gallon or 8 oz. These fractional scales are really handy for converting between units in some situations. The unit fits the task at hand or you can trivially double or halve the size of the unit if needed. It's the same fractional scale and math used with the inch.
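That binary ladder can be written down directly; a small sketch, keyed in fluid ounces as the comment describes:

```python
# US liquid measures as powers of two of the fluid ounce.
FL_OZ = {"cup": 8, "pint": 16, "quart": 32, "half gallon": 64, "gallon": 128}

def convert(amount, from_unit, to_unit):
    """Convert between the ladder's units by going through fluid ounces."""
    return amount * FL_OZ[from_unit] / FL_OZ[to_unit]

print(convert(1, "gallon", "pint"))  # pints in a gallon
print(convert(2, "cup", "pint"))     # two cups make a pint
```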
I have gone much farther here than I normally would. I don't think I can constructively explain this to you any further.
Using one universally accepted system is the core idea behind the metric system. Now it looks like a competition between two equal systems, but historically it was a competition between the ideas 'we should have one universal system' and 'every country/area can keep its local system'. All the other legacy local systems (outside U.S. customary) just disappeared.
With metric it's a matter of shifting the decimal.
How much is a sixteenth inch anyways.
A 16th is half an 8th. Twice as much as a 32nd. AKA 1/2^4, 1/2^3 and 1/2^5, respectively.
Do you think kids find it sexy to talk about meters, kilograms and degrees Celsius rather than feet, miles, pounds and Fahrenheit?
I can't fault them for choosing what is more understandable to the target audience.
During the stream, you can hear the various teams giving measurements in metric, whilst the media gave coverage in imperial.
It's a pretty interesting video from that perspective, as you can hear the two "realities" being translated for the intended audience.
So basically TERCOM from cruise missiles but used on spacecraft? All you need is a radar contour map of the area and it can automate its way to the endzone.
Is it the idea that life could originate elsewhere and that there might really be aliens?
Or is it the idea that Mars could support some sort of colony?
Or the hope of completely novel microbiology?
Any one of these things would be a massive boon to our understanding of life throughout the solar system and broader universe, right down even to here on Earth. All three of them would arguably mark a new era in Earth's history.
The good news is that a substantial chunk of the world's cargo transportation runs on diesel (or other combustion setups similarly not reliant on electronics), so in a pinch it could probably keep going. Same with agricultural machinery. Might need to replace or refurbish some ECUs, but I'm sure there are enough clever mechanics out there willing and able to bypass those in an emergency that we'd be back up and running pretty quick on that front.
ICE 1, electric 0 ;)
It's refrigeration that'd have me more concerned, since pretty much all modern refrigeration is electrically powered (last I checked). Diesel generators might come in clutch there, assuming the refrigeration units themselves don't rely on any fancy electronics.
Worst case it sets us back to 1870ish, maybe. Depends on how fast things go to crap vs how fast things can be rebuilt.
Likely case, you'd basically get a "purge": society as we know it can't keep on rolling through the kind of economic breakdown something like that would cause, so there'd be a lot of dying in the interim. But if you don't starve or get shot in the first 6 months you're probably good, with the very old, very young, and unproductive bearing the brunt of it (same as every other disaster). It would be like the Black Death, but global and all at once. The balance of power globally would definitely be altered in unforeseeable ways, but the overall net result is things would bounce back hard.
Yeah, but those individuals were presumably all in pretty close proximity to one another. If we were left with a few thousand individuals across the entire range of the human-inhabited Earth, we'd have one heck of a time continuing as a species.
In any case, the risk of an extinction event on Earth is exactly why I believe space colonization needs to be Priority Zero for humanity, from two different angles:
1. Living beyond Earth means that we as a species are that much more resilient against a literal-Earth-shattering catastrophe (and if we can get the bulk of Earth's current/future population off of Earth, then we might very well be able to avoid a couple different plausible extinction events).
2. If we can colonize entirely inhospitable worlds like Mars or the Moon (or my votes, Ceres, Venus, and Enceladus), then "colonizing" Earth is easy-peasy-lemon-squeezy even if it does become Venus 2: Greenhouse Boogaloo.
> If we find Mars life, it would not at all be surprising to learn it is related to Earth life.
If it's DNA/RNA based, we might actually be able to determine the relationship and whether that's true or not.
Or we're just ahead of the curve.
Maybe my question makes more sense in the case of "X is happening right now", because then I should either understand "we infer that X should have happened right about now" or "we have confirmed via signal that X has happened", and that's a big big difference.
I know in some cases they explicitly say the latter, so I guess my real real question is, do they just keep the communication delay implied in all countdowns & references in discussion, to avoid confusion?
(ETA: No need to let me know about simultaneity problems in relativity — earth and mars are, relative to c and to macroscopic time scales, essentially not moving relative to each other AFAIK, so that simultaneity is essentially well-defined. My question was about a much more boring classical-universe problem.)
Ah great, that's a great phrase to make everything clear and provide a kind of "frame of reference" to think & communicate in. Always need these abstractions.
Still a very nice yet nerve-wracking way to do it - you know the lander is on Mars now. But is it safely on the ground, or is there a third Schiaparelli crater now? You don't know! A huge relief in the end. :)
On the other hand, if you were on Earth and I was in between Earth and Mars, I would receive the data more quickly than you, and I could even watch it whiz by me on its way to you. The thing about relativity is that it’s... relative!
According to the energy-time uncertainty principle we don't even know when exactly the RF waves that transmitted information hit the receiver on Earth either.
See for example  and .
(Not trying to be judgy. People just seem to forget about the gender neutral use of "they".)
I found it actually fairly annoying at first, but quickly got over it because it is related to the plot. Same for Sanderson, I've read all his books and am not even sure now which one you're talking about!
Sometimes we need a nudge to notice the unconscious bias around us, and it's often uncomfortable.
The whole genderless thing was completely irrelevant to the plot—the diones could have been male or female and little else would have changed. It's not that it made me uncomfortable, but it was an unnecessary distraction.
Anyway, the book was a 2/5 because it threw almost everything from book 1 away, had way too many characters and places, ended on a serious cliffhanger, and the fictional science was extremely far-fetched. Meanwhile, book 1 was fantastic (a full 5/5 in my opinion): it was focused on a small cast of characters on a small set, had great character development, and resolved nicely.
How is this offensive to anyone??
But it seems like you're saying there's some kind of complex relation where sometimes people don't get to choose their own pronouns, but other people get to choose which one out of many to use based on convenience. Maybe it would help my understanding if you could provide a chart relating the pronouns someone chooses with the pronouns other people are then allowed to use?
I think this is everything wrong with the world currently. Provide you with a chart, so I can justify using gender neutral pronoun?
This is barely more than a year old, is it already outdated? I am an asshole for daring to use “they”?
This matches my own understanding; I have no idea why you are referring to this article as if it supports your bizarre crusade to misgender people.
The comment that you replied to:
> You can just say "they".
Was a reply to this:
> S/he already did. We just don't know yet.
Which was talking about the mars rover.
And even that is not correct. Events propagate at the speed of light - the light cone. We can predict we will receive information of something happening, but that's just a prediction about future events, regardless of the location.
Not in the true accurate-to-the-picosecond sense of the word, no, but the exact word simultaneity is used when discussing the number and density of satellites above a given latitude/longitude in the Starlink beta program. Since they're LEO and orbiting at only 550 km, the satellites above a given spot on the ground vary greatly in the not-yet-complete sparse network.
Usually related to discussions of whether a beta test customer terminal will briefly hiccup and lose connection to its default gateway, or if somebody is at a sufficiently high latitude that they can have full coverage for all 86400 seconds in a day.
https://satellitemap.space/ has a good animated visualization of this.
Apparently the latency time is currently 11 mins 22 seconds, which is somewhere near the average. It goes from under 4 mins to over 22 mins depending on distance.
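Those figures follow directly from the Earth-Mars distance extremes (assumed here: closest approach ~54.6 million km, farthest ~401 million km) divided by the speed of light:

```python
C_KM_S = 299_792.458  # speed of light, km/s

def light_time_min(distance_km):
    """One-way light travel time in minutes."""
    return distance_km / C_KM_S / 60

print(f"closest:  {light_time_min(54.6e6):.1f} min")   # just over 3 min
print(f"farthest: {light_time_min(401e6):.1f} min")    # just over 22 min
```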
With your personal light cone, it's fine to equate "now" with what you see in the moment. It just has to be clear what you mean for situations where communication might be ambiguous. If you have a person on mars, be sure to be precise what you mean when you tell them to do something in five minutes, when they receive the message they won't know if you mean five minutes after they receive the message or anywhere between 17 minutes before and 2 minutes after they receive the message.
When you get into relativistic speeds (and especially very short time intervals), nobody can even agree on when something "actually" happened, different observers have different opinions about what happens when even after you account for light travel time.
And there is no concept of now in a significantly distant location.
Related video: https://youtu.be/pTn6Ewhb27k
Put another way, simultaneity is perfectly well defined in a single inertial reference frame, and for purposes of my question, earth and mars can be considered to be relatively motionless.
Simultaneity is not “perfectly well defined in a single inertial reference frame”. That is just a convention.
If the RTT of earth to Mars is 20 minutes, then we can say that it takes us 20 minutes for our message to reach the rover, and the rover’s message arrives instantly, and that’s a consistent definition of simultaneity.
Your link points this out. You can play these games; I don't dispute it. But it's a separate matter entirely from anything to do with relativity, as your link points out, which is itself separate from the classical problem I originally posed. So we are now two steps removed from anything relevant to the Mars rover; I guess we get a sense of pride and accomplishment?
Looking forward to the next landing in May of the Chinese rover and all the science these robots will produce. Also, the test of Ingenuity, the helicopter, will be very interesting to watch; that could really pave the way for a different exploration style in the future.
And finally, maybe the next transfer window will already see some Starships, that would really change everything.
What would you ballast a starship with for the practice missions?
Useful materials which might survive a RUD and aim for someplace near a likely landing zone? If you crash the parts of a milling machine, a lathe, some tooling, some assorted metals stock, and a bunch of assorted wire, well sure you just cleared out a machine shop auction, but maybe there comes a day when an early Mars colony would be thrilled to go clean up your “landing” site.
Basically all of the history of science until 200 years ago was figuring out "mine and extract" and the course of civilization is very much linked with the price & quality of metal structures they could produce. But it's gotten so good we take it for granted.
However that is because of gigantic plants situated in specific areas where energy is cheap that do this thing at an amazing scale.
Aluminum is cheap as chips, except that it used to be more expensive than platinum (and at a much higher impurity ratio than the stuff we use for baking or for making cheap cases).
Heck, gold is "a thing" because we could purify and mold it without bringing it to a melting point and it was, for a very long time, the only metal available to us to do anything with, way before the bronze age.
And the problem with the refinement process is that you can't really "be smart" about it, reaching very high temperatures is one of those things you can't really scale down in an efficient way. You'd have to propel 100,000 tons of factory to mars in order to efficiently refine anything remotely close to the metals we had access to 100 years ago.
Which is not to touch on the mining bit, that is in itself very complicated (see how slowly and shallowly rovers are currently able to drill).
Are there workarounds for this? Maybe, I don't think anyone knows them though, they are not the kind of thing that's within easy reach. Maybe if we happen to stumble upon large reserves of bismuth or lead or gallium or mercury close to the surface of Mars, and build a whole branch of engineering around using those to build machinery... ? But my limited knowledge of geophysics and geology tells me that finding those in large amounts is very unlikely.
For reference: a home oven can reach, say, 450 degrees Celsius, and an industrial one up to 700. Neither is enough to refine any "useful" metal (e.g. iron), and building them requires materials that were produced at 1500+ degrees.
IANAChemist/IANAMaterialScientist/IANABlacksmith though, so take with a spoon of salt.
In the limit, if something takes 5,000 watts, you could run it for a few minutes a day with that same RTG, provided you had suitable energy storage.
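The duty-cycle arithmetic is easy to sketch. Assuming an RTG with roughly the MMRTG's ~110 W of continuous electrical output (an assumption for illustration, not a quoted spec) and lossless storage:

```python
# Back-of-the-envelope duty-cycle budget: a small RTG trickle-charges
# storage all day, then a 5 kW load runs in a short burst.
RTG_POWER_W = 110        # assumed continuous electrical output
LOAD_POWER_W = 5_000     # the hypothetical 5 kW device
SECONDS_PER_DAY = 86_400

energy_per_day_j = RTG_POWER_W * SECONDS_PER_DAY  # joules banked per day
runtime_s = energy_per_day_j / LOAD_POWER_W       # burst runtime per day
print(f"{runtime_s / 60:.0f} minutes/day")        # ~32 minutes/day
```

Storage losses and overheads would shave that down, but the order of magnitude (tens of minutes per day) holds.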
Perhaps they could gather grains of material, and just sort them, one at a time, only keeping the iron rich material, or use a permanent magnet to gather ferrous material. You could sinter the grains together using a microwave or laser pulse.
The results don't need great quality, just enough tensile and compressive strength to be mechanically stable during additive or subtractive manufacture.
Lots of minds have been thinking about refining metals for a very long time, but they haven't been thinking about doing it on Mars, with limited power, and very far outside the box of normal constraints, like cost.
This is one time capitalism doesn't apply at all... and most solutions assume capitalist incentives and costs, instead of going back to first principles thinking.
But thinking about it now, I can't envision that we here would be able to come up with something sublime/novel that a huge army of really-smart people haven't already after spending decades thinking about it. Then again, we have a lot of smart people concentrated in this forum, so who knows if a weird/silly conversation triggered by IT-minded people, acts as a catalyst for the engineer-lurkers that see it.
And then you land your digger, at a cost of at least a couple hundred million, and you have a dumb robot that digs a small hole?
We should do industrial build-up, but for me that starts with solving cheap transportation from Earth to Mars first. Then you can hope to bring things like nuclear reactors, which can actually produce the power you need for significant work.
Currently we are going for science reasons, and very few people are even working on sending humans or on industrialisation.
Everything happened correctly. :)
That is Rob Manning, an absolute legend! Here is an interview with him from a few years back: https://solarsystem.nasa.gov/people/2280/rob-manning/
He also wrote this great book: https://www.amazon.com/Mars-Rover-Curiosity-Curiositys-Engin...
Can't wait till they start posting raw images :)
It's very high res. You can see the holes/damage on the wheels -- Perseverance will have new wheels because of it. And it also won't have the 'morse code spelling' on the wheels. It's surprising that this kind of damage/wear wasn't predicted in tests.
The amount of dust that has settled on top in what appears to be predictable channels is also interesting.
Looks like they have more than 300 000 images on the raw site:
When you do computer vision, the first step you do is convert your color image into a black and white image, and run your CV algorithms on the black and white image. This is because when you're looking at objects and shapes and stuff, it's contrast that tells you where the boundaries between things are. This is true even in a human world of human objects, which tend to be many colored. It's even more true on Mars where basically everything is varying shades of orange. So having color doesn't help a whole lot, and you also have to do the additional step of converting the color image to black and white, which takes CPU power and adds latency. Remember, the purpose is hazard avoidance- latency is bad.
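The grayscale-first step described above is simple to sketch. A minimal version, assuming the common ITU-R BT.601 luma weights (one conventional choice, not necessarily what the rover uses):

```python
# Convert an RGB image (nested lists of (r, g, b) tuples, 0-255) to
# grayscale using BT.601 luma weights. A toy illustration of the
# "convert to black and white first" step, not the rover's pipeline.
def rgb_to_gray(image):
    return [
        [round(0.299 * r + 0.587 * g + 0.114 * b) for (r, g, b) in row]
        for row in image
    ]

# A 1x2 "image": pure green contributes far more luma than pure blue.
img = [[(0, 255, 0), (0, 0, 255)]]
print(rgb_to_gray(img))  # [[150, 29]]
```

Edge and contrast detectors then run on that single channel, which is cheaper than processing three.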
Additionally, color camera sensors aren't actually color sensors. They're black and white sensors. In front of every pixel on the black and white sensor is a filter that is either red, green, or blue. Pixels are grouped into sets of four, and there are two pixels with green filters, one pixel with a blue filter, and one pixel with a red filter. (Sometimes one of the green filters is omitted, giving red, green, blue, and b&w, or sometimes one of the green filters is replaced with a filter that passes IR, or something like that.) So if you have a 16MP camera, the camera has 8M green, 4M red, and 4M blue pixels. This means two things: first, if you just wanted a black and white image in the first place, a color sensor gives less detail than the equivalent black and white sensor; and second, you need to do additional processing to convert the raw output from the sensor into an image that's usable for anything. That additional processing adds latency.
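The two-green/one-red/one-blue layout described above is the standard Bayer pattern. A small sketch (assuming the common RGGB arrangement) showing how the pixel counts fall out:

```python
# Sketch of a Bayer (RGGB) color filter array: each sensor pixel records
# brightness through one color filter. Counting pixels per color shows
# the half-green / quarter-red / quarter-blue split described above.
def bayer_filter_color(row, col):
    """RGGB pattern: even rows alternate R,G; odd rows alternate G,B."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def count_colors(h, w):
    counts = {"R": 0, "G": 0, "B": 0}
    for r in range(h):
        for c in range(w):
            counts[bayer_filter_color(r, c)] += 1
    return counts

print(count_colors(4, 4))  # {'R': 4, 'G': 8, 'B': 4}
```

Scale that 4x4 tile up to 16M pixels and you get the 8M green / 4M red / 4M blue split.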
I have a feeling I'd be the angry guy in the meeting who wouldn't accept the consensus. "But what about latency! What about the descent and landing!" shakes fist
-Worked at JPL for a few years and have dozens of friends, a few in the vision system.
Long answer: Colour is a real rabbit-hole of a topic, but Captain Disillusion has a summary of it (https://youtu.be/FTKP0Y9MVus) and Technology Connections has a discussion (https://youtu.be/uYbdx4I7STg).
We can notice that when people say they perceive "yellow" that the spectral intensity graph has certain patterns. This is the physical phenomenon that produces the sensation of "yellow."
Humans are not good at judging reality introspectively. We experience everything heavily filtered through a variety of lenses. Our feeling that color is "concrete" is not predictive or explanatory... we cannot build mechanisms based on it. The idea that our perception of color is a result of interactions between certain wavelengths of light and certain photosensitive tissues in our eyes is both predictive and explanatory. We can design systems that have similar types of wavelength intensity sensitivity components and measure the physical response of those systems. That's how cameras work.
We can reverse the process and take those measured wavelength intensities and re-emit them from variable-wavelength light sources and produce images. That's how you're reading what I've typed right now - the images produced by the display you're looking at were generated in this fashion.
I'm not sure what you mean by the “wavelength theory” of color perception.
Of course we can. We can capture the signal sent through the optical nerve and then reproduce it as a stimulus which will make the brain “see” yellow color.
Besides, humans are capable of distinguishing literally millions of colors, of which just a tiny fraction can be attributed to measuring particular wavelengths (or, more accurately, particular energies of the incident photons). In that way the eye is different from the ear (which performs a kind of Fourier analysis of the sound wave).
I agree that there are sensory perceptions humans are capable of perceiving and labeling as colors that cannot be attributed to external physical phenomena, but those are largely artifacts of the way our brain processes signals. For example if you stare at a purple dot for some time, then look away, you'll perceive a yellow dot where there is no external set of photons corresponding to the wavelengths that normally trigger the sensation of yellow striking your retina.
This is just more explanation about how "yellowness" is a characteristic of our brains, not of the external world.
Or did you mean something other than what I'm referring to here? I think that for the vast bulk of humans, the vast bulk of the colors they perceive regularly are due to photons striking rods and cones in their eyes at various intensities, causing color sensations to occur in the brain.
Do you think something else is happening?
You seem to understand how the eye works, and some neuroscience, so I don't understand how you can have the questions that you raise about whether we can build cameras that sense "color" instead of "light"
The human eye has two basic cell types, rod cells and cone cells, and there are three subtypes of cones: short, medium, and long. The three subtypes of cone cells sense blue, green, and red light more or less directly. The spectral sensitivities of the medium and long cone cells, which detect green and red light, almost entirely overlap. It is more accurate to say that long cone cells detect yellow light than that they detect red light. There is a brain system which measures the difference in response between the long (red) and medium (green) cells and uses the difference to say "aha! this must be red!"
The ratios of short (blue), medium (green), and long (red, really yellow) cone cells are roughly 2%, 2/3, and 1/3. The cells in your eye which detect blue light are more or less a rounding error. The cells which detect green light are roughly twice as numerous as the cells which detect red (well, yellow) light. If you see a thing and think, "man, that's awfully blue," it's not because your eyes are telling you "hey, this thing is awfully blue". The "blue" signal is barely noticeable in the overall signal; your brain jacks up its responsiveness to the minuscule blue signal.
One of the side effects of the completely fucked ratios between the three types of cones is that your perception of the overall brightness of a thing is mostly down to how green it is. This shows up in lots of standards; NTSC, JPEG, the whole nine yards. If you've ever implemented a conversion between RGB and any luminosity-chroma colorspace (YUV, YCbCr, YIQ, NTSC, any of them) there's a moment where you'll go "wait a minute this doesn't make any fucking sense". You look at the numbers and the luminosity channel is just... green, and you know that the other two chroma channels are quartered in resolution. And you'll think that makes no sense. But that's how it works.
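The "the luminosity channel is just... green" moment is easy to see in the numbers. A minimal RGB-to-YCbCr conversion, assuming the standard BT.601 full-range coefficients (the ones JPEG uses):

```python
# Minimal RGB -> YCbCr conversion (BT.601 full-range coefficients,
# rounded). Note the luma (Y) weights: green dominates at 0.587.
def rgb_to_ycbcr(r, g, b):
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.169 * r - 0.331 * g + 0.500 * b + 128
    cr =  0.500 * r - 0.419 * g - 0.081 * b + 128
    return round(y), round(cb), round(cr)

# Pure green carries most of the brightness signal:
print(rgb_to_ycbcr(0, 255, 0))  # (150, 44, 21)
```

Pure blue at full intensity, by contrast, produces a luma of only about 29 out of 255.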
Then you'll remember that color sensors have their pixels arranged in groups of four, with two green, one red, and one blue channel. There must be some green conspiracy.
And there is. It's your brain. It's your eyeballs with 2/3 of its cone cells being green sensitive ones.
Those are your cone cells. Rod cells are entirely different. It's trivial to say well, cone cells see color, rod cells see black and white, but it's more complicated than that. Rod cells are excellent in low light conditions, cone cells not so much. Cone cells see motion very well, rod cells not so much. Cone cells can discern fine detail, rod cells do not. Rods and cones are not evenly distributed across the retina either; cone cells are densely packed in the center, rod cells are more common in peripheral vision.
Look at a colorful thing directly; take a note of how colorful it is. Now look away from it, so it's only in your peripheral vision; take a note of how colorful it is. Does it seem just as colorful? It isn't. That's your brain fucking with you. Your brain knows it's in your peripheral vision and all the colors are muted out there, so your brain exaggerates the colorfulness. Cone cells are 30 times as dense in the center of your vision as they are just outside the center of your vision.  That's why you can read a word directly where you're looking but it's very difficult to read elsewhere.
The reality is that your retinas give a fucking mess of bullshit to your brain, and the brain is the most incredible image processing system conceivable. It takes bullshit that makes no damn sense and -- holy shit I forgot to talk about blind spots.
Ok, so your rods and cones have a light sensitive thing, with a wire in the back, and all the wires get bundled up in the optic nerve that goes to the brain. Here's the thing: they're fucking plugged in backwards. The wires go forward, and are bundled up between your retinas and the stuff you're looking at. The big fat optic nerve therefore constitutes a large chunk of your vision where you can't see anything. Your brain just.. invents stuff where the optic nerve burrows through your retina.
Other weird stuff. If it's bright, the rods and cones send no signal, if it's dark, they send a strong signal. It's inverted. There's apparently a very good reason for this but I don't remember what it is. Also, the rods continuously produce a light sensitive substance that amplifies the light sensitivity but is destroyed in the process. It takes a long time to build up a reserve. This is why it takes time to "build up" your dark vision, and why it's so easily destroyed by lighting a cigarette. The physiology of "ow it's bright" as opposed to "it's bright" isn't just on your retinas, it's also on your eyelids and your iris, but more importantly, it's shared between your two eyes. This is why closing one eye makes it less painful when you go from a dark place to a bright place.
The point is, the study of human vision is not the study of the human eye. The study of human vision is the study of the human brain.
Much of what we do with color spaces and image compression is dictated by our stupid smart eyeballs and our stupid smart brains. Video codecs compress with 4:2:0 chroma subsampling because the brain's gonna decompress that shit better than a computer can anyway. Cameras have twice as many green-sensitive pixels as blue or red pixels because the eye's resolution is much sharper in green than in other colors. More advanced image and video compression schemes try harder to account for human eye-brain physiology.
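The 4:2:0 subsampling mentioned above boils down to keeping luma at full resolution while storing one chroma sample per 2x2 block. A toy sketch (using the block average, one common choice among several):

```python
# Toy 4:2:0 chroma subsampling: one chroma sample per 2x2 block (the
# block average here). Assumes an even-sized single-channel chroma
# plane given as nested lists of ints.
def subsample_420(chroma):
    h, w = len(chroma), len(chroma[0])
    return [
        [
            (chroma[r][c] + chroma[r][c + 1]
             + chroma[r + 1][c] + chroma[r + 1][c + 1]) // 4
            for c in range(0, w, 2)
        ]
        for r in range(0, h, 2)
    ]

plane = [[10, 20], [30, 40]]
print(subsample_420(plane))  # [[25]]
```

That throws away three quarters of each chroma plane, and most viewers never notice, because the eye-brain system resolves color far more coarsely than brightness.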
The reason is to prevent light fatigue in the eyes. Ears and nose fatigue quickly when exposed to the same stimulus for a long time. With the inverted arrangement in the eyes, you get a naturally stimulated inhibition rather than a fatigue inhibition.
After you get done exploring how we perceive colors associated with different wave lengths of light, and how nobody really knows whether these are common somehow, or unique to each of us, that sentence should bring you both a chuckle and some wonder about perception.
I am inclined to believe it is, but we do not really know.
The lower "HazCams" hazard avoidance cameras (which captured those initial photos) are there to detect hazards (rocks, trenches, etc.). They are stereoscopic, lightweight, and high resolution.
My guess is that using color sensors would have either increased the 3D mapping precision or added weight/power/bandwidth requirements, or otherwise been less robust in that environment.
Those cameras were also pre-deployed for the landing phase and likely transmit more quickly due to the smaller data size. The other cameras were shielded for the landing phase.
The navigation and other cameras are in color, and I expect we'll be seeing better images shortly.
 This comes to mind whenever a question like that is asked: http://4.bp.blogspot.com/-CWM1zDcmWXs/TroD0VsX4WI/AAAAAAAAAV...
I think you meant to say decreased? In which case I think you would be correct! Camera pixels are made up of photosites, which don't by themselves record color, only brightness. To record color information, the photosites are placed behind a Bayer filter, which effectively reduces the camera's resolution, because to get the color of a pixel you need its red, green, and blue components. Bayer filters also frequently have a slight blurring (anti-aliasing) filter in front of them to make sure that nearby photosites with different color filters get the information they need.
If you're looking for the highest resolution image possible, black and white is the way to go!
That way you get high resolution as well as color. You can also have some special (infrared, ultraviolet, etc.) filters on the carousel, not just RGB.
and BAM, false color! FTFY
I did, thank you. I think my brain had already skipped ahead to the added weight/complexity concept while my fingers were stuck on that part of the sentence.
I should probably read things after I type them...
What are they going to do next ? Put on board a solar powered Mars helicopter ?? ;-)
These are hazard cameras, designed to be inputs into the guidance algorithms on board. It might make sense for such a camera to be B/W to reduce on board processing required. There's also a glass cover on them, and a lot of dust from the landing, so that may be obscuring true color if the cameras do in fact take color images.
Also, they may have just transmitted a lower-quality B/W image to get something back to Earth quickly, since higher-res images take longer to downlink.
also is it technically correct to call the Martian atmosphere "air"?
Flash Gordon (1980) Goofs
At the very beginning of the film, Ming and his henchman are discussing "an obscure body in the SK system", which the inhabitants refer to as the planet "Earth", pronounced as if the word is completely foreign to them. However, at that moment, Ming activates a button on his console labeled "Earth Quake".
> the mixture of invisible odorless tasteless gases (such as nitrogen and oxygen) that surrounds the earth
> also : the equivalent mix of gases on another planet
I would naively guess yes to part one but it's complicated: Mars has less gravity, much less atmospheric pressure, colder temps, and greater gravitational influence from its moons than Earth. Wikipedia says the mechanism of the planet's dust storms isn't well understood.
Also doesn't help that there is a (transparent) lens cover in front of the lens obscuring the view.
Elon Musk needs to provide some Starlink sats for a better connection.
What I could imagine is having Starlink satellites around Mars that allow to route data from rovers anywhere on the planet to a dedicated high-performance communications platform that handles communication with Earth.
It's just that since there have never been more than a handful of spacecraft active on Mars at any given time, there's currently no point in spending huge amounts of money to launch a whole constellation of satellites for continuous coverage.
"The data rate direct-to-Earth [from Mars] varies from about 500 bits per second to 32,000 bits per second"
> 160/500 bits per second or faster to/from the Deep Space Network's 112-foot-diameter (34-meter-diameter) antennas or at 800/3000 bits per second or faster to/from the Deep Space Network's 230-foot-diameter (70 meter-diameter)
for high-gain antenna, and
> Approximately 10 bits per second or faster from the Deep Space Network's 112-foot-diameter (34-meter-diameter) antennas or approximately 30 bits per second or faster from the Deep Space Network's 230-foot-diameter (70-meter-diameter) antenna
for the low-gain antenna, which I believe the first two images were sent through
It seems that NASA is being awesome and making all raw images available as they get them. So far just the 2-ish.
Guessing it's black and white / high contrast to help see rocks etc. And probably much lower res and a smaller file size for transfer, too.