Although our error rate was more like 1-2 per week on the equipment I was looking at.
I wonder how much of that was cosmic rays and how much was just less foot traffic resulting in fewer errors.
It's also the reason why all ISS footage has dead pixels on it.
This is somewhat similar – although on a much different scale – to when Anatoli Bugorski had an accident in 1978 involving a beam of protons traveling at over 99% of the speed of light going through his head. He also described it as a flash "brighter than a thousand suns".
Still one of the most audacious things I've seen, but he was a good physicist and did the math right, so he wasn't being unsafe.
I guess I'm just noticing his name more?
This can involve things such as using digital circuits that are radiation resistant (e.g. look up radiation-hardened flip-flops); using multiple computers running the same computation that all "vote" on the correct result, so that a radiation-induced error in one computer doesn't hurt you; and using semiconductors that are more resistant to radiation (larger band gaps mean more energy is required to flip a bit).
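The voting scheme above, usually called triple modular redundancy (TMR), boils down to running the same computation three times and taking the majority. A minimal sketch, where `compute` is just a stand-in for whatever the replicated units actually run:

```python
# Hypothetical sketch of triple modular redundancy (TMR) voting.
from collections import Counter

def compute(x):
    # Placeholder for the replicated computation.
    return x * x

def tmr_vote(x, units=(compute, compute, compute)):
    """Run the same computation on three 'units' and majority-vote the result."""
    results = [unit(x) for unit in units]
    winner, count = Counter(results).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority -- more than one unit faulted")
    return winner

def faulty(x):
    # Simulated single-event upset corrupting one unit's result.
    return x * x + 1

# The faulty unit is outvoted by the two healthy ones:
print(tmr_vote(3, units=(compute, faulty, compute)))  # 9
```

Real systems (e.g. flight computers) vote in hardware and on every bus transaction, but the principle is the same.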
Physical shielding is key as well. The infrared imager on the Cassini probe had a case made out of tantalum, as tantalum is a very dense material which prevents a lot of radiation from going through it.
So this allows removal of effects that do not correlate with what should physically be in the image, but are artifacts of the sensor, imaging system, etc.:
-- sensor artifacts: dead pixels, flat field irregularity, pixel response variations, electronic noise
-- imaging system issues: optical problems, lens/mirror defects
-- and then exactly what's being discussed here: cosmic rays, transient objects (satellite tracks!)
Modern camera firmware detects stuck and dead pixels and tries to fill them in with neighbor data, but when there are too many... there isn't enough data to fill in.
Keep watching top left part of the video. Most visible at 0:30.
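The neighbor fill-in mentioned above can be sketched crudely with a 3x3 median; this is only an illustration, since real firmware uses factory-built defect maps and fancier interpolation:

```python
import numpy as np

def fill_dead_pixels(img, dead_mask):
    """Replace each dead pixel with the median of its 3x3 neighbors.
    Rough sketch of what camera firmware does for mapped-out pixels."""
    out = img.astype(float).copy()
    padded = np.pad(out, 1, mode="edge")
    for y, x in zip(*np.nonzero(dead_mask)):
        window = padded[y:y + 3, x:x + 3].ravel()
        neighbors = np.delete(window, 4)  # exclude the dead pixel itself
        out[y, x] = np.median(neighbors)
    return out

img = np.array([[10, 10, 10],
                [10, 255, 10],   # stuck pixel
                [10, 10, 10]])
mask = np.zeros_like(img, dtype=bool)
mask[1, 1] = True
print(fill_dead_pixels(img, mask)[1, 1])  # 10.0
```

When a cosmic ray hit kills a whole cluster of pixels, the window contains mostly dead neighbors and this kind of interpolation has nothing real to work with.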
Cosmic rays are very energetic particles (mostly protons and atomic nuclei rather than photons), capable of knocking electrons out of atoms (meaning they are ionizing), causing havoc in precision electronics and, well, our DNA.
In my opinion it's a bit of a silly argument, as there's a whole bunch of other risks and quality-of-life sacrifices taken on by the people who undertake that journey. A somewhat raised chance of cancer is probably the least of their worries.
"NASA told Business Insider it estimates there’s a 1-in-276 chance the flight could be fatal and a 1-in-60 chance that some problem would cause the mission to fail (but not kill the crew)"
That's fairly low - it feels like the odds in the age of sail would have been far worse -- Columbus's second voyage alone had about 25 deaths just from scurvy, and his first voyage had only about 90 crew.
But remember that life wasn't exactly easy on land either.
The model commonly used in radiation protection to assess cancer risk with respect to radiation dose is called the Linear No-Threshold (LNT) model. The model critically assumes that (1) total radiation dose is the only predictor of cancer risk, and (2) any radiation exposure results in an increased cancer risk.
This model works at high absorbed doses; however, its applicability is highly controversial at low absorbed doses, or at relatively high absorbed doses that accumulate over a long period of time (i.e. a low dose rate).
The thing is, the human body has built-in defense mechanisms against cancer, such as DNA repair. There is a good body of evidence that small doses and low-rate exposures do not result in measurably increased cancer risk (i.e. there is a threshold absorbed dose, and probably also a threshold dose rate), but the model does not account for this.
This is particularly problematic when trying to assess excess mortality from things such as radiological accidents: when you multiply the small LNT-predicted risk for a low dose times a very large population, you end up with a lot of cancers. This is one of the reasons you'll see estimates for deaths from the Chernobyl accident vary by orders of magnitude.
It's also problematic when assessing something like a Mars mission: yes, the astronauts would get large cumulative doses, but the dose rate is pretty low over most of the mission (other than during high dose rate solar events where they would need radiation shielding). How much of an elevated cancer risk is it actually? Nobody is quite sure.
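To make the "small risk times huge population" point concrete, here's the LNT arithmetic, using ICRP's commonly quoted ~5.5%/Sv lifetime excess cancer risk coefficient. Treat the numbers as illustrative assumptions, not predictions:

```python
# Back-of-the-envelope LNT arithmetic. The coefficient is ICRP's
# often-cited nominal lifetime excess cancer risk per sievert.
RISK_PER_SV = 0.055

def lnt_excess_cancers(dose_sv, population):
    """LNT simply multiplies dose, risk coefficient, and head count."""
    return dose_sv * RISK_PER_SV * population

# A tiny 1 mSv dose spread over 10 million people "predicts"
# hundreds of excess cancers, even though each individual's risk
# is negligible -- the core of the low-dose LNT controversy:
print(lnt_excess_cancers(0.001, 10_000_000))  # ~550
```

This is exactly the multiplication that produces the wildly divergent Chernobyl death estimates: the answer scales linearly with whatever population and dose cutoff you choose to include.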
Edit: Correction, there is very little difference and fewer cosmic rays during the day. Source: https://arxiv.org/pdf/physics/0105005.pdf
1) Use ECC memory
2) Go underground
"One experiment measured the soft error rate at the sea level to be 5,950 failures in time (FIT = failures per billion hours) per DRAM chip. When the same test setup was moved to an underground vault, shielded by over 50 feet (15 m) of rock that effectively eliminated all cosmic rays, zero soft errors were recorded. In this test, all other causes of soft errors are too small to be measured, compared to the error rate caused by cosmic rays."
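For a feel of what 5,950 FIT per chip means in practice, the conversion to expected errors is just arithmetic (the 16-chip machine below is a made-up example):

```python
# FIT = failures per billion (1e9) device-hours, per DRAM chip,
# using the sea-level figure from the quote above.
FIT = 5950

def expected_errors(chips, hours):
    """Expected number of soft errors across all chips over a period."""
    return FIT * chips * hours / 1e9

# A hypothetical machine with 16 DRAM chips running for a year:
print(round(expected_errors(16, 24 * 365), 2))  # 0.83
```

So roughly one soft error per machine per year at sea level on that hardware generation, and essentially zero under 50 feet of rock.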
Not exactly. When I was in telco, the place I had this problem was in FPGAs. We had ECC memory everywhere, and I never linked any problems to bit flips in RAM. But as I remember, the FPGAs we had used a type of SRAM cell for their configuration, and because it's not a memory module, the FPGA programming itself could suffer a bit flip. So the product had a checksum function that read back the program on a cycle and reset itself if the program no longer matched the checksum. We would see 1-2 crashes/restarts per week in our FPGAs that we believed were bit flips.
We then ran an analysis on any of these that had higher-than-expected error rates to try to identify genuinely bad hardware and replace it.
I think the vendor eventually came up with a way to reprogram the FPGA without just crashing and rebooting the entire board.
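That read-back-and-checksum cycle (usually called configuration scrubbing) is conceptually just the loop below. The function names are hypothetical; real scrubbers check the bitstream frame-by-frame over JTAG or ICAP rather than resetting the whole board:

```python
import zlib

def scrub_cycle(read_config, golden_crc, reset_device):
    """One pass of the checksum watchdog: compare the read-back
    configuration against a known-good CRC, reset on mismatch."""
    if zlib.crc32(read_config()) != golden_crc:
        reset_device()   # crash-and-reprogram, as on the old boards
        return False     # upset detected
    return True          # configuration still intact

# Simulated device whose configuration suffers a single bit flip:
config = bytearray(b"\x5a" * 1024)
golden = zlib.crc32(bytes(config))
config[100] ^= 0x04      # single-event upset
print(scrub_cycle(lambda: bytes(config), golden, lambda: None))  # False
```

The vendor improvement mentioned above amounts to rewriting only the corrupted frame inside `reset_device` instead of rebooting everything.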
If even higher levels of reliability are needed, there are rad-hard-by-design FPGA families (e.g. Xilinx Virtex 5QV). These have a special config SRAM cell that has more charge storage sites than a conventional SRAM cell. It is less area efficient than a conventional SRAM cell, but geometry of the charge storage sites ensures that a single cosmic ray can't flip the state of a majority of them. Essentially the cell can self-correct, no scrubbing required.
Would you say this quick reference is a good overview ?
But unfortunately plastic bags are not dense polyethylene. You would kind of need full blocks of solid polyethylene...
But it's quite flammable, so you might not want to use it as a building material.
Though most of them are tests in space, where I assume the thickness requirements would rule out grocery bags. I am curious how thick a layer of HDPE you would need on earth to make any notable difference.
edit: No idea why the downvotes. Faraday cages have long been thought of as a way to protect electrical devices from the electromagnetic waves which can be a result of solar flares.
I even chatted to someone from NASA’s Solar Dynamics Observatory about it in the past.
In the associated DEFCON talk, the author notes that even if you use ECC memory, it's unlikely that the DRAM in your hard drives does too.
Living under a rock.
Yeah, yeah, I'm leaving. Put the weapons away...
That would be too many cosmic rays.
And the "flipped the wrong neuron" possibility is much scarier in my mind. And probably in everyone else's brain as well.
That said, just saying "chaos theory means any change can cause anything" is a terribly weak argument for "cosmic rays cause ideas". Not to mention all the reasons why it's implausible as a significant idea generator.
Headline would be more accurate if it said "30K unidentified bugs blamed on unverifiable phenomenon each year in Japan".
NTT systematically set out to be able to distinguish these different kinds of error, and you go ahead and ignore that effort.
Well, the rust programmer and the nuclear engineer, possibly.
ML people who train on this hardware report more NaN-caused training failures than software bugs can account for. It's extremely challenging to debug because most ML code is very robust to small amounts of injected noise, especially independent Gaussian noise (there's literature showing that introducing random noise in training often helps it go faster).
This is a fascinating area and there aren't a lot of people who can really make forward progress in it.
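One way to see why a single upset shows up as NaN/inf rather than harmless noise: flip one bit in a float32 and look at where it lands. A mantissa-bit flip is just tiny noise, but an exponent-bit flip produces a huge value that overflows to inf (and then NaN) as soon as training arithmetic touches it. Sketch only:

```python
import struct

def flip_bit(x, bit):
    """Flip one bit in the IEEE-754 float32 representation of x."""
    (bits,) = struct.unpack("<I", struct.pack("<f", x))
    (y,) = struct.unpack("<f", struct.pack("<I", bits ^ (1 << bit)))
    return y

w = 0.1
print(flip_bit(w, 3))   # low mantissa bit: still ~0.1, ordinary noise
print(flip_bit(w, 30))  # high exponent bit: ~3.4e37, overflow territory
```

So whether a cosmic ray is survivable noise or a training-killing NaN depends entirely on which of the 32 bits it happens to hit.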
A more comprehensive and interesting take on the cosmic ray phenomenon. It's actually a global issue that is accounted for in certain electronics.
Uhhhhhhh. Neutrons? Doubt goes here. I was pretty sure charged particles and ionizing radiation are to blame for messing up your electronics.
It is much more dangerous than other types, as it readily passes through most materials, yet the dose is still absorbed. Absorbed neutrons often cause nuclei to become radioactive (neutron activation), which causes havoc as those newly radioactive nuclei decay.
The other common types of radiation (alpha, beta and gamma) do not usually cause things to become radioactive; instead, they are a threat when the actually radioactive material accumulates, for example as dust on your skin.
Neutron radiation is much more dangerous because it seeds radioactivity deep in the body, so you can't clean it off by washing or waiting for the absorbed particles to be flushed back out of the body naturally within a few days.