These specialist studies are fine for scientific consumption, but they don't do much to make the results comprehensible to the general public.
In this specific case, what's missing is the bigger perspective, like this:
"If we look at the 100,000 average dead people, we'll find that 30,000 of them died from cancer. Now, we are looking at 100,000 dead low dose nuclear workers, and we have previously thought that 30,010 of these would have died from cancer, but now we've found that number to be closer to 30,020."
This really puts "2x worse" into proper context, which is: there are low-risk occupational hazards in the nuclear industry.
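To make that concrete, here is a back-of-the-envelope sketch using the purely illustrative figures from the example above (the 30,000 baseline and the 10 vs. 20 excess deaths are assumptions for the sake of the example, not numbers from the paper):

```python
# Illustrative only: all figures come from the hypothetical example above,
# not from the INWORKS paper.
baseline = 30_000   # cancer deaths per 100,000 deaths in the general population
old_excess = 10     # previously assumed excess cancer deaths among workers
new_excess = 20     # the "2x worse" excess under the new estimate

old_total = baseline + old_excess   # 30,010
new_total = baseline + new_excess   # 30,020

print(f"Change in total cancer mortality: {(new_total - old_total) / old_total:.3%}")
# -> ~0.033%: the excess doubles, but the absolute picture barely moves.
```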
There are also some things about the paper that set off some flags for me. I've studied radiation extensively, but not medicine, fwiw.
- I'm always wary of anyone who will do something like write 2000 keV instead of 2 MeV. They also constantly say "photon radiation" instead of "gamma," which just reads weird. Neither of these is "incorrect," though.
- They seem not to be accounting for neutrons, which are quite a common source if we're talking about reactor workers or Hiroshima. A neutron gets absorbed into an atom, which then sheds energy through gamma radiation, but the added neutron also leaves a different isotope behind, with potentially different chemistry on top of the ionization. Reactors actually depend on this: 235U + 1n -> 236U -> 89Kr + 144Ba + 3n + gammas (mass numbers balance: 235 + 1 = 89 + 144 + 3×1 = 236). You just don't often see the 236U discussed, but it is the unstable nucleus that does the splitting.
- They use Japanese atomic bomb survivors as a comparison, but that's not great, and there are many studies looking at radiation workers (both of these groups get neutron doses). There are two big aspects here that matter. First, acute dosage is very different from chronic exposure, and atomic bomb doses are going to be acute (not just for the Japanese, but also for soldiers). Most workers' exposure is going to be chronic, but some may be acute, which you need to account for; they don't seem to. Second, there are also many studies of places in the world where the natural background radiation exceeds the 100 mGy that they say doubles the cancer rate, yet people from these places don't show increased cancer rates (this is mostly radon, so alpha radiation, and of course chronic rather than acute exposure).
I don't feel qualified to invalidate the work, btw, but these seemed like points worth noting. I know there are some users here with more expertise, especially in radiation physics, so I was hoping someone could help me understand.
There is no increased mortality in the paper. The paper only looks at risk of death from solid tumors.
I'd be surprised if the authors haven't done the calculation of total mortality and decided that there is no discernible radiation dose-dependent effect on total mortality or total QALY. In a better world, that would also be a very good paper, but it isn't in ours.
That's why they are looking at solid tumor mortality specifically.
“Out of 100,000 living people, x will die from solid tumors in one year; out of 100,000 nuclear workers we believe x+2y will. This differs from the x+1y figure that was the previous estimation.”
The relevant measurement is called the QALY, or quality-adjusted life year. It is meant to be a meaningful representation of disease burden. As an example, thyroid cancer is rarely cured outright, but a common remedy is to remove the thyroid and rely on medication going forward; somebody in this condition can otherwise have a long, good-quality life. Pancreatic cancer, on the other hand, has a very low survival rate, so it cuts a lot of QALYs out of a person's life.
It would be great if the authors quantified their findings in QALYs, e.g. "while we previously believed that low dose radiation results in a loss of x days (QALY-wise), we now believe that number to be closer to 2x days".
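For what it's worth, the arithmetic behind a QALY figure is simple; here is a toy sketch (the utility weight and the remaining-years figure are invented for illustration, not taken from any study):

```python
def qalys(years: float, utility: float) -> float:
    """Quality-adjusted life years: calendar years weighted by a 0-1 utility."""
    return years * utility

# Hypothetical example: 20 remaining years at full health vs. the same 20 years
# on lifelong thyroid medication with an assumed utility weight of 0.9.
loss = qalys(20, 1.0) - qalys(20, 0.9)
print(f"QALY loss under these assumptions: {loss:.1f}")  # 2.0 QALYs
```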
If you look at the relative risk curve (the very last page of the supplementary data), most of the exposure levels include 1.0 in the relative risk confidence interval (also note that, for some reason, the confidence intervals are 90%; choosing the customary 95% would probably make all of the relative risk confidence intervals overlap 1.0):
The fact that ~10 mGy seems not to have 1.0 in its confidence interval bothers me a bit. Normal annual background radiation is about 3.5 mGy (so 10 mGy is roughly 3 years of normal background), and seeing an increased solid tumor mortality risk from essentially normal radiation levels does suggest that something else has not quite been controlled for.
Comparing background radiation to exposure from artificial radiation sources does not sound right to me. Background radiation is absorbed relatively evenly across the body. Much artificial radiation exposure, by contrast, is localized, such as dental x-rays or CT scans, which can increase, for example, thyroid cancer risk at seemingly small exposure levels compared with cosmic radiation exposure.
I wondered if the high safety standards in nuclear plants were affecting the outcome. If industrial workers aren't dying in things like construction accidents, then they have to die of something else, and sometimes that will be cancer. That graph does make it look like being employed in the nuclear industry increases cancer risk even without any radiation dose at all. That isn't the explanation, though, because it looks like we're talking about ~10,000 otherwise unexplained deaths over ~300,000 people, and even being pessimistic I think industrial accidents would account for fewer than 50 deaths. Still, looking at the figure you reference, I wouldn't completely rule out some industry-specific effect like that even now.
The nature of the study makes it hard to be confident; there are a lot of potential influences that could throw off the conclusions. It is challenging to interpret too: table C in the supplementary data makes it look like risk drops with higher doses, and I am sure I'm reading that wrong.
Thanks for linking the graph, that's kind of wild. I agree with you that the lowest datapoint seems crazy. I can think of a few explanations.
- Random bad luck.
- As you say, failing to control for something -- although, if you then treat the lowest datapoint as being effectively the default risk, this would suggest support for radiation hormesis (that people who got a bit more than background radiation actually did better.)
- Some kind of data collection artifact. Perhaps the people with the absolute lowest dose, in a radiation-worker dataset, are selected for being ones who are not getting an accurate measurement (i.e. sloppy about wearing dose badges or something), and those people genuinely do have worse outcomes.
Interesting… My mother was a flight attendant for 35 years and she developed breast cancer later in life. She was told it was probably environmental not genetic based on testing.
You do get additional radiation exposure while flying. Wonder if we should also study the effects of higher radiation exposure in the airline industry.
Sounds like there is enough evidence to perform a longitudinal cancer study across air transport workers to better understand the potentially increased cancer risk.
EDIT: Is hazard pay warranted? Also, perhaps airlines should have to provide estimated cumulative exposure to employees based on their flight trip logs. Estimation should be straightforward based on flight track logs combined with cosmic radiation satellite data.
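As a very rough sketch of what such an estimate could look like (the cruise dose rate below is an order-of-magnitude assumption; a real calculation would use something like the FAA's CARI model with actual altitude profiles, geomagnetic latitude, and solar activity):

```python
# Order-of-magnitude sketch only; the dose rate is an assumed average, and real
# tools (e.g. the FAA's CARI model) account for altitude, latitude, and solar activity.
CRUISE_DOSE_RATE_USV_PER_H = 5.0   # assumed mid-latitude cruise-altitude average

def annual_dose_msv(block_hours: float, cruise_fraction: float = 0.85) -> float:
    """Very rough annual cosmic-ray dose for a crew member, in millisieverts."""
    return block_hours * cruise_fraction * CRUISE_DOSE_RATE_USV_PER_H / 1000.0

print(f"~{annual_dose_msv(900):.1f} mSv/year at 900 block hours")  # ~3.8 mSv
```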
> Estimation should be straightforward based on flight track logs combined with cosmic radiation satellite data.
Just take a cheap dosimeter on your next flight. I have an Atomfast that I often record exposure with for amusement, as well as using it when looking at neat rocks outside (some areas I like to hike in have surprisingly high radiation levels due to uranium in the rocks).
If anything they'd do the opposite so that the public doesn't get "the wrong idea" something dangerous is up. Just look at how Homeland Security made it illegal to measure how much radiation the post-9/11 security screening devices put out, forbade medical physicists from testing them, forbade employees working near the scanners from obtaining and wearing their own protection, and fired workers who wore their own dosimetry tags.
I was referring to the security officers who stand next to the machines everyday. They were forbidden from measuring anything related to occupational exposure. At one point the TSA found itself caught in so many lies about their supposed "testing" that they ran a very public PR-style limited study. Coincidentally, nowadays the backscatter scanners have been quietly replaced with millimeter wave scanners.
Anyway the point of my comment wasn't to re-litigate this. It's more that the original comment struck me as "can't the fox just guard the henhouse?"
An officer has traditionally been a person in the military with a commission (or an NCO), or a police officer. TSA staff are government employees with self-created pretend titles and fake badges. TSA can't detain or arrest, and if the scan tunnel shows a possible weapon, TSA agents aren't allowed to touch it; the line cop does that.
Unlike in medical facilities, the radiation comes from multiple directions. Also, the crew moves around the plane. The vests would have to cover an impractically large area.
Also, when radiation is blocked by matter, secondary radiation is generated. Since the composition of cosmic radiation is certainly different from that found in other occupational settings, lead vests might or might not have the same effect. The aluminium skin of the aircraft already acts in a similar way. There might not be much that can be done practically.
It would be interesting though to let flight crew wear radiation badges.
Since we're on this topic, there also seems to be enough evidence to suggest you should wear sun screen when flying.
> Pilots and cabin crew have approximately twice the incidence of melanoma compared with the general population. Further research on mechanisms and optimal occupational protection is needed.
Sunscreen blocks UV, not cosmic rays, which are the radiation for which jetliner passengers receive an unusually high dose. The glass in the windows probably blocks 75% of UV, and UV exposure is associated with other types of skin cancer, not melanoma (except in cases where one has a history of bad sunburns). In fact, in this review, melanoma is said to be inversely correlated with occupational exposure to UV:
Sunscreen doesn't block cosmic rays; those are high energy atomic nuclei (which are normally filtered out by the atmosphere), not high energy photons/light (which end up on beaches). Did you previously find scholarly studies suggesting that wearing sunscreen helps with melanoma, or did you just see the high melanoma incidence rate in frequent flyers and assume sunscreen would help?
> My mother was a flight attendant for 35 years and she developed breast cancer later in life.
From what I recall from the Chernobyl studies, the lifetime risk of developing cancer is 39% for females and 41% for males. Occupational hazards may push the individual risk up a little, but overall cancer is sadly a very frequent occurrence.
Keep in mind lifetime risk includes the risk of getting it in old age, where it might be a case of if you didn't die of cancer, you'd just die of some other ailment (heart attack, dementia, etc) instead.
Evolution only optimized our parts to last for so long, and once you get past that age, everything kind of starts failing simultaneously.
This is in part because living too long may in fact hurt the group. Of course we don't want to look at that, because we are so hell-bent on extending life, even talking about immortality.
Evolution works on the genetic level, not the population level. While some altruistic strategies might be selected for (e.g. kin selection), in general evolution doesn't care if something "hurts the group."
In nature, dying of old age is incredibly rare as you are much more likely to die from predation or injury. Even if you suffered no ill effects from aging, ie you were biologically immortal, passing on your genes successfully would still require you to have offspring reasonably early in life. There is only selection pressure on your genes up to the point where you have successfully passed on your genes. It makes sense for evolution to keep you alive while raising your kids, and even while helping your offspring raise their kids (see the Grandmother Effect [0]), but eventually the marginal utility of your support drops so low, and the odds you're already dead rises so high, that there is simply no selective pressure. As an illustrative example, you could have a gene that makes you twice as strong as you were at 25, but if it only activates after you're 150, you and evolution would never know.
On top of this, many genes have tradeoffs. If there is any mutation that gives benefits early in life but is harmful later in life, evolution will tend to select for it, especially versus the reverse. Similarly, traits that only provide a benefit for a finite period which is nevertheless long compared to natural lifespan can be favored over traits that work indefinitely (eg many mammals are born with a few sets of teeth and have no way of repairing them). This leads into more general accumulation-of-damage theories of aging.
We see many species in nature, particularly those unlikely to undergo predation and at low risk of injury, with substantially longer or even indefinite lifespans. In the lab, we can also breed organisms like fruit flies to have substantially longer than natural lifespans.
There is no evidence that living longer is harmful to others in hunter gatherer societies of the sort we evolved to live in, and certainly no evidence that such harm would still be present in modern, industrial societies.
This is quite shortsighted of you to say. If your group is less likely to survive, you as a part of it are also less likely to survive and vice versa. Genetics are not isolated from you, or your group, or your environment, they’re not isolated from anything. Evolution is a function of the totality of the system in which you exist. And to prove my point, genetics clearly work on the level of an organism, which is nothing but a group of cells holding the genes you’re talking about. Why are genetics working on the level of the organism, a group of cells, not just on the level of the cells? It makes no sense to have it both ways and claim the evolution of genes goes beyond cells but stops at the surface of your skin. Why did we evolve empathy, cooperation, language? Because we survive better as a group when we can synchronize together in advanced ways. You think none of this requires genetic alteration to exist? Why not teach your cat to talk then?
You're talking about an optimization strategy that's a few hundred thousand years old though. (And might not even be an optimization strategy at all) Its applicability to modern life seems highly dubious considering the massive cultural changes we have been going through.
You've put it clearly and simply, better than I managed in my reply. The data is just too noisy; we will need extraordinarily large datasets to even begin to make comments.
Note: that is about the chance of getting cancer, not dying from cancer. Many people beat cancer. Many cancer types have good survival rates. Thyroid cancer (Chernobyl caused a lot of these) sucks, but many people who got it had their thyroid removed and had to rely on drugs for the rest of their lives ... this is obviously not ideal but much better than dying, I guess?
This happened to me at 18 and my cat when he was 16. Now he's turning 19 and I'm turning 43. He and I can confirm it is much better than dying. Mine was sliced and diced and I'm stuck taking a pill every day. For his they used a radioactive pellet to kill the extra tissue and he doesn't need to take anything.
This is happening vastly more frequently with cats, and some folks suspect a now-banned chemical is to blame.
- Take methimazole. It will partially control thyroid levels, but not well, so your cat still dies, just later than with option one.
- A vet implants a small radioactive pellet destroying the problematic tissue. Most cats will live as long as they otherwise would have and few have to take thyroid hormone.
Wow, there is a similar comment above[0] suggesting radiation badges for flight crew. I was going to ask if typical badges are sensitive enough to register the expected dose from flight.
If I understand correctly, you're saying a single commercial flight is enough to trip the dosimeter?
Not necessarily, but if it gets x-rayed going through security, then when they heat it up so that it luminesces, it'll be so bright that it messes up their carefully calibrated detector. This is expensive to fix, and the dosimetry people get really mad at you.
Source: wore one of these daily for years and would get regular reminders not to bring them on flights or anywhere else they would have to go through an x-ray machine (e.g. some federal buildings).
You do get higher radiation exposure flying, but you are also surrounded by off-gassing synthetic materials, receive frequent exposure to engine exhaust, and get disrupted sleep due to the hours and changing locations.
So in other words, there’s a lot of potential causes.
Is it because of elevated cosmic ray exposure due to the high altitude? Or plastic in the inflight meals? Or the crazy schedules? Or is there even a statistically significant effect over time?
Knowing any of these things is actually quite hard, and would require a lot of detailed data just to identify a potential pattern.
To add to the data confusion, lol: my grandmother basically never left her home farm. Never drank. Ate meat they slaughtered themselves. Sweets were a rare treat on holidays. You could barely get her to take a car trip because it scared her. Yet other than her, no one in our family's history is known to have ever had cancer.
The various US airlines, which partner with health insurers, certainly have access to this data: cancer frequency, time in flight, time in the stratosphere, etc. The frequency of these things (or some subset) likely affects the rates.
Note: the title is misleading and contrary to the authors' conclusion. Where are the moderators?
"The association between cumulative dose, lagged 10 years, and solid cancer mortality was reasonably well described by a linear model (fig 1); inclusion of a parameter describing the linear association between cumulative dose and solid cancer contributed substantially to model goodness of fit (supplementary table B). The addition of a parameter for the square of cumulative dose led to only a modest improvement in model goodness of fit compared with the linear model (likelihood ratio test =2.51, df=1; P=0.11), suggesting some downward curvature (that is, a negative estimated coefficient for the quadratic term). The addition of a parameter for an exponential term in the model led to a modest improvement in model goodness of fit for a linear-exponential model compared with the linear model (likelihood ratio test =3.17, df=1; P=0.08), again suggesting some downward curvature. To assess the trend over the lower cumulative dose range, we estimated associations between cumulative dose and solid cancer mortality over restricted ranges of 0-400 mGy cumulative dose (excess relative rate 0.63 (0.34 to 0.92) per Gy), 0-200 mGy cumulative dose (0.97 (0.55 to 1.39) per Gy), 0-100 mGy cumulative dose (1.12 (0.45 to 1.80) per Gy), 0-50 mGy cumulative dose (1.38 (0.20 to 2.60) per Gy), and 0-20 mGy cumulative dose (1.30 (−1.33 to 4.06) per Gy) (supplementary table C). Over the restricted range of 0-200 mGy cumulative dose, the association between cumulative dose and solid cancer mortality was well described by a linear model, and the addition of a parameter for the square of cumulative dose led to minimal improvement in model goodness of fit compared with the linear model (likelihood ratio test=0.54, df=1; P=0.46)."
They say over and over that a linear model (with relative risk = 1 at zero dose) is a good fit! This is LNT. They say the improvements from adding the other fits are marginal. The error bounds here are large, and the deviation is small but apparently systematic, or real, below 200 mGy. The authors, as far as I can tell, are just reporting the figures without claiming LNT is wrong.
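To put the quoted coefficients in more familiar terms, here is a small sketch converting the paper's excess relative rate (ERR) per Gy, as fitted over each restricted dose range, into a relative rate at an example dose via RR = 1 + ERR × D (the 100 mGy example dose is my choice, not the paper's):

```python
# Central ERR-per-Gy estimates quoted from the paper, keyed by fitting range (mGy).
err_per_gy = {"0-400": 0.63, "0-200": 0.97, "0-100": 1.12, "0-50": 1.38, "0-20": 1.30}

dose_gy = 0.100  # 100 mGy, an arbitrary example dose
for fit_range, err in err_per_gy.items():
    print(f"Fit over {fit_range} mGy: RR at 100 mGy ~= {1 + err * dose_gy:.3f}")
# The steeper restricted-range slopes translate into only slightly higher relative
# rates at this dose, which is why the wide error bars matter so much here.
```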
I'm not a statistician, but I think there's a bit in your excerpt that is actually a concerning display of poor statistical literacy.
If you're fitting a function which keeps growing (i.e. is monotonically increasing, at least past a certain point), the best (polynomial) fit absolutely cannot have a negative quadratic as the leading term. If your model gives one, it is 100% guaranteed to be an artifact. Treating it as "suggesting some downward curvature" is a pretty bad misunderstanding.
If you have doubts about this, consider what would happen if we added datapoints at higher doses. Every single datapoint we add to the right side of the graph will make the fit of a negative quadratic significantly worse. Ultimately, if you continue the graph indefinitely to the right, the fit of a negative quadratic is guaranteed to be infinitely bad. Any hint to the contrary is inherently an artifact of the limited dataset.
(It may well be the case that, under certain conditions with a range-restricted dataset like this, such a finding might indeed be more likely if the true function has some downward curvature. But that's not statistics, it's voodoo. All the associated statistical parameters, p-value, likelihood ratio, etc., are absolutely meaningless nonsense.)
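A quick toy simulation (entirely made up, not the paper's data) shows how easily a fitted negative quadratic can appear when the true dose-response is purely linear and the dose range is restricted:

```python
import numpy as np

rng = np.random.default_rng(0)
doses = np.linspace(0.0, 0.4, 20)   # restricted range, e.g. 0-400 mGy expressed in Gy
true_slope = 0.5                    # assumed true linear excess risk per Gy

negative = 0
trials = 1000
for _ in range(trials):
    y = true_slope * doses + rng.normal(0.0, 0.05, doses.size)  # noisy but truly linear
    negative += np.polyfit(doses, y, 2)[0] < 0   # leading coefficient of quadratic fit

print(f"{negative / trials:.0%} of fits show spurious 'downward curvature'")  # ~50%
```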
I think that in this case it is definitely a case of analytic contortion.
There is _heated_ and mostly opinionated debate about the low-dose regime. Briefly: some people believe low doses of radiation have a threshold (that one could receive a certain amount before incurring any excess cancer risk); there is "no-threshold," meaning relative risk is 1 only at zero dose, so any dose carries some excess risk; and there are people who say that some radiation is good for us because it stimulates our repair mechanisms and kills weak cells.
All of this heated debate brings us to the currently misunderstood findings, and it is why the authors say over and over that the linear model is a good fit.
People, specifically the regulators and the nuclear industry, make a _bunch_ of health decisions on excess risk based on Linear-No-Threshold (LNT) models because they believe it is conservative.
The main issue here is that in order to get good statistics, you have to irradiate people, and the two best datasets prior to today are Hiroshima survivors and the small number of low-dose exposures. People are torturing this analysis to resolve the debate, which in my opinion only feeds it.
Um, I don't think the HN title is contrary... Quoting:
> What this study adds
> The results of an updated study of nuclear workers in France, the UK, and the US suggest a linear increase in the relative rate of cancer with increasing exposure to radiation
> Some evidence suggested a steeper slope for the dose-response association at lower doses than over the full dose range
> The risk per unit of radiation dose for solid cancer was larger in analyses restricted to the low dose range (0-100 mGy) and to workers hired in the more recent years of operations
The HN title is "Low dose radiation cancer 2x worse than predicted", meaning worse than predicted by LNT for some unspecified region of the curve... They do not claim LNT is wrong, they do not even present their more sophisticated fits, and they do not quote the 2x figure. It is misleading. The paper presents an LNT fit.
If you want to say "there is a bias above linear-no-threshold in the region less than 200mGy" you are correct, but also say that the bias has large error bars associated with it and that the data may or may not fit that trend.
Unless you have p-values to back up the 2x claim, it shouldn't be in the title.
LNT is a funny model: it's extremely simple, yet people misunderstand it all the time. Like, it's basically how you would model the way losing money affects your finances. You lose ten cents, you're out ten cents. You lose ten million dollars, you're out ten million dollars. Totally proportional.
Now the funny thing is, nobody sane would say "Losing a quarter that fell out of your pocket is losing money, gambling away a million dollars is also losing money! There's no threshold beyond which losing money becomes not losing money, therefore having a loose pocket is as bad as gambling addiction!"
Yet change the subject from dollars to radiation, and so many people run around saying "No threshold! That means there's no safe dose! Living next to a working nuclear reactor is as bad as walking into the ruins of Chernobyl!"
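Written out, the standard LNT bookkeeping (this is the textbook form of the model, not anything specific to this paper) is just:

```latex
% Linear no-threshold: excess relative risk is proportional to dose,
% with no dose below which the excess is exactly zero.
\mathrm{ERR}(D) = \beta D, \qquad \mathrm{RR}(D) = 1 + \beta D
% A 1 mGy dose carries 1/1000 the excess risk of 1 Gy: small, but not
% literally zero, just as a lost quarter is a small but nonzero loss of money.
```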
The large INWORKS cohort study of nuclear workers finds that in the low dose range (0-100 mGy, or 0-10 rad) the solid cancer mortality risk per unit of exposure is twice that predicted by the LNT model. Radiation appears to become more dangerous per unit dose at low doses delivered over prolonged periods, the opposite of what some nuclear energy advocates assert.
As I have pointed out, the radioactivity in uranium ore is higher than the radioactivity in coal, when comparing the amount of ore and the amount of coal that needs to be mined to produce a given amount of energy in today's power plants.
They liberate decay products into the environment when the uranium ore is processed.
Coal fired plants aren't putting ash into the atmosphere either these days, in the US. Bottom ash never went into the air, and fly ash is efficiently trapped in electrostatic precipitators and bag houses.
Which of the two plausible interpretations is the stronger? Please enlighten me, esteemed HN guideline expert. I am asking because I don't know the answer.
"Low-dose radiation from A-bombs elongated lifespan and reduced cancer mortality relative to un-irradiated individual"
Does not equal
"A-bombs are good for your health"
FYI, a small amount of DNA damage is thought to stimulate DNA repair pathways, which is protective. What's bad for you is massive amounts of damage that your body can't keep up with.
You have no idea what you're talking about, and the fact that you're so confident over your obscenely wrong conclusions makes you a dangerous person. How about going and exposing yourself to a massive dose of radiation if you're so sure you're right
I'm not seeing conclusions, let alone confident conclusions. An interesting paper was linked that apparently showed some survival benefit among A-bomb survivors. By all means, challenge the paper, but it's certainly interesting. It might not be radiation exposure that led to the increased survival rates, but it plausibly could be.
> exposing yourself to a massive dose of radiation if you're so sure you're right
Are you aware that radiation is one of the treatments for cancer?
Are you aware that neoplastic growths are constantly popping up and being checked by your immune system, and that "cancer" illness only happens when these growths are not controlled by the immune system?
Are you aware that radiation works by destroying cells which are dividing rapidly through DNA damage?
Given these facts, is it really surprising that a specific amount of radiation (but not more) could lead to reduced cancer mortality relative to un-irradiated individuals?
Yeah the title here really overstates the strength of the conclusions.
Didn't read every table in the paper, but they clearly state that a linear fit was parsimonious with the data, and that adding quadratic and exponential terms only modestly improved the fit; neither even reached a p-value below 0.05 (I am aware that overreliance on p-values is a bit of an issue, but nonetheless).
It's also easy to see from the graph in the paper that the linear fit goes through all the error bars.
* Edit / disclaimer: only skim-read the article; I should be working right now.
I understood LNT is already considered too pessimistic at low doses. Would this mean it is in fact optimistic, and low dose radiation is actually quite bad?
I think the main takeaway is that we probably should not put too much stock in a linear, time-invariant, cumulative model mapping radiation dose to cancer risk. Sure, it's the easiest to calculate, but that's the streetlamp search fallacy.
It's far more likely that the shape of the radiation dose curve matters quite a bit for how the body responds, e.g. the difference between ten discrete 1 Gy events, one single 10 Gy event, and 10 Gy absorbed over a year.
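As a sketch of what "the shape of the dose curve matters" would mean in model terms, here is a toy comparison of a cumulative-dose-only (LNT-style) risk with a hypothetical dose-rate-sensitive one; the saturation behaviour below is invented purely for illustration and is not taken from the paper:

```python
# Purely illustrative toy models; the saturation rule is an assumption, not data.
def lnt_risk(doses_gy, beta=0.5):
    """LNT-style: risk depends only on the total cumulative dose."""
    return beta * sum(doses_gy)

def rate_sensitive_risk(doses_gy, beta=0.5, repair_capacity_gy=1.0):
    """Hypothetical: each discrete exposure counts less once it exceeds an
    assumed per-event repair capacity."""
    return beta * sum(min(d, repair_capacity_gy) + 0.5 * max(d - repair_capacity_gy, 0.0)
                      for d in doses_gy)

ten_small = [1.0] * 10   # ten separate 1 Gy exposures
one_big = [10.0]         # a single 10 Gy exposure

print(lnt_risk(ten_small), lnt_risk(one_big))                        # 5.0 5.0 (identical)
print(rate_sensitive_risk(ten_small), rate_sensitive_risk(one_big))  # 5.0 2.75 (they differ)
```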
This is something I don't understand about sunburn: if I'm exposed to 15 straight minutes of sunlight, I will burn, but in 2-5 minute chunks spread out over several hours, I can tolerate much more.
Is there an equation for this? Like a small coefficient with a high power? 0.1*Te^4 ?
Or am I wrong about how sunburn works?
This one just shows time as a straight factor, no exponent:
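For reference, the standard erythemal dose bookkeeping really is linear in exposure time (this is the general definition, not necessarily whatever the linked formula uses):

```latex
% Erythemal (sunburn-weighted) dose is linear in exposure time:
H_{\mathrm{er}} = E_{\mathrm{er}} \cdot t
% E_er: erythemally weighted irradiance (W/m^2); t: exposure time (s).
% Any extra tolerance from spreading exposure over hours presumably comes from
% biology (repair between exposures), not from the dose bookkeeping itself.
```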
I'm guessing it might have something to do with the "impulse curve" of the radiation. Atom bomb would look like a spike with exponential decay. This study looks at constant low-grade exposure. It might be that a single main radiation event (if it doesn't acutely kill or sicken you too bad) causes an inflammatory response and increased mutation detection, while constant low-grade provides many more opportunities for errors to slip through the cracks.
This is one study. There have been endless studies showing the negative effects on people near the Trinity test site. In fact, the government still pays out money to victims. This is a nonsense study that was never reproduced. Moreover, Japan provided extraordinary health care to the victims who survived the bombings.
I'm as pro-nuclear energy as you can get, but trying to hand wave away the dangers of radiation exposure is actively detrimental to the cause.
Those things are not in disagreement with each other. It could just be that the bomb was a selection event which selected people who have the best ability to repair DNA, which protects from cancer-causing mutations in general.
Plus, the specific amount of radiation that is protective vs. cancer would probably only exist at a specific distance from the detonation site.
So, even if the study is 100% accurate and reproducible, in no way is it an argument for dropping more bombs.
You can't throw away data because you think someone could theoretically use it to support a bad idea.
> ProPublic reports that researchers have estimated that widespread use of backscatter x-rays will cause between 6 and 100 cases of cancer each year among fliers.
> The TSA maintains the devices are within federal safety limits.
I've always "opted out" of backscatter machines. In about half the cases airport staff were exasperated with me. In Australia they interrogated me for "the specific reason" I requested a manual pat-down instead. I made one up.
But my real reason was that I fly a lot and I don't need to be blasted with millimeter wave x-rays for no good reason.
I've only ever seen mmWave full body scanners, TSA began phasing out backscatter X-rays in 2012 [1] and according to this [2] backscatter X-ray is no longer used in US airports. Maybe different in other countries, I don't know.
mmWave is radio frequency, not X-ray, and similar frequencies are used in 5G.
I was careful to ensure it was horizontal (y = 308 px, thanks GIMP). It is apparent that cancer rates at 60, 120, 180 mGy were higher than at 240 mGy. This may raise some questions about how to interpret the data.
I think that's the point: the paper is saying "a linear model was a good fit" when P=0.11. That's not at all a strong fit compared with, e.g., physics, where p=1e-5 is desired. You'd think that with 300k data points the results would at least be "statistically significant", p<0.05. The only data they present are averages over countries, so I wouldn't be surprised if there were a huge effect of which site the worker was at, and their analysis deliberately ignores this effect by averaging instead of including it as an explicit variable that might explain most of the variation.
The R coefficient is how you assess fit to a model. P is for significance; you might use it to assess which model is better (typically using an interaction term).
That 0.11 shows that the alternative model is not significantly better than the model presented.
That might be confusing because typically everyone just thinks p<0.05 is good.
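For what it's worth, the P values quoted from the paper are just the likelihood ratio statistics referred to a chi-squared distribution with one degree of freedom, which is easy to check:

```python
from scipy.stats import chi2

# Likelihood ratio test statistics quoted from the paper, each with df=1.
tests = [
    (2.51, "quadratic term, full dose range"),
    (3.17, "linear-exponential term, full dose range"),
    (0.54, "quadratic term, 0-200 mGy range"),
]
for lrt, label in tests:
    print(f"{label}: LRT={lrt}, P={chi2.sf(lrt, df=1):.2f}")
# Reproduces the paper's P=0.11, P=0.08, and P=0.46.
```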
I'm extremely suspicious of x-ray machines at the dentist and the doctor's office. I refuse any x-ray that isn't absolutely vital. Medical professionals do them as a box-checking exercise to prevent malpractice lawsuits, but they are largely useless. They do catch issues, but if you are young and healthy, the radiation risk is definitely there.
Another reason to be more injury avoidant than most young people usually are! I have lots of friends who sometimes have to go to the doctor for a sprain or hairline fracture or something, often from sports.
It just doesn't seem worth it to do stuff like volleyball or any high impact sport.
Fractures are very rare in volleyball. High impact sports such as weightlifting are critical for bone health and building skeletal muscle. Lack of bone density and insufficient muscle are certainly more dangerous than an occasional x-ray.
Lots of people seem to make it to old age still almost as healthy and active as an average young person, sans any broken bones, without ever lifting anything heavy or playing any sports; they just seem to do casual swimming or hiking or tai chi or something.
And there are gym rats who live with all kinds of random aches and pains despite being very strong, and then have heart attacks, so I'm super confused about what you're actually supposed to do, but it definitely seems like there's a limit to how much people really need for close-to-optimal health.
The science seems pretty clear that having muscle is better for your health, but it seems really hard to figure out how much is actually needed, and whether someone who's already reasonably in shape from walking and doing projects actually needs any lifting outside the "near zero injury chance and you probably won't be sore tomorrow" range to prevent most health issues.
There is certainly a genetic component to bone health, but on average most people aren't able to build up enough bone mineral density and skeletal muscle without lifting heavy weights. You're dreaming if you believe you can get in shape by walking and doing some casual projects (heavy manual labor is another story). Beyond a certain age it becomes essentially impossible to build those up, and then you start gradually declining year by year. Do you want to spend the last years of your life crippled because you fell down the stairs and broke a hip?
In order to figure out how much muscle is needed, decide what you want to be able to do when you're old and then work backwards from there.
Cancer mortality after low dose exposure to ionising radiation in workers in France, the United Kingdom, and the United States (INWORKS): cohort study
It's 150 characters, so it had to be edited down somehow to fit the 80-character limit. This was not a good edit.
I would likely have edited it down to this:
International study: Cancer mortality after low dose work radiation exposure
Beyond what others have said here, I found the current title -- Low dose radiation cancer 2x worse than predicted by LNT model -- extremely confusing because I thought it was talking about low dose radiation cancer treatments being ineffective.
I've been fascinated by the fact that electricians have a 2x greater risk of brain cancer than the general public. There are probably tons of environmental effects we're discounting / not modelling.
One story we were told in high school physics was that at one point there was a worry about power lines being linked with leukaemia cases. And yes, there was a statistical correlation.
Turned out to be the weedkiller they were using to keep down the plants at the base of the pylons.
Lots of professions have significantly greater brain cancer risk than the general public like roofers, sheet metal workers, most people that deal with plastic manufacturing or cleaning solvents, and… waitresses [1].
From my experience everyone uses it (perhaps the emergency setting is more intense).
But my point about low-level radiation is that you are not behind a lead wall. The area below the knees, the arms, and the head are exposed. Depending on the design, the seams between sections of the protective gear will also leak. And the protective gear gets tested, but only periodically, not before every use.
All in all, everything seems to point to some probability of prolonged low-level radiation exposure, much like for nuclear power plant staff.
I really dislike the way various "skeptic" communities tend to rally around "X thing is safe/maybe actually good". Like global warming is another example.
Science already tends to be very conservative and builds in a lot of bias towards the null hypothesis. And then amateur skeptics take that and add even further bias towards null/maybe-actually-good. Maybe global warming will make the plants grow better (no), maybe it won't do anything at all (no)!
These very faint studies around things like radiation hormesis get blown up into loud dissent that "maybe radiation is actually good?" But god forbid you ask them to wear a mask to prevent the spread of a deadly disease; then they want to see long-term studies in triplicate with vanishingly small p-values.
The hilarious thing is that hormesis, taken seriously, would make one think the risk curve has a negative second derivative, as this study suggests it does. So a hormesis fan should have been suggesting LNT was underestimating the danger of low dose radiation.
No it wouldn't. The hormesis hypothesis proposes that the body's repair mechanism from a low dose of radiation yields benefits that outweigh the damage of the radiation. Eventually those internal repair mechanisms are overwhelmed, leading to a positive second derivative.
There is no specific evidence for hormesis (i.e. evidence lacking good alternative explanations) nor are there reasonable proposed mechanisms by which this would happen. But we should be clear about what it says.
No, what the hormesis idea suggests is that radiation induces an increase in repair mechanisms. So, add radiation, you boost defense mechanisms, and more radiation then has less effect. The curve is bent downward, the second derivative is negative.
All of the graphs in the literature suggest a positive second derivative. Why do you say negative? AIUI the model is that hormesis kicks in even for extremely small amounts of radiation, but has a limited effect and is overwhelmed by the negative effects of larger amounts.
Radiation hormesis isn't a fringe theory, though [1]. Now, it may not be correct; I can't speak to that. I do know that many reputable scientists consider it a possibility. The body produces useful adaptations to pretty much every other stressor in sufficiently low doses, so why would radiation be different, especially given that life evolved in its presence?
Edit: I wouldn’t be shocked though if the typical background dose is close to the maximum hormetic exposure and all but the smallest additional dosage starts the slide down the curve.
So what? I'm not really seeing the relation of masks to radiation; those are two orthogonal beliefs, and bringing them up together is something of an ad hominem against a strawman.
> But god forbid you ask them to wear a mask to prevent the spread of a deadly disease, they want to see long-term studies in triplicate with massive p-factors.
That's a strange example to use to try to back up your argument, especially considering the data and observations we now have available to us.
It's scientifically established at this point that those who questioned masking were absolutely correct: masking did not have a significant preventative, nor even mitigative, impact on case counts.
What might be among some of the largest and most extensive scientific experiments ever performed confirm this.
For example, tens of millions of people across Canada were forced to mask for well over a year in most regions, and two straight years in Toronto (the fourth-most populous city in North America).
Yet, despite masking being universal in public for an extended duration, there were multiple significant increases and decreases in case counts over time in such regions.
Had masking been effective at preventing, or even just mitigating, the spread, then those observed case count fluctuations would not have happened to begin with.
Even scientific observation as basic as watching masked individuals outdoors in colder autumn and winter temperatures showed why masking was ineffective: notable and visible clouds of water vapour would be expelled with each breath the masked individuals took, despite the masks supposedly limiting or preventing that from happening.
For most of the widely-used masks, and even respirators as commonly worn, the vapour would often be concentrated as it exited the mask and was directed through gaps where it contacted the wearer's face.
In busier urban settings, this would effectively expose those in the vicinity to a concentrated dose of another individual's breath. Masks that didn't prevent egress of such vapour also didn't prevent ingress.
The same effect was happening indoors, and during warmer conditions, but just not as easily observable as during colder temperatures.
So, it's scientifically established at this point that there is no epidemiological nor physical basis to support masking.
Maybe there's a better example you could have found to support your argument.
My parents have owned homes and grown many food plants in cold coastal climates near water (let's say Oregon) and in hot valley climates (the CA Central Valley) where there is definitely less water.
Things grow much better in the warmer area, but you have to water like crazy.
I personally do wonder if we couldn't grow food where the water is, if only those areas weren't as cold as they currently are. It obviously makes you wonder whether the rainfall in those areas would stay the same.
Cold northern climates are cold because they get less sunlight, which translates into lower temperatures. Simply increasing the temperature wouldn't provide more sunlight in the spring and fall, which means the growing seasons don't actually expand nearly as much as you might think.
Also, we can grow stuff as far north as Canada, while the equator runs through South America. People used to Mercator projections get a wildly incorrect view of what Earth's surface looks like, but in reality the loss of farmland wildly outweighs any possible gains.
Yeah, it's not just available sunlight, though that is a significant factor.
Another factor in plant growth is that chemical reactions happen "faster" at higher temperatures. While temperature doesn't have as extreme a direct effect on plants as it does on microbes, the effect is still absolutely there.
I love how technologists/programmers/enthusiasts will vehemently argue that EMFs, iPhones, AirPods, and all of the other countless devices constantly bombarding us with "non-ionizing radiation" are "completely and utterly harmless." Meanwhile, even the most ancient patriarchal doctors with obsolete worldviews will suggest aspiring fathers may want to avoid having their phones in their pockets 24/7.
No one knows jack shit, as we're finally discovering with PFAS, forever chemicals, etc.
> will suggest aspiring fathers may want to avoid having their phones in their pockets
That's because of the heat. We know for sure that higher temperatures temporarily lower sperm production. Those "ancient patriarchal doctors with obsolete worldviews" actually know a few things.