Why do publications refuse to post links to (or at least citations of) the original scientific articles? I mean, these links are the result of five minutes of googling; which means the reporter communicated with the authors themselves but didn't bother to ask for a hyperlink or an emailed citation.
The mind boggles.
(Another 30 seconds of googling.)
I don't mind that journalists don't publish your "30 seconds of Googling". Doing so would suggest that top Google hits are fact.
The paper uses an extremely suspicious technique to test the significance of its finding. I doubt it will pass review without substantial changes.
edit: The one you posted is 2 years old, but tries to explain the same effect with a different cause.
It's immediately suspicious that the authors introduce a new type of "test" for whatever spectra are being measured. When that happens, the authors need to rigorously demonstrate its validity. That is not present in this preprint.
The thing that needs to be demonstrated is that their method generates a reasonable space of outcomes. It's not clear to me that's true for their method.
Here are my three best guesses:
1. The reporters think their readers are idiots who aren't capable of understanding anything beyond what the reporters spoonfeed them.
2. The reporters want to reduce the chances that they will be caught in whatever mistakes they make.
3. Adequate citations in the pre-WWW world looked intimidating to the uninitiated.
The signal appears seriously weak relative to the noise...
As an example, radon concentrations are known to fluctuate seasonally, as has been noted in Ref. , and it was suggested that the decay of ²²²Rn could lead to a seasonally dependent charge distribution on the experimental apparatus. However, this effect is extremely small given the low counting rates that typically arise from radon background, and in any case, the PTB data shown in Fig. 3 were corrected for background.
In the film, solar flares and sunspot activity are blamed for altering radioactivity levels in the earth's core, causing it to rise and the earth's crust to melt, leading to worldwide, CGI-friendly apocalypse.
The thing is, in 2009, when 2012 was filmed, nobody knew there was any linkage at all between sunspots and radioactivity. We thought the rate of decay was constant, and the science behind the film was bullshit. These guys hadn't published yet -- correct me if I'm wrong.
Now it's still obviously bullshit -- the change caused by sunspot activity is not nearly big enough to affect the planet's temperature materially -- but there is a linkage.
Which is, y'know, weird.
There isn't; follow-up work has already been done to disprove this, using the Cassini space probe I believe.
But I bet the ballpark of neutrinos the sun has put out and will put out stays relatively constant, within some rotational variance. The output behaving like a sine wave doesn't change things much, but if it's on an incline or decline, that has a huge impact on carbon dating.
It seems exciting that we have an observable cause and effect and we don't understand why. That seems like it could be a good treasure chest of knowledge right there!
If we can measure variance inside of 33 days, and that's extrapolated over a few million years, that could skew the numbers a bit.
*that has a huge impact on carbon dating.*
* we measure differences of 1/1000th in decay rates (this is reasonable given the differences the article is based on)
* the average variation is 1/1000th, otherwise it would have been detected earlier.
* the half-life of C14 is 5000 years (to make calculations easy)
* we are determining the age of something 20000 years old (carbon dating is only used for ages in that order of magnitude).
* the decay formula is y = x(1/2)^(t/tau), where y is the amount of C14 you will measure after t years, tau is the half-life in years, and x is the initial amount of C14 we obtained via other means.
We measure an amount of C14 that's 2^4 times smaller than current amounts and conclude that the amount of C14 was halved 4 times. If the decay rate was actually 1/1000th smaller on average, the effective half-life would be 5005 years, so we should have concluded that 2^(-4) = (1/2)^(t/5005), or t = 20020 years. In other words: the relative error is as large as the relative seasonal variation, which is quite small already and probably falls well within the regular error bars.
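The arithmetic above can be sketched in a few lines of Python. The half-life values are the toy numbers assumed above (not the real C-14 half-life), and the 1/16 fraction is the worked example's sample:

```python
import math

def age_from_fraction(fraction, half_life):
    """Invert y = x * (1/2)**(t / half_life) for t, given fraction = y/x."""
    return -half_life * math.log2(fraction)

TOY_HALF_LIFE = 5000.0        # assumed half-life from the list above
PERTURBED_HALF_LIFE = 5005.0  # decay rate off by ~1/1000

fraction = 2 ** -4            # sample retains 1/16 of its original C14

naive_age = age_from_fraction(fraction, TOY_HALF_LIFE)        # 20000 years
actual_age = age_from_fraction(fraction, PERTURBED_HALF_LIFE)  # 20020 years

print(naive_age, actual_age, actual_age - naive_age)
```

Because the age is linear in the assumed half-life, a 1/1000 error in the decay rate shifts the answer by only ~20 years out of 20000.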
*x* is the initial amount of C14 we obtained via other means.
But you could add much wider error bars to carbon dating and it still wouldn't support creationism.
Maybe something 20 million years old is just 7 million. Fine. It's not six thousand.
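To put a rough number on that: in exponential-decay dating the inferred age scales linearly with the assumed half-life, so compressing a 20-million-year result down to 6,000 years would require the half-life to be wrong by a factor of thousands, not the fraction-of-a-percent effects discussed here. A toy calculation (the half-life and measured fraction are hypothetical, chosen only to produce a 20 Myr apparent age):

```python
import math

def implied_age(measured_fraction, half_life):
    """Exponential decay: t = half_life * log2(1 / measured_fraction)."""
    return half_life * math.log2(1.0 / measured_fraction)

assumed_half_life = 1_000_000.0  # hypothetical isotope half-life, in years
fraction = 2 ** -20              # measurement implying 20 half-lives elapsed

apparent = implied_age(fraction, assumed_half_life)  # 20,000,000 years

# For the same measurement to really mean 6,000 years, the true half-life
# would have to be smaller by the ratio of the two ages:
required_true_half_life = assumed_half_life * 6_000 / apparent
error_factor = assumed_half_life / required_true_half_life

print(apparent, error_factor)  # the half-life would need to be off ~3333x
```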
Really, creationists would be better served applying their ultra-skepticism to the much larger gaps in evidence and analysis found in their own ideas. The thing about people who value science is that whenever data presents a challenge we have to account for it, unlike the idiot creationists, who only embrace data when it agrees with their preconceived ideas.
Since this effect sounds like it's a small perturbation on top of the main model (a constant decay rate), I'd expect the final answer (the calculated age) to be about the same.
That calibration would catch any possible variation.
Incidentally did you know that C-14 dating no longer works? The reason is that we've released a lot of carbon from fossil fuels that is very low in C-14. As a result fresh organic matter today has lower C-14 levels than organic matter from 100 years ago. Odd, but true. (And don't think that the Creationists haven't been all over that fact.)
C-14 dating isn't used for dating recently-dead things, but not for the reason you suggest. The real reason is that the half-life of C-14 is about 5730 years. Not much decay happens in 200 years, and the little that does happen is difficult to sort out from measurement errors.
Other techniques are used for dating things that died recently. Watch CSI for some clues :D
This is all from what I remember from a conversation I had with someone involved with the project about 10 years ago. They got a lot of funding and a lot of interest back in the late 90's when they were just starting. I thought some of their research was interesting, but there was so much propaganda mixed in that it turned my stomach, so I stopped following it.
This seems to be another case where our standard laws of physics break down at massively high temperatures and pressures.
And of course, radioactive dating is one tool among many. The distribution of soil and rock on the Earth is almost certainly the result of billions of years of erosion, earth shifts, and climate changes.
It’s quite funny actually, if we didn’t have radioactive dating it would make a lot of sense to say that evolution itself is one of the best pieces of evidence for an old earth. It is supported by so much evidence independent of any dating methods that any discovery which changes how much we believe we can trust radioactive dating is practically meaningless. (The obvious exception are results which would indicate that earth is orders of magnitude younger than radioactive dating makes it seem to be. But, no surprise, this is not that.)
It has nothing to do with chemistry and everything to do with physics. If the author gets this simple distinction wrong, what else did he get wrong?
Her Nobel Prize in Chemistry was for her discovery of the new elements radium and polonium.