How can researchers say that the rate of change is so much greater now when, even before accounting for propagation of error, the granularity of pre-nineteenth-century measurements is, at best, of the same order of magnitude as the entire era of modern precision measurements?
EDIT: I realize my question is inconvenient, but downvoting without a meaningful response is no way to counter skepticism. This topic gets more religious as time goes on.
Ice core data is correlated with solar activity cycles:
> Those, in turn, must be put into a date range based on some form of (I assume) radiometric dating.
Google can tell you precisely how it is done. Government institutions such as NASA and NOAA, as well as universities (and their equivalents in other countries), generally have reliable information, and it comes straight from the source, since most publishing scientists work there.
EDIT: granting more benefit of the doubt on the orders-of-magnitude difference than before. For example, taking a yearly average corresponds to a sample rate of about 32 nanohertz, while a dating resolution of ±80 years corresponds to about 0.4 nanohertz: a factor of 80, or roughly two orders of magnitude.
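The frequency arithmetic above can be checked with a quick sketch (assuming a Julian year of 365.25 days; the variable names are just for illustration):

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, ~3.156e7 s

# One sample per year, expressed as a frequency in hertz.
f_yearly = 1.0 / SECONDS_PER_YEAR        # ~3.17e-8 Hz, i.e. ~32 nHz

# One sample per 80 years (the ±80-year dating resolution).
f_80yr = 1.0 / (80 * SECONDS_PER_YEAR)   # ~4.0e-10 Hz, i.e. ~0.4 nHz

ratio = f_yearly / f_80yr                # exactly 80
orders = math.log10(ratio)               # ~1.9 orders of magnitude
print(f"yearly: {f_yearly:.2e} Hz, 80-year: {f_80yr:.2e} Hz, "
      f"ratio {ratio:.0f} ({orders:.1f} orders of magnitude)")
```

Both rates land in the nanohertz range, so the gap between the two sampling resolutions is closer to two orders of magnitude than three.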