
NASA releases detailed global climate change projections - RutZap
http://climate.nasa.gov/news/2293/
======
julienchastang
Several comments in this thread mention visualization. These data can be
easily visualized with the Unidata IDV [1] (the same people who make THREDDS).
For those interested, go to the IDV dashboard, "Data Choosers" tab, and enter

[http://dataserver3.nccs.nasa.gov/thredds/catalog/bypass/NEX-...](http://dataserver3.nccs.nasa.gov/thredds/catalog/bypass/NEX-GDDP/catalog.xml)

for the catalog. At that point you can browse the dataset, add the data as a
data source, subset it (server-side), and visualize it. I and others have made
a number of videos on how to use the IDV [2].

[1]
[https://www.unidata.ucar.edu/software/IDV/](https://www.unidata.ucar.edu/software/IDV/)

[2] [http://goo.gl/n0Frpb](http://goo.gl/n0Frpb)
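Incidentally, THREDDS catalogs like the one above are plain XML, so you can peek at their contents outside the IDV as well. Here's a minimal stdlib-only sketch; the inline miniature catalog and its dataset names are made up for illustration (a real catalog would be fetched from the catalog.xml URL above):

```python
# Sketch: list the datasets described by a THREDDS catalog document.
# The sample catalog below is a made-up miniature for illustration.
import xml.etree.ElementTree as ET

SAMPLE_CATALOG = """<?xml version="1.0"?>
<catalog xmlns="http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0">
  <dataset name="NEX-GDDP">
    <dataset name="tasmax_day_example_2080.ncml"
             urlPath="bypass/NEX-GDDP/tasmax_day_example_2080.ncml"/>
    <dataset name="pr_day_example_2080.ncml"
             urlPath="bypass/NEX-GDDP/pr_day_example_2080.ncml"/>
  </dataset>
</catalog>"""

THREDDS_NS = "http://www.unidata.ucar.edu/namespaces/thredds/InvCatalog/v1.0"

def list_datasets(catalog_xml):
    """Return (name, urlPath) pairs for datasets that point at actual files.

    Container datasets without a urlPath attribute are skipped.
    """
    root = ET.fromstring(catalog_xml)
    return [(d.get("name"), d.get("urlPath"))
            for d in root.iter("{%s}dataset" % THREDDS_NS)
            if d.get("urlPath")]

for name, path in list_datasets(SAMPLE_CATALOG):
    print(name, "->", path)
```

A real client (the IDV, or Unidata's siphon library) does essentially this, then turns each urlPath into an OPeNDAP or HTTP access URL.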

~~~
bbuchalter
Trying to follow along here, but having a bit of trouble.

1\. Visited
[https://www.unidata.ucar.edu/software/idv/webstart/IDV/](https://www.unidata.ucar.edu/software/idv/webstart/IDV/)

2\. Got Unidata IDV running locally

3\. I was then able to add the data source via URL and saw two options in
"Field Selector": "Image Collection" and "Omni Control".

I've tried selecting either field and clicking "Create Display", but nothing
ever appears in the "Displays" tab. Would welcome further advice or more
specific reference to directions among the 16 videos in your YouTube playlist.
Thank you!

This may be related to some of the errors[1] that seem to occur on startup of
IDV.

[1][https://gist.github.com/bbuchalter/dda50df626e7c4baf501](https://gist.github.com/bbuchalter/dda50df626e7c4baf501)

~~~
bbuchalter
I found that downloading IDV at
[https://www.unidata.ucar.edu/downloads/idv/current/index.jsp](https://www.unidata.ucar.edu/downloads/idv/current/index.jsp)
and not using Webstart resolved the startup errors I reported above and now I
see the Map View. However, I'm still not able to get any data to render on the
map. Would love help still!

~~~
julienchastang
@bbuchalter Thanks for registering and downloading. We like it when our users
register because we have to report usage metrics back to our NSF sponsors. Are
you able to load the catalog (using the .xml, not .html, suffix for the
catalog URI) in the IDV dashboard's Data Choosers tab, and see the NCML files
available at that resource?

~~~
bbuchalter
@julienchastang thanks for your reply. I don't believe I'm able to see the
NCML files. I've put some screenshots together here in sequence:
[http://imgur.com/a/4oI3q#0](http://imgur.com/a/4oI3q#0)

1\. Add the catalog.xml in the Data Choosers

2\. Select Image Collection in the Field Selector and click Create Display

3\. The error I receive when I click Create Display: "Unknown XML
root:catalog"

4\. The subsequent Image Collection screen which I'm not sure how to use.

Thank you again for continuing to engage with me on this.

~~~
julienchastang
@bbuchalter WRT the screenshot, you are in the URL node where you should
instead be in the Catalog node.

~~~
bbuchalter
That worked! Thank you!

------
tonec
The entire dataset is 12TB because it contains _daily_ global 25x25km maps for
every year from 1950 to 2099, probably for a number of different variables.

You don't need to download the entire thing to get the gist of it, although it
probably would have helped if they also delivered a digest version with, for
example, yearly or five-yearly averages.

Edit: There is also a THREDDS data catalog present, which allows you to
extract slices or subsets of the entire dataset. Other than the lack of a nice
visualization tool, I'd say it's a pretty reasonable way of making such a huge
amount of data available.
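A back-of-the-envelope check on that 12TB figure; none of the grid, variable, or ensemble numbers below come from NASA's documentation, they're round assumptions for a sanity check:

```python
# Rough size estimate for a daily, global, ~25 km gridded dataset, 1950-2099.
# Every number below is an assumption for illustration, not from NASA's docs.
lat_cells = 720              # 180 degrees of latitude at 0.25-degree steps
lon_cells = 1440             # 360 degrees of longitude at 0.25-degree steps
days = 150 * 365             # 1950-2099 inclusive, ignoring leap days
variables = 3                # e.g. daily max temp, min temp, precipitation
models = 21                  # assumed number of ensemble members/scenarios
bytes_per_value = 4          # 32-bit floats

total_bytes = (lat_cells * lon_cells * days
               * variables * models * bytes_per_value)
print("roughly %.0f TB" % (total_bytes / 1e12))
```

With these guesses the total lands around 14 TB, i.e. the same order of magnitude as the quoted 12 TB, so the headline number is plausible without any single year being unmanageable.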

~~~
fixermark
If you could boil it down for the busier among us: on a scale of "screwed" to
"really screwed", how screwed does it say we are? ;)

~~~
briandear
Really? You actually believe the hysteria?

~~~
fixermark
Define 'hysteria.'

I believe humanity will survive, but at a significant cost to lifestyle (at
least from the point of view of this American). I'll miss the places that are
beaches right now being beaches, and I'm not looking forward to the economic
disruption as food sources change.

We're an advanced enough society that in a nightmare scenario, we could pretty
much weather several generations in a cave under a tightly controlled resource
regimen. But not seven billion of us, and I don't look forward to the culling
that would happen as a result of massive climate upheaval.

------
RutZap
It would be great to see a nice visualisation tool built on top of
OpenStreetMap or something like that which allows you to see the predicted
temperatures for a given location through time. Sadly this is way over my
capabilities; also, an 11 TB dataset is not easy to work with :(

~~~
toomuchtodo
Would you be open to Kickstarting this project if the resulting repo was
public/open source on Github?

EDIT: I should be clear. I would not be writing the code. I'd do the legwork
to find someone (and pay them) who is familiar working with OSM to process the
data, create the dataset, and build a javascript frontend to visualize it. The
result would be open source. I would accept _no funds whatsoever_ for this.

I don't have the time to do the project, but that doesn't mean I can't find
someone who does, and get them paid to do it and make the results freely
available.

~~~
mrfusion
In fact, I don't think there's even a way to get historic temperature by
day/location anywhere. That would be a useful thing to add: just the ability
to download small subsets of the data.

~~~
schiffern
[http://www.wunderground.com/history/](http://www.wunderground.com/history/)

~~~
mrfusion
It looks like you can only do one day at a time though?

My ultimate goal is to make a list of cities with fewer than ten 90-degree
days per year. I still don't see a way to do that on Wunderground.

~~~
toomuchtodo
Check here: [https://www.ncdc.noaa.gov/data-access](https://www.ncdc.noaa.gov/data-access)

------
AC__
It would literally take me 546 Days 19 Hours 30 Minutes 40.26 Seconds to
download this dataset. I'm sure it reads like horror though.

~~~
anc84
~2-3 days here, but I am no climate scientist so what's the point?

------
chrismarlow9
I recall reading about an investing company making a pretty penny by using
data like this to forecast the growth of certain crops.

------
andy_ppp
If the earth is 509 million square kilometers and the granularity of the data
is 25 sq km, this means we have about 20.3 million data points. It implies
that on average each point accounts for about 540 KB. The dataset seems to be
broken down by year rather than location, though; a UX issue, I'd say...
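The arithmetic is easy to check, taking the comment's own premise of 25 sq km granularity and the ~11 TB total quoted elsewhere in the thread at face value:

```python
# Average bytes per grid point, using the figures given in the comment.
earth_area_km2 = 509e6       # surface area of the Earth
cell_area_km2 = 25           # granularity assumed in the comment
dataset_bytes = 11e12        # the ~11 TB quoted elsewhere in the thread

points = earth_area_km2 / cell_area_km2
per_point = dataset_bytes / points
print("%.1f million points, ~%.0f KB each" % (points / 1e6, per_point / 1e3))
```

So each point's share is on the order of hundreds of kilobytes, not megabytes, which is consistent with one point holding ~55,000 daily values over 150 years.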

~~~
VLM
WRT breaking down by year, you could save some download time that way.

Let's say you express the predicted error in the data not as a percentage of
temperature but as a time along the trend line. That would imply that beyond a
certain estimated error horizon, it doesn't really matter which year you
download as long as you get the decade vaguely correct, once the temperature
error figure exceeds a couple of years of travel along the trend lines.

For example, let's say that today, here in 2080 CE, it's 35C +/- 2C (to make
the math easy) and the trend line is 0.1C/yr (unrealistic, but it makes the
math easy). Then +/- 2C is the same thing as saying +/- 20 years along the
trend line, so I don't really need the 2080 data set; I can do "roughly as
well" with any data set from 2060 to 2100.

Yeah yeah, I know calculus, and the derivative is likely not constant or even
linear, and this is a really simplified way to look at statistics, but the
general plan holds as a way to cut back on the downloaded data required. Based
on REAL statistical analysis you could come up with a formula that says you'll
only widen the error bars 10% at year X if you skip every Y years of data,
where I'm guessing Y might exceed a decade at the extreme future years of the
run. That could save an enormous amount of bandwidth without significantly
impacting a visualization or analysis.

"Someone else" has to download it all, run the stat analysis, then tell us
all, obviously it doesn't scale for all of us to download it all, run our own
stat analysis, then go back in time and only download every Y-th year at or
beyond year X. So thanks in advance, "Someone else"!
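The conversion above can be written down directly. This is just a sketch of the toy example: the ±2C error, the 0.1C/yr trend, and the function name are illustrative, not properties of the real dataset:

```python
def equivalent_year_window(year, temp_error_c, trend_c_per_year):
    """Convert a temperature uncertainty into the window of years whose
    trend-line values fall inside that uncertainty."""
    delta_years = temp_error_c / trend_c_per_year
    return (year - delta_years, year + delta_years)

# The toy numbers from the comment: 2080 CE, +/- 2C, 0.1C/yr trend.
lo, hi = equivalent_year_window(2080, temp_error_c=2.0, trend_c_per_year=0.1)
print("any year from %d to %d is 'roughly as good'" % (lo, hi))
```

With those numbers the window is 2060-2100, i.e. a single download could stand in for four decades of the run, which is the bandwidth saving being proposed.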

~~~
godspiral
It's frustrating that they are not publishing the data in such chunks. It's
good that they make every year available, but for analysis/reporting you could
reasonably care only about, say, the years 2030, 2050, and 2100.

One year of data would mean 120GB files or so (assuming 100 years total).

Breaking it down by latitude would mean 800MB latitude-year data sets.
Breaking it down by longitude would be very helpful too, and there's not much
reason not to make that breakdown in addition to the latitude one, which would
mean 5MB data chunks.

It's almost as though they are being intentionally uncooperative by dumping
12TB.

Hopefully, some group makes a web interface that allows downloads in these
manageable chunk sizes.

------
jldungan
For some helpful visualizations of these data, check out
[http://climateinternational.org](http://climateinternational.org)

------
WalterGR
This article is about how the projections were created and the fact that they
are available. Is there an article describing what the projections actually
project?

~~~
RutZap
I don't think anybody has managed to download the whole set so far.

~~~
WalterGR
NASA has it already. Do they not want to be in the business of interpreting
climate data? Or is it too soon even for them?

~~~
caseysoftware
These are projections and the results of simulations - not fact - so they're
already interpreting and/or claiming validity of the models.

~~~
WalterGR
You are technically correct.

------
swalsh
Business idea for anyone with some time on their hands: set up a site and
sell hard drives with this data already on them.

------
happyscrappy
NASA has released detailed global climate change projections. If you would
like to know what these projections are please drink from this data firehose.

~~~
sunwooz
I'm sure someone here is already making a visualization with the dataset.

~~~
baldfat
At 12 TB, I think just getting the files would take days of downloading.

I have a 25 Mb/s connection, which works out to 1041:40:00 (hh:mm:ss),
assuming a perfect connection with no errors or drops. 1041 hours is 43+ days.

[http://www.numion.com/calculators/time.html](http://www.numion.com/calculators/time.html)
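The calculation is simple enough to redo by hand; 12 TB and a sustained 25 Mb/s line are the assumptions here, and the exact hour count depends on unit conventions (decimal vs binary, 12 vs 11 TB), which is presumably why the linked calculator's 1041-hour figure differs slightly:

```python
# Time to download 12 TB over a sustained 25 megabit/s link.
dataset_bits = 12e12 * 8     # 12 TB expressed in bits
link_bps = 25e6              # 25 megabits per second
seconds = dataset_bits / link_bps
hours = seconds / 3600
print("%.0f hours, or %.1f days" % (hours, hours / 24))
```

Either way it comes out around six weeks of uninterrupted downloading, which is the point being made.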

~~~
a3n
Would it go significantly quicker if you downloaded it to a VPS instead of
your home?

~~~
baldfat
That would help, but the cost of storage for 11 TB plus the time spent
downloading would make it extremely expensive. I haven't seen an option that
works for less than around $200 to rent and set up.

~~~
a3n
It seems like if these large public datasets continue to come on line, there
will need to be some sort of semi-cooperative distributed data store to make
them truly "accessible." Or the data provider will need to provide an
access/query API, rather than just a big tank that you can copy if you dare.

~~~
baldfat
BitTorrent might actually help as a tool to distribute these large data sets.

The big issue is that so few people would actually need to download these
large datasets.

~~~
a3n
But possibly many more people could use access to the data while they use the
apps that were made possible by the data.

------
duncan_bayne
Let's see if these fare any better than the projections used by the IPCC ...

[http://esr.ibiblio.org/?p=5297](http://esr.ibiblio.org/?p=5297)

"The most recent climate model simulations used in the AR5 indicate that the
warming stagnation since 1998 is no longer consistent with model projections
even at the 2% confidence level."

Kudos to them for publishing these, though. If they _are_ rotten, it ought to
be easy to tell.

~~~
scarmig
What would be very interesting is for ESR to publish his own predictions, as
he feigns expertise.

Or, hell, any global warming denialist to publish predictions at all, or build
any model at all beyond their day jobs of quote mining and writing clever PR
releases that prey on the public's scientific illiteracy and fossil fuel
barons' desire to violate other people's lives and properties with untrammeled
carbon pollution.

~~~
IanDrake
If one's argument is that we don't know enough about what really drives our
climate to make these types of predictions, why would one then turn around and
make a prediction?

Also, using the phrase "global warming denialist" distracts from your point.
When some scientists thought neutrinos traveled faster than light, they
weren't called "light speed denialists". Have some respect for alternate
scientific opinions.

The problem with GW predictions is that most of them are wrong and the ones
that are correct aren't correct for the right reasons.

~~~
FD3SA
As an ardent defender of science, I tend to agree. There's always been a
nagging intuition I've had about climate change that to this day I haven't
resolved.

It comes down to the evolutionary history of Earth throughout life's 4 billion
year tenure on this planet. The climate has gone through EXTREME changes
[1][2], including the Chicxulub impact, which radically impacted the
atmosphere's composition and life on this planet. Yet, the biosphere has
always adapted and life continued.

Assuming that mankind is destroying the biosphere at an unprecedented rate (we
are), and burning hydrocarbons at an even faster rate (we are), are these two
factors enough to turn the earth into a Mars-style wasteland, completely
devoid of all life?

It just doesn't seem plausible to me on an intuitive level. We can use linear
regression models of the atmosphere and show temperatures rising until life
becomes impossible, but that is not how complex biospheres work.

In any case, climate science is definitely a worthy scientific discipline to
study, but it is becoming akin to economics in the prediction department. As
in, the principles are sound, but the predictions of the experts are laughably
wrong on a regular basis.

Unfortunately, this is all due to Al Gore and his "moment of genius" which
instantly politicized the scientific process, thereby destroying any potential
for real scientific progress through the feedback loop of empirical testing
and model formation. If you disagreed with Al Gore's version of climate
change, Al Gore's version of climate models, and Al Gore's version of climate
science, you suddenly became a heretic "climate change denier".

Alas, we humans really are terrible at science.

1\.
[http://www.biocab.org/Geological_Timescale.jpg](http://www.biocab.org/Geological_Timescale.jpg)

2\.
[http://earthguide.ucsd.edu/virtualmuseum/images/CO2History.h...](http://earthguide.ucsd.edu/virtualmuseum/images/CO2History.html)

~~~
hga
It happened earlier than that, or at least according to a science-journalist
friend who interviewed MIT Professor Richard Lindzen in 1990, plus or minus a
year. He quoted Lindzen as saying that he had thought scientists were
interested in the truth....

I'm old enough to remember when the Scientific Consensus was that we were
causing global cooling. Amusingly enough, one of those guys became one of the
most notorious global warming "scientists".

~~~
scarmig
_I 'm old enough to remember when the Scientific Consensus was that we were
causing global cooling._

At this point, that should be worth at most an eye roll. But it bears
reiterating: it never happened. Not true. And repeating it is dishonest.

~~~
hga
I guess it was just my imagination when I was coming of age starting in the
late '60s....

Try again.

------
peter303
Will it rain next weekend?

~~~
IvyMike
I dunno. But I will bet you that August 2021 in Chicago will be warmer on
average than February 2022.

~~~
joeclark77
I predict it will also be more than 10% longer.

