WRF Official Repository – Weather Research and Forecasting (github.com/wrf-model)
54 points by lemaudit on March 3, 2019 | 16 comments



I'm curious.

I believe nationwide weather forecasting by computers has been done since the 1960s. Since today's personal workstations are far more powerful than a supercomputer built in the 1990s, and the old models and numerical methods (not the state-of-the-art ones) are well known (and have multiple open-source implementations), the calculation should be possible today on an ordinary workstation for educational purposes.

But exactly what kind of data do I need to run my own weather forecast on my computer? Is it possible to calculate my personal 12-hour weather report based on public data published by NOAA, etc? Is there a tutorial for setting up a weather forecast?


If you want to set up and use the WRF model linked in the submission to generate your own forecasts, you can follow this tutorial: http://www2.mmm.ucar.edu/wrf/OnLineTutorial/index.php

For all but the most extreme configurations, WRF will run on a modern 2-4 core Linux desktop. It will be fairly slow, but it will run.

You can use the raw data from the official model runs published by NOAA as initial and boundary conditions for your own runs: https://nomads.ncep.noaa.gov . One of the coolest and most underappreciated things about NOAA is that they publish everything online, for free, for everyone.
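
For example, grabbing a single GFS output file in Python looks roughly like this. The directory layout below is how NOMADS was organized around the time of writing, and it does change, so treat the URL pattern as an assumption and browse the site to confirm the current paths:

    # Sketch: download one GFS GRIB2 file from NOMADS to use as
    # initial/boundary conditions. The path layout is an assumption --
    # verify against https://nomads.ncep.noaa.gov before relying on it.
    import urllib.request

    date, cycle, fhour = "20190303", "00", "000"   # run date, cycle, forecast hour
    fname = f"gfs.t{cycle}z.pgrb2.0p25.f{fhour}"   # 0.25-degree output file
    url = (f"https://nomads.ncep.noaa.gov/pub/data/nccf/com/gfs/prod/"
           f"gfs.{date}/{cycle}/{fname}")
    urllib.request.urlretrieve(url, fname)         # files run to hundreds of MB
    print("saved", fname)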

Or you can just get the raw data from the official runs at the link above and do your own extraction, and maybe some post-processing, if you like.
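
If you go that route, pygrib is one common way to pull fields out of the GRIB2 files (cfgrib/xarray also works); the variable name here is just an example of what's typically in a GFS file:

    # Sketch: extract 2 m temperature from a downloaded GFS GRIB2 file.
    import pygrib

    grbs = pygrib.open("gfs.t00z.pgrb2.0p25.f000")     # file name will vary
    grb = grbs.select(name="2 metre temperature")[0]   # pick one field
    temps = grb.values                                 # 2-D array, in Kelvin
    lats, lons = grb.latlons()                         # matching coordinate grids
    print(grb)                                         # one-line metadata summary
    print("temperature range (K):", temps.min(), temps.max())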

(Source: PhD in meteorology. Finally made an account when I saw this posted)


A corollary question: would faster hardware help NOAA & friends? Do they already have the best available? Or are the models unable to take advantage of extra computational power?


(Disclaimer: I don't work in operational modeling)

NOAA just upgraded last year to a system that hits 8.4 petaflops, which is about the same as the European system, and the Japanese and UK systems are fairly similar.

More computing power would absolutely help the NOAA models (primarily the GFS): it would allow higher resolution, a better data assimilation method for generating the initial conditions, and more ensemble members. The GFS lags the European model in all of these areas. It would especially help increase the number of ensemble members, since that is an embarrassingly parallel problem. However, NOAA also needs more researchers and funding. For example, model configuration changes not only need to be developed but also tested, to ensure there aren't unexpected regressions in forecast skill. And the Europeans have put a lot of research and development into their data assimilation method; it's one of the reasons they tend to outperform other models.
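
To illustrate the "embarrassingly parallel" point: each ensemble member is an independent model run from slightly perturbed initial conditions, so members map one-to-one onto separate processes or nodes with no communication between them. A toy sketch (run_member is a stand-in, not a real NWP core):

    # Toy ensemble: N independent "model runs" from perturbed initial
    # conditions, executed concurrently. No inter-member communication
    # is needed, which is why extra hardware helps so directly here.
    from multiprocessing import Pool
    import random

    def run_member(seed):
        rng = random.Random(seed)
        temp = 15.0 + rng.gauss(0, 0.5)    # perturbed initial state
        for _ in range(12):                # fake 12-hour integration
            temp += rng.gauss(0.1, 0.2)    # stand-in for model physics
        return temp

    if __name__ == "__main__":
        with Pool() as pool:               # members run concurrently
            members = pool.map(run_member, range(20))
        mean = sum(members) / len(members)
        print(f"ensemble mean {mean:.1f} C, spread {max(members) - min(members):.1f} C")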


The New York Times has this 2016 article on how the European model is more advanced in many respects:

https://www.nytimes.com/2016/10/23/magazine/why-isnt-the-us-...

And also this blog post by Cliff Mass:

https://cliffmass.blogspot.com/2016/10/us-operational-numeri...


More computational power is definitely better, but that's not the whole story. For example, the initial conditions of the model are really important.


University of Washington's config, for reference: forecasts are computed on a 16-node (116 Intel Xeon 2.3 GHz processors) Linux PC Beowulf cluster.

https://a.atmos.washington.edu/~ens/uwme_info.html


Disclaimer: not a meteorologist.

> But what kind of data do I need to run my own weather forecast on my computer?

I believe it's a combination of plain old measurements (wind direction and speed, temperature, humidity, etc.), radar, and a lot of historical precedent. I don't know how much of that is openly available, but I think it'd be easier to hook into the API of your national weather service's website and ask it what your weather will be for the next 12 hours.
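
For the US, the NWS API is the obvious one to hook into. A sketch (the endpoint structure and JSON field names are from memory, so check the docs at https://www.weather.gov/documentation/services-web-api):

    # Sketch: ask api.weather.gov for the next forecast periods at a point.
    import json
    import urllib.request

    def fetch(url):
        req = urllib.request.Request(url, headers={"User-Agent": "hn-demo"})
        with urllib.request.urlopen(req) as resp:
            return json.load(resp)

    lat, lon = 40.7128, -74.0060                          # New York City
    point = fetch(f"https://api.weather.gov/points/{lat},{lon}")
    forecast = fetch(point["properties"]["forecast"])     # gridpoint forecast URL
    for period in forecast["properties"]["periods"][:2]:  # roughly the next 12 h
        print(period["name"], "-", period["shortForecast"],
              f'({period["temperature"]} {period["temperatureUnit"]})')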


Most of the data comes from satellite observation. The US also has a very high density of radar coverage. Sounding balloons and fixed weather stations are the traditional inputs. There is no explicit historical data, though arguably it is present in the choice of particular schemes (weather-speak for: we tweaked the magic numbers in the model until it gave good forecasts).

You can easily run WRF with the NOAA GFS as initial conditions. Computationally it could run on an iPhone; practically, it will work on your desktop Linux box.

http://www2.mmm.ucar.edu/wrf/users/supports/tutorial.html

They have a containerized version too, which will save a lot of fiddling with compilers and libraries.

https://github.com/NCAR/container-wrf/
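
The overall workflow the tutorial walks you through is a fixed sequence of executables; driven from Python it looks roughly like this (assumes WRF and WPS are already built, or you're inside the container, the GFS GRIB files are linked, and the namelists are edited; the directories are placeholders):

    # Sketch: the standard WPS -> WRF sequence for a GFS-driven run.
    import subprocess

    steps = [
        ("WPS", "./geogrid.exe"),    # define the model domain, static fields
        ("WPS", "./ungrib.exe"),     # unpack the downloaded GFS GRIB files
        ("WPS", "./metgrid.exe"),    # interpolate GFS data onto the domain
        ("WRF/run", "./real.exe"),   # build initial & boundary condition files
        ("WRF/run", "./wrf.exe"),    # the forecast integration itself
    ]
    for workdir, exe in steps:
        print("running", exe, "in", workdir)
        subprocess.run(exe, cwd=workdir, check=True)  # abort on first failure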


It may be possible, with a lot of work, to get the model up and running, but to get any kind of accuracy out of it, real-time data assimilation of observations is key, and that is possibly even more complex. So I'll agree with the comment that it may be easier to just hook into a national weather service API. Of course, if you just want to learn about getting these models up and running, it's worth trying. (I work at a national met service developing these models.)


Of course it won't be reliable at all, especially compared to the national weather service. I meant doing it for educational purposes, for fun, not for profit; I'm here to ask about the types of data needed, their availability, and so on, for running such a forecast. (I don't necessarily mean this specific model; I think it would be impressive to replicate, with real data, how numerical weather forecasting was done in the 1970s on today's computers.)
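
To be concrete, the classic starting point (1950s-era rather than 1970s, but still the simplest demo) is the single-level barotropic vorticity equation. A toy NumPy version I have in mind -- my sketch, not anyone's operational code: doubly periodic beta-plane, spectral inversion, forward Euler for brevity where real codes used leapfrog, and initialized from random noise instead of observed 500 hPa heights:

    # Toy barotropic vorticity model: d(zeta)/dt = -u . grad(zeta) - beta*v
    # on a doubly periodic beta-plane, solved spectrally with NumPy.
    # A sketch of early NWP, not a usable forecast model.
    import numpy as np

    N = 64                       # grid points per side
    L = 6.0e6                    # domain size (m), roughly continental
    dt = 600.0                   # time step (s)
    beta = 1.6e-11               # df/dy: gradient of planetary vorticity

    k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
    kx, ky = np.meshgrid(k, k)   # x along axis 1, y along axis 0
    ksq = kx**2 + ky**2
    ksq[0, 0] = 1.0              # avoid divide-by-zero when inverting

    zeta = 1e-5 * np.random.default_rng(0).standard_normal((N, N))

    def ddx(field):              # spectral x-derivative
        return np.real(np.fft.ifft2(1j * kx * np.fft.fft2(field)))

    def ddy(field):              # spectral y-derivative
        return np.real(np.fft.ifft2(1j * ky * np.fft.fft2(field)))

    for _ in range(72):          # 72 * 600 s = a 12-hour "forecast"
        zh = np.fft.fft2(zeta)
        zh[0, 0] = 0.0           # remove the mean mode
        psi = np.real(np.fft.ifft2(-zh / ksq))   # invert zeta = laplacian(psi)
        u, v = -ddy(psi), ddx(psi)               # winds from the streamfunction
        zeta = zeta - dt * (u * ddx(zeta) + v * (ddy(zeta) + beta))

    print("max |zeta| after 12 h:", np.abs(zeta).max())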


Check out the UEMS; it makes running your own weather model a lot easier:

http://strc.comet.ucar.edu/index.html


WRF is a state-of-the-art atmospheric modeling system designed for both meteorological research and numerical weather prediction. It offers a host of options for atmospheric processes and can run on a variety of computing platforms. WRF excels in a broad range of applications across scales ranging from tens of meters to thousands of kilometers, including the following:

– Meteorological studies

– Real-time NWP

– Idealized simulations

– Data assimilation

– Earth system model coupling

– Model training and educational support


It doesn't seem to be a deep learning / neural network model like I was half expecting ;)


Some other comments mention that European models are more advanced in some respects. Are any of those open source?


No, the European model (ECMWF IFS) is not open source, and they don't even release most of the operational forecast data they produce to the general public. As far as I know, the UK and others don't either.




