I believe nationwide weather forecasting by computer has been done since the 1960s. Since today's personal workstations are far more powerful than a supercomputer built in the 1990s, and the older models and numerical methods (not the state-of-the-art ones) are well known (and have multiple open-source implementations), the calculation should be possible today on an ordinary workstation for educational purposes.
But exactly what kind of data do I need to run my own weather forecast on my computer? Is it possible to calculate my personal 12-hour weather report based on public data published by NOAA, etc? Is there a tutorial for setting up a weather forecast?
For all but the most extreme configurations, WRF will run on a modern 2-4 core Linux desktop. It will be fairly slow, but it will run.
You can use the raw data from the official model runs published by NOAA as initial and boundary conditions for your own model runs: https://nomads.ncep.noaa.gov . One of the coolest and most underappreciated things about NOAA is that they publish everything online, free, for everyone.
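As a concrete example, NOMADS exposes a grib-filter CGI that subsets GFS GRIB2 files server-side, so you only download the variables and levels you need. Here is a minimal sketch of building such a request URL in Python; the form-field names (`var_TMP`, `lev_2_m_above_ground`, `dir`, `file`) follow the `filter_gfs_0p25.pl` web form, and you should verify them against the live form before relying on this:

```python
from urllib.parse import urlencode

def gfs_filter_url(date, cycle, fhour, variables, levels):
    """Build a NOMADS grib-filter URL that subsets a GFS 0.25-degree
    GRIB2 file to the requested variables and levels.
    date:  'YYYYMMDD' string; cycle: 0, 6, 12, or 18 UTC; fhour: forecast hour.
    (Field names mirror the filter_gfs_0p25.pl web form -- check them
    against the form itself, as they may change.)"""
    params = {
        "dir": f"/gfs.{date}/{cycle:02d}/atmos",
        "file": f"gfs.t{cycle:02d}z.pgrb2.0p25.f{fhour:03d}",
    }
    for v in variables:
        params[f"var_{v}"] = "on"      # e.g. TMP, UGRD, VGRD
    for lev in levels:
        params[f"lev_{lev}"] = "on"    # e.g. 2_m_above_ground
    return ("https://nomads.ncep.noaa.gov/cgi-bin/filter_gfs_0p25.pl?"
            + urlencode(params))

url = gfs_filter_url("20240101", 0, 6,
                     ["TMP", "UGRD", "VGRD"],
                     ["2_m_above_ground", "10_m_above_ground"])
```

The resulting URL can be fetched with `curl` or `urllib.request` to get a small GRIB2 file suitable for WRF/WPS initial conditions.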
Or you can just get the raw data from the official runs from the link above and do your own extraction, and maybe post processing if you like.
(Source: PhD in meteorology. Finally made an account when I saw this posted)
A follow-up question: would faster hardware help NOAA and friends? Do they already have the best available? Or are the models unable to take advantage of extra computational power?
(Disclaimer: I don't work in operational modeling)
NOAA just upgraded last year to a system that hits 8.4 petaflops, which is about the same as the European system, and the Japanese and UK systems are fairly similar.
More computing power would absolutely help the NOAA models (primarily the GFS): it could increase resolution, improve the data assimilation method that generates the initial conditions, and increase the number of ensemble members run. The GFS lags the European model in all of these areas. It would especially help with ensemble size, since running more members is an embarrassingly parallel problem. However, NOAA also needs more researchers and funding. For example, model configuration changes not only need to be developed but also tested to ensure there are no unexpected regressions in forecast skill. And the Europeans, for instance, have put a lot of research and development into their data assimilation method, which is one of the reasons they tend to outperform other models.
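To illustrate the "embarrassingly parallel" point: each ensemble member is just the same model run from a slightly perturbed initial state, with no communication between members, so they map trivially onto separate cores. A toy sketch using the Lorenz (1963) system, the classic chaotic stand-in for atmospheric dynamics (my own illustration, not any operational code):

```python
import random

def lorenz63_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One forward-Euler step of the Lorenz (1963) system.
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def run_member(x0, y0, z0, steps=1000):
    # Integrate one ensemble member; depends only on its own initial state,
    # so members could be farmed out to separate processes with no coupling.
    x, y, z = x0, y0, z0
    for _ in range(steps):
        x, y, z = lorenz63_step(x, y, z)
    return x, y, z

random.seed(0)
# 20 members, each with a tiny perturbation to the initial x.
members = [run_member(1.0 + random.gauss(0.0, 1e-3), 1.0, 1.0)
           for _ in range(20)]
xs = [m[0] for m in members]
spread = max(xs) - min(xs)   # tiny perturbations grow into large spread
```

The final ensemble spread is what forecasters use to estimate forecast uncertainty; doubling the core count roughly doubles how many members you can afford.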
> But what kind of data do I need to run my own weather forecast on my computer?
I believe it's a combination of plain old measurements (wind direction and speed, temperature, humidity, etc.), radar, and a lot of historical precedent. I don't know how much of that is openly available, but I think it would be easier to hook into the API of your national weather service's website and ask it what your weather will be for the next 12 hours.
Most of the data comes from satellite observation. The US also has very dense radar coverage. Sounding balloons and fixed weather stations are the traditional inputs. There is no explicit historical data, though arguably it is present in the choice of particular schemes (weather-speak for: we tweaked these magic numbers in the model until it gave good forecasts).
You can easily run WRF with the NOAA GFS as initial conditions. Computationally it would run on an iPhone, but practically it will work on your desktop Linux box.
It may be possible, with a lot of work, to get the model up and running, but to get any kind of accuracy out of it, real-time data assimilation of observations is key, and that is possibly even more complex. So I'll agree with the comment that it may be easier to just hook into a national weather service API. Of course, if you just want to learn about getting these models up and running, then it's worth trying. (I work at a national met service developing these models.)
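To give a flavor of what data assimilation does, here is a minimal scalar sketch of the textbook optimal-interpolation / Kalman analysis step (a simplification for illustration, not any operational scheme): the model's background forecast and an observation are blended, each weighted by the inverse of its error variance.

```python
def analysis(background, obs, var_b, var_o):
    """One scalar analysis step: blend a model background value with an
    observation, trusting whichever has the smaller error variance more.
    Returns the analysis value and its (reduced) error variance."""
    gain = var_b / (var_b + var_o)          # Kalman gain in [0, 1]
    x_a = background + gain * (obs - background)
    var_a = (1.0 - gain) * var_b            # analysis is more certain
    return x_a, var_a

# Background forecast: 20.0 C with variance 4.0; a station observes
# 22.0 C with variance 1.0. The analysis is pulled most of the way
# toward the more trusted observation.
x_a, var_a = analysis(20.0, 22.0, 4.0, 1.0)   # -> (21.6, 0.8)
```

Operational systems do this jointly for millions of observations and billions of model grid-point values (3D-Var/4D-Var, ensemble Kalman filters), which is where the real complexity lives.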
Of course it won't be reliable at all, especially compared to the national weather service. I meant doing it for educational purposes, for fun, not for profit, so I'm asking about the types of data needed and their availability for running such a forecast (not necessarily with this specific model; I think it would be impressive to replicate how numerical weather forecasting was done in the 1970s on today's computers with real data).
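For a taste of the 1970s-style numerics, here is a toy sketch of the kind of finite-difference building block early NWP models were made of (my own illustration, not a historical code): first-order upwind differencing of 1-D linear advection on a periodic domain, with the time step chosen to satisfy the CFL stability condition.

```python
import math

# 1-D linear advection u_t + c * u_x = 0, solved with the first-order
# upwind scheme -- a minimal example of the explicit finite-difference
# methods used in early numerical weather prediction.
N = 100                       # grid points on a periodic unit domain
dx = 1.0 / N
c = 1.0                       # constant "wind" speed
dt = 0.5 * dx / c             # CFL number 0.5: stable for upwind

# Initial condition: one sine wave across the domain.
u = [math.sin(2.0 * math.pi * i * dx) for i in range(N)]

for _ in range(200):          # integrate to t = 1, one full revolution
    # u[i-1] with i == 0 wraps to u[-1]: periodic boundary for free.
    u = [u[i] - c * dt / dx * (u[i] - u[i - 1]) for i in range(N)]
```

The upwind scheme is stable but diffusive: after one revolution the wave returns to roughly its starting position with visibly reduced amplitude, which is exactly the kind of trade-off (stability vs. accuracy) that drove the development of better schemes.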
WRF is a state-of-the-art atmospheric modeling system designed for both meteorological research and numerical weather prediction. It offers a host of options for atmospheric processes and can run on a variety of computing platforms. WRF excels in a broad range of applications across scales ranging from tens of meters to thousands of kilometers.
No, the European model (ECMWF IFS) is not open source, and they don't even release most of the operational forecast data that they produce to the general public. As far as I know the UK and others don't either.