I worked as a forecaster for a bit but never made it to the research world (I studied theoretical PDEs rather than computational)... at the time, though, huge gains were being made in data assimilation. One fact that has stuck with me is that roughly a third of the computation time for the UK Met Office global model run was consumed by data assimilation. I don't remember the statistics anymore, but data assimilation schemes were a big driver of improved forecast skill.
I also recall the ECMWF had surprisingly accurate long-range forecasts based on ensembles. It could predict 500 mb heights out two weeks, no sweat.
Re: your comments... My guess is that a GPU isn't suited for use in an operational model due to data access patterns (and possibly isn't even helpful for the solver). But again, I'm not a computational PDE guy. Also, machine learning might be useful, but that would be post-processing or perhaps parameterizing sub-grid phenomena. There's already a process called model output statistics (MOS) for adjusting the raw fields from a weather model.
The physics is pretty well known at this point, and there's only so much you can gain by going from a second- to a third-order approximation. The errors in the initial conditions are just larger. Most of the action has been in data assimilation and better parameterizations because of that.
I've been out of the field for ten years now, but it's really nice to see improvements to the core physics to this degree.
I'm still skeptical of your supposed two-week 500 mb heights forecast from the ECMWF model. I live near the western Pacific (i.e. the data hole), and it's really easy to find crazy model solutions after 7 days. And I'm pretty sure you weren't looking at the Southern Hemisphere.
> I'm still skeptical of your supposed two-week 500 mb heights forecast from the ECMWF model.
You're probably right to be skeptical. For the record, I was only a forecaster for a short period of time over ten years ago... I didn't even serve my full four-year commitment, as I volunteered to get out under the Air Force's "force shaping" program at the time. I was stationed near Ramstein and we created forecasts for Europe. I was referring to the ECMWF ensemble products, specifically.
I've done computational physics at the grad level. There, PDEs are converted to finite-difference form, which basically leads to giant sparse linear systems. These are solved with SOR or even more advanced numerical techniques. These techniques tend to be quite GPU friendly.
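To make that concrete, here's a minimal sketch of the pipeline I mean (Python, made-up function name, toy problem, nothing like an operational dynamical core): discretize the 1D Poisson equation -u'' = f with second-order central differences, which gives a sparse tridiagonal system, then relax it with SOR.

    import numpy as np

    def solve_poisson_sor(f, n=100, omega=1.9, tol=1e-8, max_iter=10_000):
        # Second-order central differences on [0, 1] with homogeneous
        # Dirichlet BCs give the tridiagonal system (-1, 2, -1) * u = h^2 f.
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)   # interior grid points
        b = f(x) * h**2
        u = np.zeros(n)
        for _ in range(max_iter):
            max_delta = 0.0
            for i in range(n):           # Gauss-Seidel sweep with over-relaxation
                left = u[i - 1] if i > 0 else 0.0
                right = u[i + 1] if i < n - 1 else 0.0
                u_new = (b[i] + left + right) / 2.0   # diagonal of A is 2
                delta = omega * (u_new - u[i])
                u[i] += delta
                max_delta = max(max_delta, abs(delta))
            if max_delta < tol:
                break
        return x, u

    # f(x) = pi^2 sin(pi x) has the exact solution u(x) = sin(pi x).
    x, u = solve_poisson_sor(lambda x: np.pi**2 * np.sin(np.pi * x))

The optimal over-relaxation factor depends on the grid spacing; 1.9 is just a reasonable pick for this problem size.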
Well, if you're just doing a standard finite difference method, and you have to keep shuffling your matrices between CPU and GPU because other operations don't work well on GPUs, you actually won't have any speedup.
Where GPUs shine for PDEs is if you have a lot of extra work for each node, for instance if you have complex chemical reactions or thermodynamics, or if you have a high-order method that requires lots of intermediate computations.
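A toy way to see that distinction (illustrative Python, made-up function names and numbers, not tied to any real model): a bare 5-point stencil does only a handful of flops per node while streaming a lot of memory, so it's bandwidth-bound; bolt an expensive per-node source term onto it and the flops-per-byte ratio climbs, which is the regime where offloading to a GPU can outweigh the transfer cost.

    import numpy as np

    def stencil_only(u, dx):
        # Memory-bound: one cheap 5-point Laplacian evaluation per interior node.
        lap = np.zeros_like(u)
        lap[1:-1, 1:-1] = (u[2:, 1:-1] + u[:-2, 1:-1] + u[1:-1, 2:] + u[1:-1, :-2]
                           - 4.0 * u[1:-1, 1:-1]) / dx**2
        return lap

    def stencil_plus_reaction(u, dx, n_species=20):
        # Compute-heavy: same stencil plus an (illustrative) Arrhenius-style
        # source term evaluated for many species at every node -- lots of extra
        # arithmetic, essentially no extra memory traffic.
        source = np.zeros_like(u)
        for k in range(n_species):
            source += np.exp(-(k + 1.0) / np.maximum(u, 1e-6))
        return stencil_only(u, dx) + source

    u = np.random.rand(256, 256)
    dudt_cheap = stencil_only(u, 1.0 / 256)
    dudt_heavy = stencil_plus_reaction(u, 1.0 / 256)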
If you don't believe me, you can download the PETSc code and test the ViennaCL solvers versus the regular ones.