Weather guys want this after NSA's done.
I'm a weather guy - running cloud model code on Blue Waters, the fastest petascale machine for research in the U.S. I don't think we've managed to get any weather code to run at much more than 1 PF sustained - if even that. So it's not like you can compile WRF, run it with 10 million MPI ranks, and call it a day. Ensembles? Well, that's another story.
Exascale machines are going to have to be a lot different from petascale machines (which aren't all that different topologically from terascale machines) in order to be useful to scientists and in order to not require their own nuclear power plant to run. And I don't think we know what that topology will look like yet. A thousand cores per node? That should be fun; sounds like a GPU. Regardless, legacy weather code will need to be rewritten - or, more likely, new models will need to be written from scratch - to do more intelligent multithreading, as opposed to the mostly-MPI approach we have today.
When asked at the Blue Waters Symposium this May to prognosticate on the future coding paradigm for exascale machines, Steven Scott (Senior VP and CTO of Cray) said we'll probably still be using MPI + OpenMP. If that's the case we're gonna have to be a hell of a lot more creative with OpenMP.
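For anyone who hasn't seen what that hybrid looks like, here's a toy sketch of the MPI + OpenMP pattern: MPI decomposes the domain across ranks and swaps halo cells, while OpenMP threads split the work on each rank's slab. The 1-D "relaxation" update, array names, and sizes are all made up for illustration - real model physics does vastly more per grid point, and being "creative with OpenMP" mostly means keeping threads busy across far more than a single loop.

```c
#include <mpi.h>
#include <omp.h>
#include <stdlib.h>

#define NX 512  /* local grid points per rank, illustrative size */

int main(int argc, char **argv)
{
    int provided, rank, nranks;
    MPI_Init_thread(&argc, &argv, MPI_THREAD_FUNNELED, &provided);
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &nranks);

    double *t_old = malloc((NX + 2) * sizeof(double)); /* +2 halo cells */
    double *t_new = malloc((NX + 2) * sizeof(double));
    for (int i = 0; i < NX + 2; i++) t_old[i] = (double)rank;

    for (int step = 0; step < 100; step++) {
        /* MPI part: exchange halo cells with neighboring ranks. */
        int left  = (rank > 0)          ? rank - 1 : MPI_PROC_NULL;
        int right = (rank < nranks - 1) ? rank + 1 : MPI_PROC_NULL;
        MPI_Sendrecv(&t_old[1],      1, MPI_DOUBLE, left,  0,
                     &t_old[NX + 1], 1, MPI_DOUBLE, right, 0,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        MPI_Sendrecv(&t_old[NX],     1, MPI_DOUBLE, right, 1,
                     &t_old[0],      1, MPI_DOUBLE, left,  1,
                     MPI_COMM_WORLD, MPI_STATUS_IGNORE);

        /* OpenMP part: threads share the rank's slab of grid points. */
        #pragma omp parallel for
        for (int i = 1; i <= NX; i++)
            t_new[i] = 0.5 * (t_old[i - 1] + t_old[i + 1]);

        double *tmp = t_old; t_old = t_new; t_new = tmp;
    }

    free(t_old); free(t_new);
    MPI_Finalize();
    return 0;
}
```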
I'm not a weather guy, but my understanding is that a somewhat fixed weather model (a set of calculations) is used to do a kind of finite-element analysis over small areas of the atmosphere. With better computing and better radars, smaller and smaller areas can be resolved, which results in more accuracy.
With more computing power, could you not vary the parameters or constants used in the weather model, then run the analysis over the entire weather area again? You could be running hundreds or thousands of slightly different weather models, then apply some processing to figure out which outcome is most likely - either by averaging together the 50% most similar outcomes, or by some other method. I don't think you could max out a supercomputer with that approach if you kept adding more parameter variations, although you may reach the point where adding more variations doesn't improve accuracy.
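To make that concrete, here's a toy sketch of the idea, where "the 50% most similar outcomes" is read as the half of the ensemble members closest to the ensemble mean. The one-line run_member() function, the perturbed "drag coefficient," and all the numbers are placeholders - in reality each member would be a full model run, and each member is independent, which is why ensembles scale out so easily.

```c
#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define MEMBERS 1000

/* Stand-in for a full forecast run: one perturbed parameter in,
 * one scalar "outcome" out.  Entirely illustrative. */
static double run_member(double drag_coeff)
{
    return 25.0 + 40.0 * drag_coeff;   /* e.g. forecast rainfall in mm */
}

static int cmp_double(const void *a, const void *b)
{
    double d = *(const double *)a - *(const double *)b;
    return (d > 0) - (d < 0);
}

int main(void)
{
    double outcome[MEMBERS], dist[MEMBERS], sorted[MEMBERS];
    double mean = 0.0;

    /* Each member gets a slightly different value of the uncertain parameter. */
    for (int m = 0; m < MEMBERS; m++) {
        double perturbed = 0.5 + 0.1 * ((double)rand() / RAND_MAX - 0.5);
        outcome[m] = run_member(perturbed);
        mean += outcome[m] / MEMBERS;
    }

    /* Distance of each member from the ensemble mean. */
    for (int m = 0; m < MEMBERS; m++)
        dist[m] = sorted[m] = fabs(outcome[m] - mean);

    /* Keep only the 50% of members closest to the mean and average them. */
    qsort(sorted, MEMBERS, sizeof(double), cmp_double);
    double cutoff = sorted[MEMBERS / 2];

    double consensus = 0.0;
    int kept = 0;
    for (int m = 0; m < MEMBERS; m++)
        if (dist[m] <= cutoff) { consensus += outcome[m]; kept++; }
    consensus /= kept;

    printf("consensus forecast from %d of %d members: %.1f\n",
           kept, MEMBERS, consensus);
    return 0;
}
```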
Maybe that's an incorrect understanding, but we're getting closer to the point where we can calculate all possible outcomes simultaneously. I wouldn't have expected this to be the case with weather but computing has come a long way in the last 20 years.