I grew a big-assed tornado on a supercomputer. How fucking cool is that? Can't do that on an Apple ][.
1. Two factor authentication, ALWAYS
2. Stop using email for anything sensitive that you don't want read by your worst enemy. Use some P2P encrypted chat program or something. One would think Americans, at least, could see the value in something other than damned email for sensitive communication.
All it will take is one sustained power outage, and that's all she wrote.
The amount of power consumption required to keep things that cold must be enormous. What a waste.
Bullshit. Did you even look at the article?
Show me your trend. Please.
If you are looking at the high end tornado figure, there appears to be a weak downward trend over several decades, but 2011 just rang and asked if El Reno, Joplin and Tuscaloosa wanted to come out and play.
You may turn CP off but you're sure as hell still parameterizing (microphysics!! And just how many choices do you have for those fun knobs!). And 4km is still pretty damned coarse for thunderstorms. But yeah there is clearly more to surface wind intensity than eyewall replacement, that's just an example of something that is thought to play a role and that is definitely handled better with finer meshes (tropical is not my expertise).
We're doing much better, but we've still got a lot of work to do to get hurricanes right - some of it is with the models, some of it (more, in my opinion) has to do with what we feed the models. Hoping GOES R will be a big help. Pray for a launch that doesn't blow up on the launchpad.
Good riddance to cumulus parameterization; when that day comes in everything but climate models (let's not get too carried away), I will dance with glee on reams of printouts of CP code. Microphysics is a big enough kludge, but I don't think we're going to be following raindrops and snowflakes around in our models for a long, long time.
The path of a hurricane is somewhat unpredictable (been known to turn 90 degrees for no apparent reason).
We've gotten much better at predicting the paths of hurricanes, which are, to a first approximation, steered by larger-scale winds. Those winds have become easier to predict over time because of improved observational data feeding the models, and because they are easier to resolve and are dictated by things that don't require lots of parameterization (like what you have with convective clouds and precipitation). What we continue to struggle with is the strength of the surface winds over time, which of course is highly desired at landfall. There are small-scale processes going on within hurricanes (involving what meteorologists call eyewall replacement cycles) that modulate surface winds and that are less understood and much more difficult to forecast correctly. Basically, the smaller it is, the harder it is to forecast over long time periods.
If I had to choose between getting one or the other right, though, I'll choose path over surface wind strength. Getting out of the way of a Category 2 vs getting out of the way of a Category 5 is pretty much the same process!
You are basically describing ensemble forecasting, which is very powerful for providing statistically meaningful forecasts where you can intelligently talk about the uncertainty of the forecast, something a single deterministic forecast cannot do.
In my research, I'm doing single deterministic forecasts to study what happens with tornadoes in supercell thunderstorms, where I am cranking up the resolution to capture flow that is otherwise unresolved. I get one version of a particular storm, which is good for studying certain aspects of storms, but not good at being able to generalize (that takes lots of simulations).
Both big deterministic simulations and ensembles have their place. Of course, today's big simulation can be at the resolution of tomorrow's ensembles! Right now, you can do lots of good science with ensembles. Operationally (weather forecasting) this is basically the new paradigm, although forecasters are slow to change from just looking at the single deterministic GFS and NAM forecasts. The ensemble approach, once we start running hundreds of forecasts at higher resolution than we do today, will transform our forecasting accuracy (and precision). However, it will be limited by the amount of good observational data we can feed the models (otherwise GIGO). This is where remote sensing comes in. GOES-R will be a big help.
It will indeed take people from atmospheric science, computer engineering, software engineering, etc. working together to best exploit exascale machines. NCSA understands this, and that's what makes it (and other similar organizations) great.
One of the biggest problems with the current large-scale HPC machines is that users (like you, but maybe not you specifically) are typically scientists/analysts who write software that does not scale well. Either there need to be better frameworks for you to work within that handle all the grunt work of efficient parallelization and message passing, or every atmospheric physicist needs to be teamed with a computer scientist and a software engineer.
Absolutely agree 100%!
You can say "one of" but you can't say "the fastest" petascale machine, my friend.
I should have added "on a college campus".
My main point is, just throwing more cores at "mostly MPI" weather models is not sustainable. We are going to need to be much smarter about how we parallelize.
Weather guys want this after NSA's done.
I'm a weather guy - running cloud model code on Blue Waters, the fastest petascale machine for research in the U.S. I don't think we've managed to get any weather code run much more than 1 PF sustained - if even that. So it's not like you can compile WRF and run it with 10 million MPI ranks and call it a day. Ensembles? Well that's another story.
Exascale machines are going to have to be a lot different from petascale machines (which aren't all that different topologically from terascale machines) in order to be useful to scientists, and in order to not require their own nuclear power plant to run. And I don't think we know what that topology will look like yet. A thousand cores per node? That should be fun; sounds like a GPU. Regardless, legacy weather code will need to be rewritten, or more likely new models will need to be written from scratch, in order to do more intelligent multithreading as opposed to the mostly-MPI approach we have today.
When asked at the Blue Waters Symposium this May to prognosticate on the future coding paradigm for exascale machines, Steven Scott (Senior VP and CTO of Cray) said we'll probably still be using MPI + OpenMP. If that's the case we're gonna have to be a hell of a lot more creative with OpenMP.
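Since I brought up MPI + OpenMP: here's the kind of loop-level threading OpenMP gives you today, on a toy 1-D diffusion stencil (the sort of kernel finite-difference weather code is full of). This is only an illustrative node-local sketch - the halo exchanges between MPI ranks that would surround it in a real model are omitted, and `diffuse_step` is a name I made up. The "be more creative" point is that flat `parallel for` pragmas like this one won't be enough by themselves at exascale.

```c
/* Node-local piece of a hybrid MPI+OpenMP model: OpenMP threads split
 * one explicit diffusion step across a shared-memory domain. The
 * pragma is ignored by compilers built without OpenMP, so this runs
 * (serially) either way. Assumes n >= 2. */
#include <stddef.h>

void diffuse_step(const double *in, double *out, size_t n, double nu) {
    #pragma omp parallel for
    for (size_t i = 1; i < n - 1; i++)
        out[i] = in[i] + nu * (in[i-1] - 2.0 * in[i] + in[i+1]);
    out[0] = in[0];        /* hold the boundaries fixed; in a real model */
    out[n-1] = in[n-1];    /* these would come from an MPI halo exchange */
}
```

In the hybrid layout you'd run one or a few MPI ranks per node and let the threads share work like this inside each rank, instead of one rank per core and message passing everywhere.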
chinook:% echo $0
Now I'm totally confused. Zero dollars equals bash. Bash what? Bash head into keyboard? ORP BASH!
It is impossible to be objective about a subjective experience.
No, it is not.
My only point is really undebatable. If, for instance, I want to listen to Wilderness Road's eponymous album, I have to listen to the vinyl, or somebody's copy of it. And it's a fucking great album and I don't need someone telling me that I'm an idiot for wanting to be able to listen to it.
Some real nutty music fans still spin 78s, can you believe it? Lot of music on those shellac platters never made it to any other format.
You cut yourself out of a surprisingly large amount of music (I'm talking about old stuff mostly) if you don't have a turntable.
I'm a self-proclaimed "audiophile" but not in the annoying, trust-my-ears-only way that plagues the hobby (I'm a scientist, dammit). I have a nice tube amp, great speakers, subwoofer, etc.... and I have a turntable as well (and a network enabled player + nice DAC). Anyhoo.... I can speak to the non-hipster side of things. Yes, some of the growth of vinyl has a faddish aspect to it. But, keep in mind, many musicphiles and audiophiles never stopped collecting and buying vinyl even through the meteoric rise of CD.
If you are a major music fan (and do not have an unlimited supply of pirated needledrops on the internet), a turntable is essential. A lot of obscure stuff was never released on CD. A lot of older material that was released on CD sounded (and continues to sound) dreadful due to the mad scramble to ride the CD wave; nth-generation tapes, some equalized for vinyl, were used as the source material. Thankfully, a lot of what is selling these days is remastered versions of old stuff from original master tapes (not copies). You can be cynical about this (say the major labels are just milking old warhorses) and you can also acknowledge that digital audio technology has improved astoundingly since the late 80s and 90s. What does this have to do with vinyl? Well, vinyl can sound really good if done well. I won't argue that it is a better medium than digital; it simply isn't. But it has its own charms.
I have bought vinyl reissues that were mastered very well, and the vinyl was quiet, lacking surface noise - but about a third of the time I get burned with either lousy mastering (sibilance and related issues - and I have a very good microline cart) or, more commonly, ticks and pops in shrinkwrapped new vinyl (even after a wet clean). This is the way it has always been and will always be with vinyl.
A primary motivation I have for buying new vinyl releases of new music is to acquire recordings that haven't been as dynamically squashed in the digital mastering process. While vinyl releases can be very dynamically compressed as well, as a rule they tend to be mastered with more dynamic range than the digital version (you could argue that this is partly, or mostly, due to physical limitations of the vinyl medium). And yes, I acknowledge that most vinyl is either digitally sourced or goes through an ADA conversion.
But mostly I continue to buy vinyl because it's fun - it's part of a hobby I enjoy very much. Spending hours just sitting "in the sweet spot" and listening to music (from any source - digital, tape, vinyl or whatever) is something I enjoy. So while people scoff at the vinyl "revival", I'm just glad to see there are more choices out there for getting good sounding music.
Take a look, there is some neat stuff going on with Blue Waters: https://bluewaters.ncsa.illino...
Most science is not breakthroughs; it's usually slow progress, with many failed experiments.
These computers are facilitating a lot of good science, and increases like this in our computational infrastructure for research are great news. I do wonder how they are going to power this beast and what kind of hardware it will be made of. 300 PFlops is pretty unreal with today's technology.
Top Ten Things Overheard At The ANSI C Draft Committee Meetings: (6) Them bats is smart; they use radar.