
Comment Re:Benchmarks are bad metrics (Score 1) 258

As an aside - we just bought a couple of OCZ RevoDrive 3 X2 (1TB each) cards and have been using and benchmarking them over the last couple of days for scientific data analysis... DAMN they are fast! We're getting about 1.2GB/s (yes, Bytes with a big B!) sequential reads (which happens to be our main use case).
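If you want to sanity-check numbers like that yourself, here's a rough sketch of a sequential-read measurement in Python (the file path and 64MB size are made up for illustration; a real benchmark needs a file much larger than RAM, or O_DIRECT, so the OS page cache doesn't inflate the result):

```python
import os
import time

def seq_read_gbps(path, block_size=1 << 20):
    """Time one sequential pass over a file in 1MB blocks; return GB/s."""
    size = os.path.getsize(path)
    start = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(block_size):
            pass
    return size / (time.perf_counter() - start) / 1e9

# Illustrative 64MB test file -- a serious run should use a file larger
# than RAM so you measure the drive, not the page cache.
path = "/tmp/seqread_test.bin"
with open(path, "wb") as f:
    f.write(os.urandom(64 * 1024 * 1024))
print(f"{seq_read_gbps(path):.2f} GB/s")
os.remove(path)
```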

The only downside we've found is spotty driver support on Linux (which, along with OS X, is all we use... no Windows here). We actually have to use the commercial drivers for the Vertex and ZDXL... which are precompiled only for specific Linux kernel versions. Other than that the cards have worked great!

A bit back on topic - if this "turbo" mode were something any app could invoke (with an API call, for instance) then this wouldn't be a problem... but since they've made it work only with explicitly named executables it feels a bit underhanded...

Comment Re: Storage. (Score 1) 232

The only drives in my work machine are 3x 512GB SSDs in a RAID0 array. This is to deal with datasets in the 300GB range that my code outputs as it runs on supercomputers (10,000+ cores).

When you're trying to make an animation that needs to read all 300GB of a file like that serially, SSDs are a godsend.

Just last week I purchased a new workstation for tens of thousands of dollars (I don't want to put the exact amount on here). It contains a 1TB "Revo" PCIe card (extremely fast SSD chips that plug into PCIe), 512GB of RAM, an Nvidia Quadro K6000, and a K5000... all to accelerate this same workload...

Just because you can't think of workloads that would be useful with solid state drives doesn't mean they don't exist!

Comment Re: Storage. (Score 1) 232

I actually agree with this. I do large compiles all day long, and when I switched to a 3x SSD RAID0 array I didn't see any improvement in compile time (though it did speed up everything else I do with large data loads). This is on a 12-core Mac Pro... so there's plenty of horsepower to keep the disks busy during a compile.

In order to speed up compiling I just set up a 150+ core distcc array using 13 Mac Pros... THAT sped up my compiling by an order of magnitude!

Moral of the story: compiling is CPU-bound.

Comment Simulation Visualization (Score 1) 41

I write massively parallel scientific simulation software for a living (the kind that runs on the biggest machines in the world)... and trying to come up with a way to display GBs or TBs of information from some of our largest simulations can be _tough_.

We use several open source packages (mostly http://www.paraview.org/paraview/resources/software.php and https://wci.llnl.gov/codes/visit/ ), but most of our best visualizations are actually done using a commercial package ( http://www.ceisoftware.com/ ).

For some examples check out the YouTube video here: http://www.youtube.com/watch?v=V-2VfET8SNw

(That's me talking in the video). Those aren't necessarily our best visualizations - just some that happen to be on YouTube...

We find that the reactions to these simulations are mixed. They are certainly eye-pleasing... but if you go too far in making it look good it can actually turn scientists off. They start to think that it looks "too good to be true" (I literally had a senior scientist in a room of 200 stand up at the end of one of my presentations and proclaim, "This is too good to be true!"). Because of this we try to do just enough visualization that you can see all of the features of the simulation and understand what's happening, without going overboard.

You have to realize that a lot of scientists still remember the days when they created line plots _by hand_ for publications! I suspect that as young scientists come up through the ranks this feeling that "slick graphics = not real" will go away.

At least, I hope....

Comment Re:'Simple really... (Score 1) 775

Google Glass actually _helps_ here. If you were wearing one, you could show them exactly where you were at that time...

This is essentially "counter" surveillance that can prove all sorts of stuff about your innocence.

Have you seen all of the videos from Russian car dashcams? Do you know why they have those? To _protect_ themselves from the police (and other drivers).

The principle is the same here...

Comment Re:A pellet stress simulation? (Score 1) 84

Pellets, as manufactured, are _very_ smooth. This is a decent overview I just found from Google: http://www.world-nuclear.org/info/Nuclear-Fuel-Cycle/Conversion-Enrichment-and-Fabrication/Fuel-Fabrication/#.UVmkjas5yZc

They start life as powder and then are packed in a way that makes them smooth.

However, just as in any kind of manufacturing: defects happen. A working reactor will have over a million pellets in it. Somewhere in there one is going to be misshapen.

Some of what we can do is run a ton of statistically guided calculations to understand what kind of safety and design margins need to be in place to keep problems from occurring. We can also look at modifying the design of the pellets to ensure safer operation. Both of these things are very difficult (and costly) to do experimentally.

My lab (INL) does a lot of experimental fuel work... but we use these detailed simulations to guide the experiments so we can use our money more wisely. It literally takes years to develop a new fuel form, manufacture it, cook it in an experimental reactor, let it cool down, slice it open and see what happened. Using these detailed simulations we can do a lot of that "virtually" to help them decide on experimental parameters so that at the end of that whole sequence they have a bunch of _very_ good experimental results instead of half of them just being failures...

Also, we do actually have a bunch of detailed experimental results to compare our simulations to. Even with this fidelity of modeling we are still not able to perfectly capture what happens in all of those experiments. Even more detailed models (like the multiscale one in the video) need to be developed to be able to truly predict all the complex phenomena that go on in nuclear fuel.

There is still a LOT more work to do...

Comment Re:A pellet stress simulation? (Score 1) 84

Thanks!

Certainly the nuclear reactor industry has done "just fine" without these detailed calculations for the last 60 years. Where "just fine" is: "We've seen stuff fail over the years, learned from it, and kept tweaking our designs and margins to take it into account". They have used simplified models to get an idea of the behavior, and it has worked for them (insofar as the reactors run safely and reliably).

However, the "margins" are the name of the game here. If you can do more detailed calculations that take into account more physics and geometry you can reduce the margins and provide a platform for creating the next reactor that is both more economical and safer. If you can increase the operating efficiency of a nuclear reactor by even 1% that is millions of dollars. If you can keep something like Fukushima from happening that is even more money (some would say "priceless").

The approximate answers (using simplified models) are good - they are in the ballpark. But if you compare their output to experimental output (which we have a LOT of... and it is VERY detailed) the simplified models get the trends right... but miss a lot of the outlier data. That outlier data is important... that's where failure happens. With these detailed models we get _much_ closer to the experimental data.

To get even closer to the experimental data we have to get even more detailed. The movie showed some of our early work in multi-scale simulation, where we were doing coupled microstructure simulations along with the engineering-scale simulation. That work is necessary to get the material response correct and get even closer to the experimental data.

Ultimately, if we can do numerical experiments that we have a great amount of faith in, it will allow us to better retrofit existing reactors to make them more economical and safe and design the next set of reactors.

Comment Re:And it didn't need to be (Score 2) 84

I know I shouldn't respond to ACs, but I'm going to anyway:

> And it didn't need to be.

As far as geometry goes, it did need to be that detailed. Firstly, the pellets are round, and to get the power and heat transfer correct you have to get the geometry correct. Also, pellets have small features on them (dishes on top and chamfers around the edges) that are put there on purpose and make a big difference in the overall response of the system (the dishes, in particular, greatly reduce the axial expansion). So the detailed geometry is a very important part of this simulation. But that's not the only reason why it's large.

> Your simulating a simple heat transfer and simple expansion, NOTHING MORE, no different that any other chemical process simulation in any other factory. Just with a lot more nodes.

I already explained how that is not the case. These are fully coupled, fully implicit multiphysics calculations. It is _not_ just heat conduction going on. Very complicated processes are all involved: fission gas creation, migration, and release; fission-induced and thermal creep; and fission product swelling. Plus the heat conduction, solid mechanics, thermal contact, mechanical contact, the fluid flow model (on the outside of the pin), and conjugate heat transfer. All of these processes feed and are impacted by each other. These are NOT simple calculations.
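To make "fully coupled, fully implicit" concrete: at every timestep all of the unknowns are advanced together by solving one nonlinear system with Newton's method, rather than updating each physics separately. Here's a toy Python sketch of that structure (the two-field "physics" and all coefficients are invented for illustration; the real codes do exactly this, but for millions of finite-element unknowns):

```python
import numpy as np

def backward_euler_newton(x0, dt, nsteps, f, jac, tol=1e-10, maxit=25):
    """Advance dx/dt = f(x) with backward Euler, solving the fully
    coupled nonlinear system at each timestep with Newton's method."""
    x = np.array(x0, dtype=float)
    for _ in range(nsteps):
        x_old = x.copy()
        for _ in range(maxit):
            # Residual of the implicit step: R(x) = x - x_old - dt*f(x)
            residual = x - x_old - dt * f(x)
            if np.linalg.norm(residual) < tol:
                break
            J = np.eye(len(x)) - dt * jac(x)  # Jacobian of the residual
            x -= np.linalg.solve(J, residual)
    return x

# Made-up two-field system: a temperature that cools, and a swelling
# term driven by temperature -- the off-diagonal Jacobian entry is the
# coupling, which is what "fully coupled" means.
a, b, c = 0.5, 0.2, 0.1
f = lambda x: np.array([-a * x[0], b * x[0] - c * x[1]])
jac = lambda x: np.array([[-a, 0.0], [b, -c]])

state = backward_euler_newton([1.0, 0.0], dt=1.0, nsteps=10, f=f, jac=jac)
print(state)
```

Because every unknown appears in the residual of every equation, you can take the big implicit timesteps needed to march through years of operation.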

> It's also an arbitrary simulation serving no purpose. You said "what is that panel is broken right there' then ran a simulation with a stupid number of nodes to soak up a computer. But the pellet was made, it exists, it didn't need your simulation to be made and the simulation make zippo difference. You can run any number of similar simulations with the damage in an infinite number of places or combination of places, and it makes zip difference to the world because you don't know where each pellet is damaged. So NONE of your simulations apply to the actual pellet.

Actually, you are very wrong. Firstly, the Missing Pellet Surface problem is a huge problem in industry. What we can do with simulation is explore the boundaries of how much tolerance there can be for such missing surfaces. We can vary the missing surface size and run thousands of calculations to determine the sizes that operators need to worry about. They can then adjust their QA practices to take this information into account. We can also run simulations of full reactors, stochastically sprinkle in defective pellets, and show the overall response of the system, which can help in understanding how to bring a reactor back up to full power safely after refueling.
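The shape of that kind of statistical sweep looks something like this (everything in the sketch is invented for the example: the stress surrogate, the defect-size distribution, and the allowable limit; in a real study each sample point is a full multiphysics run on the cluster):

```python
import random

def peak_stress_ratio(defect_fraction):
    """Toy surrogate: stress concentration grows with the fraction of
    the pellet surface that is missing. Purely illustrative."""
    return 1.0 + 4.0 * defect_fraction**0.5

random.seed(42)
allowable = 2.0  # hypothetical limit on the allowed stress ratio

# Sample thousands of hypothetical missing-surface sizes and count how
# many push the pellet past the allowable limit.
samples = [random.uniform(0.0, 0.2) for _ in range(10_000)]
exceed = sum(1 for s in samples if peak_stress_ratio(s) > allowable)
print(f"{exceed / len(samples):.1%} of sampled defects exceed the limit")
```

The output of a sweep like this is exactly the kind of "worry about defects bigger than X" guidance that feeds back into QA tolerances.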

As for "that pellet exists"... firstly, that's not true... but even if it did exist, doing experiments with nuclear fuel is _very_ costly and takes years (that is something else we do at INL). In order to better target our experimental money, we do simulation to guide the experiments.

> Their mission statement is absolutely clear. Turn cold war spending into security theatre spending and that's your job.

I don't work in security... there are many national labs, all with different missions, but they _all_ do non-security work. They all work with US industry to solve some of the toughest problems on the planet. They are all full of extremely smart people, and they are all working to add to the competitive advantage of the US. I'm sorry that you feel that way, but if you are interested in learning more about the national labs you should get a hold of me.

Comment Re:A pellet stress simulation? (Score 5, Informative) 84

I don't get it... are you looking for a Funny mod? You linked to a 2D heat transfer simulation done in Matlab. Did you even watch the video?

The second simulation (of a full nuclear fuel rod in 3D) had nearly 300 million degrees of freedom, and the output alone was nearly 400GB to postprocess. It involves around 15 fully coupled, nonlinear PDEs all being solved simultaneously and fully implicitly (to model multiple years of a complex process you have to be able to take big timesteps) on ~12,000 processors.

Matlab isn't even close.

Comment Re:throw away mentality (actual arcticle link) (Score 4, Interesting) 84

It costs a _lot_ to keep these computers running (read: Millions, with a really big M). The power bill alone is an enormous amount of money.

It literally gets to the point where it is cheaper to tear one down and build a new one that is better in flops/watt than to keep the current one running.

Comment Re:Top supercomputer is Google? (Score 5, Informative) 84

I've worked for the DOE for quite a few years now writing software for these supercomputers... and I can guarantee you that we use the hell out of them. There is usually quite a wait just to run a job on them.

They are used for national security, energy, environment, biology and a lot more.

If you want to see some of what we do with them see this video (it's me talking):

http://www.youtube.com/watch?v=V-2VfET8SNw
