Comment "divergent package manager paradigms" (Score 1) 142

Oh great, I just heard about this potentially useful new tool, and it's already been forked into competing factions!

As of Emacs 24, package management is integrated, but once again there are divergent package manager paradigms (EL-Get & ELPA), and a number of repositories exist for these. None are pre-configured.

Comment Re:I would start looking at the algorithms (Score 3, Informative) 161

With the known chaotic nature of storm systems it wouldn't surprise me if the "butterfly effect" of the rounding errors when converting from C to F would be enough to displace a storm by hundreds of miles!

Absolutely not the case. First, all non-trivial computational fluid dynamics codes (e.g., those used for weather prediction) solve non-dimensionalized governing equations. You're not solving in Celsius vs. Fahrenheit (you'd never use either anyway, but absolute kelvin vs. Rankine), or meters vs. feet, but in non-dimensional numbers, which are converted back to dimensional values only after all the heavy computation is done.
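
To make that concrete, here's a minimal sketch (in Python, with made-up reference scales; none of this is from any real solver) of what non-dimensionalization looks like: the solver only ever sees T*, x*, u*, and units re-enter only at the end.

```python
import numpy as np

# Illustrative reference scales (hypothetical values, not from any real code).
T_ref = 300.0   # reference temperature [K]
L_ref = 1.0e5   # reference length [m]
U_ref = 50.0    # reference velocity [m/s]

def nondimensionalize(T_K, x_m, u_ms):
    """Map dimensional fields to the non-dimensional variables the solver sees."""
    return T_K / T_ref, x_m / L_ref, u_ms / U_ref

def redimensionalize(T_star, x_star, u_star):
    """Convert back to dimensional values only after the heavy computation."""
    return T_star * T_ref, x_star * L_ref, u_star * U_ref

# The solver never sees kelvin vs. Rankine: 300 K, or its Rankine equivalent
# 540 R with T_ref expressed in Rankine, both give the same T* = 1.0.
T_star, _, _ = nondimensionalize(np.array([300.0]), np.array([0.0]), np.array([0.0]))
print(T_star)  # [1.0]
```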

Second, even if one were to use dimensional values in solving the equations, the round-off errors from converting between C and F are many, many orders of magnitude smaller than the errors introduced by discretizing the original continuous system of equations.
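
A quick back-of-the-envelope demo of the magnitudes involved; the 1e-3 discretization figure below is just an illustrative order of magnitude, not a measured value:

```python
import numpy as np

T_c = np.float64(23.7)                     # some temperature in Celsius
T = T_c
for _ in range(1000):                      # convert back and forth repeatedly
    T = T * 9.0 / 5.0 + 32.0               # C -> F
    T = (T - 32.0) * 5.0 / 9.0             # F -> C

roundoff = abs(T - T_c) / abs(T_c)         # accumulated relative round-off error
print(f"round-off after 1000 round trips: {roundoff:.1e}")  # ~1e-13 at worst
print("typical relative discretization error (illustrative): 1e-3")
```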

Lots of comments here regarding metric vs. imperial units; I assure you that accuracy discrepancies between the European and American predictions have absolutely nothing whatsoever to do with any choice of unit system.

Source: I'm a CFD researcher =)

Comment Re:Breaking laws (Score 4, Insightful) 218

Honestly, if you are just going to China to break their laws, why not stay at home? If you still want to go, then don't break immigration and other laws in the country you are visiting. It's not only illegal but deeply distasteful towards the host country: they are welcoming you as a visitor, and yet you are just going to break their laws.

“One has a moral responsibility to disobey unjust laws.”
  -- Martin Luther King Jr.

Comment Re:Topology matters more than GFLOPS (Score 1) 59

Yes, many problems can be expressed as dense linear algebra, so measuring and comparing LINPACK performance makes sense for those problems. However, many problems don't map well to dense linear algebra.

Sure, but as far as I've seen, linear algebra problems dominate the runtime of these very large systems. That's what I use them for.

At least the first six on that dwarfs list are run daily on top500 machines. I write parallel spectral methods, and use structured and unstructured grids. Achieving high scaling for these on massively parallel machines is not at all what I would call an open problem (as far as correctly using a given network for a problem, or designing a network for a given problem, goes). For any given network topology, breaking these problems into parallel chunks that suit that topology is just a graph partitioning problem on a trivially sized graph. That little white paper strikes me as somewhere between "obvious to anyone who's done high performance computing in the last 40 years", "putting a cute name on old hat", and just plain silly.
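
To illustrate the framing, here's a toy sketch: partition a structured grid into rectangular blocks and count the cut edges each rank must communicate across. A real code would hand this to a partitioner such as METIS; the hand-rolled split below (grid and rank counts are arbitrary) is just to show how small the graph problem is.

```python
import numpy as np

def block_partition(nx, ny, px, py):
    """Assign each cell of an nx-by-ny grid to one of px*py rectangular blocks."""
    owner = np.empty((nx, ny), dtype=int)
    for i in range(nx):
        for j in range(ny):
            owner[i, j] = (i * px // nx) * py + (j * py // ny)
    return owner

def cut_edges(owner):
    """Count grid edges whose endpoints sit on different ranks (communication volume)."""
    cuts  = np.count_nonzero(owner[1:, :] != owner[:-1, :])
    cuts += np.count_nonzero(owner[:, 1:] != owner[:, :-1])
    return cuts

owner = block_partition(512, 512, 4, 4)   # 16 "ranks", hypothetical sizes
print(cut_edges(owner))                   # surface-to-volume cost of this mapping
```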

Comment Re:Topology matters more than GFLOPS (Score 3, Informative) 59

I'll drill into an example. If you're doing a problem that can be spatially decomposed (fluid dynamics, molecular dynamics, etc.), you can map regions of space to different processors. You then run your simulation by having all the processors advance for X time period (on your simulated timescale). At the end of the period, each processor sends its results to its neighbors, and possibly to "far" neighbors if the forces exceed some threshold. In the worst case, every processor has to send a message to every other processor. Then you run the simulation for the next time chunk.

Depending on your data set, you may spend *FAR* more time sending intermediate results between all the different processors than you do actually running the simulation. That's what I mean by matching the physical topology to the computational topology. In a system where communication cost dominates computation cost, adding more processors usually doesn't help you *at all*, and can even slow the entire system down further. So it's really meaningless to say "my cluster can do 500 GFLOPS" unless you are talking about time actually spent doing productive simulation, not time wasted waiting for communication.
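
For what it's worth, here's a minimal mpi4py sketch of the pattern the parent describes: each rank advances its own slab, then trades one-cell halos with its neighbors. The stencil update is a placeholder, not real physics, and the sizes are arbitrary.

```python
# Run with e.g.: mpiexec -n 4 python halo.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

n_local = 100                           # interior cells owned by this rank (arbitrary)
u = np.zeros(n_local + 2)               # +2 ghost cells for the halos
u[1:-1] = rank                          # dummy initial condition

left  = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

for step in range(10):                  # 10 simulated time chunks
    # Communication phase: send my edge cells, receive neighbors' edge cells.
    comm.Sendrecv(u[1:2],   dest=left,  recvbuf=u[-1:], source=right)
    comm.Sendrecv(u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    # Local compute phase: a stand-in diffusion-like stencil update.
    u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])
```

As the local problem shrinks (more ranks, same grid), the Sendrecv phase takes a growing fraction of each step, which is exactly the communication-dominated regime described above.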

Considering that computational fluid dynamics, molecular dynamics, etc., break down into linear algebra operations, I'd say that the FLOPS count on a LINPACK benchmark is probably the best single metric available. In massively parallel CFD, we don't match the physical topology to the computational topology, because we don't (usually) build the physical topology. But I can and do match the computational topology to the physical one.
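
This isn't LINPACK itself, but a back-of-the-envelope version of the same idea: time a dense matrix multiply and convert the operation count to FLOPS (the problem size here is arbitrary).

```python
import time
import numpy as np

n = 2000
A = np.random.rand(n, n)
B = np.random.rand(n, n)

t0 = time.perf_counter()
C = A @ B                      # dense matrix multiply, the LINPACK-style kernel
elapsed = time.perf_counter() - t0

flops = 2.0 * n**3             # multiplies + adds in a dense n x n matmul
print(f"{flops / elapsed / 1e9:.1f} GFLOPS (single node, dense matmul)")
```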

Comment Saving us the trouble of reading comments (Score 5, Funny) 543

Poster 1) Unity is and always will be an unholy mess.
Poster 2) Unity is a massive leap forward in modern functionality, and anyone that simply gives it an honest try will agree.
Poster 1) I have tried. I don't want to learn new things and shouldn't have to. I had to switch to xfce.
Poster 3) APPLE APPLE APPLE
Poster 4) Seriously, Windows 8? Really?
Poster 5) You all should really give gnome3 another chance, it's really almost acceptable to use now.
Poster 1) Ubuntu is dead to me.
Poster 6) Remember NeXT?

Comment Computational methods in plasma/tokamaks (Score 1) 318

Particularly for Nathan,

I'm a PhD student in computational fluid dynamics, and I've recently grown interested in plasma simulations. My experience stems largely from aeronautics, where CFD methods are very well studied and well validated for a large range of turbulent flows. In those cases, we often have detailed experimental data for validation; from what I know about tokamaks, that level of experimental detail is somewhere between nonexistent and impossible to acquire.

Do you see techniques from direct numerical simulation (DNS) or large eddy simulation (LES) becoming as important a primary tool in tokamak/plasma research as they presently are in more conventional fluids? Can you remark on some of the computational challenges that make tokamaks more difficult than, say, multi-species combustion simulations?

Comment Re:Chaotic Systems (Score 2) 65

Ok, so I assume you use a Monte Carlo technique to generate probabilities of outcomes. But does having supercomputers improve the accuracy of the results, with any certainty?

Well yes, of course. That's the entire purpose of uncertainty prediction, and of HPC simulations in general. In any kind of complex numerical simulation (say, turbulent aerodynamics), the accuracy with which you can simulate a given physical situation is constrained by your available computational power. You must balance the level of detail you need against the computing power available (e.g., direct numerical simulation of turbulence for a full-sized aircraft is computationally unfeasible, and would provide drastically more information than the analysis needs). These are well-studied problems, not "guesswork" as you seem to imply.

Extending that to a situation where the inputs themselves are uncertain, more computing power clearly leads to more accurate results: basic statistics tells you that the more data points you have (here, results from a range of inputs), the more reliable your uncertainty predictions become.
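
A minimal illustration of that last point; the "model" is a stand-in toy function and the input distribution is made up, but the 1/sqrt(N) shrinkage of the standard error is the general mechanism.

```python
import numpy as np

rng = np.random.default_rng(0)

def model(x):
    """Stand-in for an expensive simulation with an uncertain input x."""
    return np.sin(x) + 0.1 * x**2

for n in (100, 10_000, 1_000_000):               # more compute -> more samples
    x = rng.normal(loc=1.0, scale=0.2, size=n)   # uncertain input
    y = model(x)
    stderr = y.std(ddof=1) / np.sqrt(n)          # standard error ~ 1/sqrt(n)
    print(f"n={n:>9}: mean={y.mean():.4f} +/- {stderr:.5f}")
```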
