NASA Achieves Breakthrough Black Hole Simulation 281
DoctorBit writes "NASA scientists have achieved a breakthrough in simulating the merging of two same-size non-spinning black holes based on a new translation of Einstein's general relativity equations. The scientists accomplished the feat by using some brand-new tensor calculus translations on the Linux-running, 10,240 Itanium processor SGI Altix Columbia supercomputer. These are reportedly the largest astrophysical calculations ever performed on a NASA supercomputer. According to NASA's Chief Scientist, "Now when we observe a black hole merger with LIGO or LISA, we can test Einstein's theory and see whether or not he was right.""
I know you're being funny but (Score:1, Informative)
You wouldn't use a semi truck in a NASCAR race, and you wouldn't use a NASCAR vehicle to haul large boxes. They just aren't comparable.
Re:Are there non-spinning black holes? (Score:5, Informative)
There are two kinds of physicist (Score:2, Informative)
There are theoreticians. Einstein was a theoretician. He asked relatively simple questions and followed the logical consequences. I suspect that having to use a computer would have been a giant distraction and might have delayed or prevented the theory of relativity.
Re:Are there non-spinning black holes? (Score:3, Informative)
Re:and what? (Score:1, Informative)
Wikipedia link (Score:1, Informative)
Re:Wasted funding? (Score:3, Informative)
So if this experiment shows us that Einstein was right about gravitational waves, and those waves can tell us so much about the universe, I wouldn't call it a waste of money. Of course now we have to go through the trouble of actually detecting the bastards...
Re:Are there non-spinning black holes? (Score:5, Informative)
Re:Equations too complex? (Score:5, Informative)
The equation governing the propagation of gravity describes the time evolution of a tensor whose components are not all independent; for instance, they obey the Bianchi identities:
http://mathworld.wolfram.com/BianchiIdentities.html [wolfram.com]
Simple numerical integrators destroy these identities at order dt^n for some small but finite n. Run the code forwards and one can find finite-time blow-ups due to the stepping algorithm; indeed, even after a single time step the numerical solution has unphysical aspects. Finding integrators that preserve such structure is a research area of its own:
http://www.ima.umn.edu/nr/abstracts/6-24abs.html [umn.edu]
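A toy illustration (in Python; hypothetical, not from any relativity code) of the claim that a simple stepper violates a conserved identity at order dt^n: the exact flow dx/dt = v, dv/dt = -x preserves x^2 + v^2, but a single explicit Euler step breaks it by exactly dt^2.

```python
# Hypothetical toy example: explicit Euler on a harmonic oscillator.
# The continuum flow preserves the "identity" x^2 + v^2 = const, but
# one discrete step violates it at second order in dt -- a miniature
# version of how naive integrators break the Bianchi identities.

def euler_step(x, v, dt):
    return x + dt * v, v - dt * x

def violation(dt):
    x, v = euler_step(1.0, 0.0, dt)
    return (x * x + v * v) - 1.0   # zero for the exact solution

print(violation(0.1))                    # ~0.01, i.e. dt^2
print(violation(0.05) / violation(0.1))  # ~0.25: second order in dt
```

Halving dt quarters the per-step violation, so the identity is broken at order dt^2 here; no finite step size makes it exactly zero, which is why structure-preserving integrators are a research topic.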
Meh (Score:4, Informative)
Re:Are they really testing what they think? (Score:5, Informative)
First, with regard to Intel, there is essentially no risk from this: the math libraries used by everyone involved in such work have test exercises that verify the accuracy of the hardware. It's also not uncommon to run every calculation on two physical processors so that no single processor malfunction can introduce a significant error.
Second, with regard to the correct approximation of Einstein's equations: either the approximation is exact, in which case there is no risk, or the error size of the approximation is closely known. In the latter case, when we observe the black hole merger we will have one of three outcomes: confidence, to within some error size, that he was right (actual results match the simulation, though we can't rule out his theory being slightly wrong at a finer level); confidence that he was wrong (actual results lie outside the simulation's error range); or no result (actual results suggest he may be wrong, but lie within the error range).
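The three outcomes described above can be sketched as a simple decision rule (Python; the function name, numbers, and error model are made up for illustration):

```python
# Hypothetical sketch: compare an observed quantity against a simulated
# prediction whose approximation error is known, yielding the three
# outcomes described above.

def classify(observed, simulated, sim_error, obs_error):
    diff = abs(observed - simulated)
    if diff <= obs_error:
        # Matches to observational precision: consistent, though the
        # theory could still be slightly wrong at a finer level.
        return "consistent"
    if diff > sim_error + obs_error:
        # No combination of known errors explains the gap.
        return "inconsistent"
    # A deviation exists but fits inside the error budget: no result.
    return "inconclusive"

print(classify(1.001, 1.000, 0.05, 0.01))  # consistent
print(classify(1.500, 1.000, 0.05, 0.01))  # inconsistent
print(classify(1.030, 1.000, 0.05, 0.01))  # inconclusive
```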
Stats.. (Score:2, Informative)
In case anyone was wondering how Columbia stacks up against their rig, check out:
http://www.top500.org/ [top500.org]
Here's the November 2005 list:
http://www.top500.org/lists/2005/11/TOP10_Nov2005.pdf [top500.org]
It shows Columbia with:
an Rmax of 51.87 teraflops. It also shows that it moved from the #3 ranking to #4.
A bit more detail (Score:1, Informative)
The issue is that there are 10 equations (for the metric tensor of spacetime), four of which are constraints (conservation laws). You step all 10 equations, then check that the 4 constraints are still satisfied. They would be in the continuous case, but you have discretized, so they hold only approximately, and as you move forward the error grows enormously. The equations only specify what should happen while the constraints are satisfied; once you drift off, all bets are off, and you have to invent ad-hoc procedures to restore them, or better, improve your stepping to keep them from drifting in the first place.
The equations are conservative (no energy loss), extremely nonlinear, and complicated. It's akin to simulating electromagnetism, except the equations are nonlinear.
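The "step everything, then check the constraints" workflow can be sketched with a deliberately simple stand-in system (Python for illustration, and linear, unlike Einstein's equations): motion on the unit circle, where C = x^2 + y^2 - 1 = 0 plays the role of the constraint.

```python
# Hypothetical miniature of the evolve-then-monitor loop: the continuum
# flow dx/dt = -y, dy/dt = x keeps the constraint C = x^2 + y^2 - 1 at
# zero, but the discrete stepper lets the violation grow every step.

def step(x, y, dt):
    return x - dt * y, y + dt * x

x, y, dt = 1.0, 0.0, 0.01
violations = []
for _ in range(1000):                 # step the "evolution equations"
    x, y = step(x, y, dt)
    violations.append(abs(x * x + y * y - 1.0))   # check the constraint

print(violations[0])    # small after one step...
print(violations[-1])   # ...but it grows as you move forward
```

Nothing in the stepper pushes the state back toward the constraint surface, so the violation compounds, which is the "all bets are off" regime described above.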
Get the paper here (Score:1, Informative)
Re:Are there non-spinning black holes? (Score:3, Informative)
Once they have some experience with this simulator I'm sure they will move on to spinning black holes.
True. In fact, some steps have already been taken in this direction by other groups. For instance, my group at U.T. Brownsville -- whose non-spinning simulations were published simultaneously with the NASA results (but we don't have the same PR machine) -- has put up a preprint on the orbits of black-hole binaries where the individual holes have spins parallel to (or antiparallel to) the orbital angular momentum. Check it out here:
http://aps.arxiv.org/abs/gr-qc/0604012 [arxiv.org]
Basically, right now it seems like adding spins doesn't make the simulations much more difficult per se, but it -does- mean they might take much longer to run: the greater the total angular momentum in the system, the longer the holes will orbit each other before merger, since they need to get rid of more excess angular momentum.
From a member of the group (Score:5, Informative)
1) This is a first -- no other group has achieved this before. yay! (after decades of work!)
2) This is hard for the following reasons:
a) Since you are doing calculations near (or on/in) a black hole, you tend to get a lot of infinities, which 1) crash your code and 2) exacerbate your errors.
b) For most simulations, your grid remains fixed. Black holes, though, *deform* the spacetime around them, which means your grid points have to move (in a non-predictable manner)!
c) What happens when two black holes merge is not well understood (i.e., what *should* happen?), so this is new science.
d) Initial data is hard to get and unreliable. If two black holes are far apart, you can write down an exact solution (at least to within some error), but to get them close enough to be interacting, you pretty much need this kind of simulation anyway. This is such a large problem that only a handful (a dozen or two?) of initial data sets currently exist.
3) Everything is written in Fortran!
4) It runs on a variety of architectures (x86, Itanium, PA-RISC, Alpha, etc.)...pretty much anything that supports ifc (faster) or gcc.
5) There are several approaches to some of the issues above, from puncture splitting (using a different spacetime metric, like 1/r vs. r, to remove the singularity) to excision (not evolving inside the event horizon, since that's not "interesting" anyway), among other methods. Our new method actually doesn't need any of those "tricks", which is pretty interesting.
6) This data helps drive the LISA and LIGO projects from a theoretical standpoint: basically, knowing what kind of gravitational waves they should be seeing, and correlating what they see with what their data may represent (i.e., if you see a waveform like this, it means two merging black holes vs. just co-rotating black holes).
6a) We study black holes b/c they are pretty much the only thing that'll generate detectable gravitational waves.
so yay!
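The puncture splitting mentioned in point 5 can be sketched numerically (Python for illustration; the real codes are Fortran, and the smooth remainder u below is invented for the demo). For a single puncture the conformal factor is psi = 1 + m/(2r), which blows up at r = 0; codes keep the known singular piece analytic and evolve only a regular remainder, so the grid never has to represent an infinity.

```python
# Hedged sketch of puncture splitting: the full field has a known
# 1/r singularity, so split it off analytically and work with a
# smooth remainder u(r). The form of u here is made up.
import math

m = 1.0  # puncture mass parameter

def u(r):
    return 0.1 * math.exp(-r)  # hypothetical smooth correction

def psi(r):
    # full conformal factor: known singular piece + regular remainder
    return 1.0 + m / (2.0 * r) + u(r)

print(u(1e-12))    # stays finite right up to the puncture
print(psi(1e-12))  # the unsplit field is astronomically large there
```

A code that evolves only u(r) sees smooth, bounded values everywhere, while the 1/r piece is handled exactly on paper; that is the "trick" the comment says their new method manages to avoid.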