NASA Achieves Breakthrough Black Hole Simulation

DoctorBit writes "NASA scientists have achieved a breakthrough in simulating the merger of two equal-mass, non-spinning black holes, based on a new translation of Einstein's general relativity equations. They accomplished the feat using brand-new tensor calculus translations on Columbia, NASA's Linux-running SGI Altix supercomputer with 10,240 Itanium processors. These are reportedly the largest astrophysical calculations ever performed on a NASA supercomputer. According to NASA's Chief Scientist, 'Now when we observe a black hole merger with LIGO or LISA, we can test Einstein's theory and see whether or not he was right.'"
  • by Anonymous Coward on Wednesday April 19, 2006 @11:38AM (#15157588)
    This wouldn't be at all comparable to a home machine designed to play HL2.

    You wouldn't use a semi truck in a NASCAR race, and you wouldn't use a NASCAR vehicle to haul large boxes. They just aren't comparable.
  • by hunterx11 ( 778171 ) <hunterx11NO@SPAMgmail.com> on Wednesday April 19, 2006 @11:40AM (#15157615) Homepage Journal
    "Rotating black holes are thought to be formed in the gravitational collapse of a massive rotating star or from the collapse of a collection of stars with an average non-zero angular momentum. Most stars rotate and therefore it is expected that most black holes in nature are rotating black holes." Rotating black hole - Wikipedia [wikipedia.org]
  • by Anonymous Coward on Wednesday April 19, 2006 @11:43AM (#15157645)
    There are experimenters. The guys who ran the simulation were experimenters.
    There are theoreticians. Einstein was a theoretician. He asked relatively simple questions and followed the logical consequences. I suspect that having to use a computer would have been a giant distraction and might have delayed or prevented the theory of relativity.
  • by FuzzyDaddy ( 584528 ) on Wednesday April 19, 2006 @11:57AM (#15157751) Journal
    The reason for doing a non-spinning black hole is that it's an easier calculation to make. Once they have some experience with this simulator I'm sure they will move on to spinning black holes.

  • Re:and what? (Score:1, Informative)

    by Anonymous Coward on Wednesday April 19, 2006 @11:58AM (#15157769)
    The validation comes after the prediction: the huge laser interferometers built to detect gravitational waves are searching for exactly such signals. This simulation defines what the most massive, and therefore most detectable, gravitational waves should look like.
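The comparison the parent describes, matching detector output against simulated waveforms, is essentially matched filtering. Here is a minimal sketch of the idea; the "template" is a crude rising-frequency chirp stand-in, not a real GR waveform, and the noise model is purely illustrative:

```python
import numpy as np

def detection_score(data, template):
    """Correlate data against a unit-norm template."""
    return abs(np.dot(data, template / np.linalg.norm(template)))

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 4096)
# Hypothetical template: amplitude and frequency rise toward "merger".
template = t**2 * np.sin(2 * np.pi * (30 + 120 * t) * t)

with_signal = rng.normal(scale=0.2, size=t.size) + template  # buried signal
without_signal = rng.normal(scale=0.2, size=t.size)          # noise only

# The correlation is far larger when the template is actually present.
print(detection_score(with_signal, template) >
      detection_score(without_signal, template))  # True
```

The point of simulations like NASA's is to provide realistic templates for exactly this kind of comparison.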
  • Re:Wasted funding? (Score:3, Informative)

    by Omniscientist ( 806841 ) <matt.badecho@com> on Wednesday April 19, 2006 @12:11PM (#15157880) Homepage
    If this experiment can ultimately lead us to see if Einstein was right about gravitational waves or not, then this is not a waste of funding. Because these waves are thought to be unchanged by any material they happen to pass through, it is thought that they may carry unaltered signals across various reaches of space. This could theoretically provide us with a way to estimate cosmological distances and help us understand how the universe was formed, what the whole of it looks like, and the ultimate fate of the universe.

    So if this experiment shows us that Einstein was right about gravitational waves, and those waves can tell us so much about the universe, I wouldn't call it a waste of money. Of course now we have to go through the trouble of actually detecting the bastards...

  • by loudambiance ( 935806 ) on Wednesday April 19, 2006 @12:13PM (#15157900)
    According to theory, the event horizon of a non-spinning black hole is spherical, and its singularity is (informally speaking) a single point. If the black hole carries angular momentum (inherited from a star that was spinning at the time of its collapse), it drags the space-time surrounding the event horizon in an effect known as frame-dragging. This spinning region surrounding the event horizon is called the ergosphere and has an ellipsoidal shape. Since the ergosphere lies outside the event horizon, objects can exist within it without falling into the hole. However, because space-time itself is moving in the ergosphere, it is impossible for objects to remain in a fixed position. Objects grazing the ergosphere can in some circumstances be catapulted outwards at great speed, extracting energy (and angular momentum) from the hole; hence the name ergosphere ("sphere of work"), because it is capable of doing work. And once all the angular momentum has been extracted from a spinning black hole? It stops spinning.
  • by augustm ( 147506 ) on Wednesday April 19, 2006 @12:19PM (#15157965)
    A major technical problem of integrating the field equations is the propagation of /constraints/ on the components. I.e., GR describes the time evolution of a tensor whose components are not all independent; for instance, they obey the Bianchi identities.
    http://mathworld.wolfram.com/BianchiIdentities.html [wolfram.com]

    Simple numerical integrators destroy these identities at order dt^n for some small but finite n. Run the code forwards and one can find finite-time blow-ups due to the stepping algorithm; indeed, even after a single time step the numerical solution has unphysical aspects.

    Finding /constraint-conserving/ algorithms is tricky:
    http://www.ima.umn.edu/nr/abstracts/6-24abs.html [umn.edu]
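A toy analogue of the problem described above: forward Euler applied to a harmonic oscillator. The energy E = (x² + v²)/2 plays the role of a constraint quantity: the continuous equations conserve it exactly (as GR respects the Bianchi identities), but the discrete stepper violates it every step, and the violation grows without bound. The example is mine, not from any relativity code:

```python
def energy_after(steps, dt=0.01):
    """Integrate x'' = -x with naive explicit Euler; return the 'constraint' E."""
    x, v = 1.0, 0.0                      # E = 0.5, conserved by the exact flow
    for _ in range(steps):
        x, v = x + dt * v, v - dt * x    # naive explicit Euler step
    return 0.5 * (x * x + v * v)

print(energy_after(0))        # 0.5 exactly
print(energy_after(100000))   # far above 0.5: the blow-up in action
```

Each step multiplies the energy by (1 + dt²), so the error compounds exponentially, which is the one-dimensional shadow of the finite-time blow-ups the parent describes.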

  • Meh (Score:4, Informative)

    by ichigo 2.0 ( 900288 ) on Wednesday April 19, 2006 @12:27PM (#15158044)
    HL2 is single-threaded, so the performance would be the same as on one Itanium. Also, x86 code has to be emulated on Itaniums = slow. Oh, and no GPU, which means pixel/vertex shaders would have to run in software. Educated guess: 0.1 fps.
  • by Surt ( 22457 ) on Wednesday April 19, 2006 @12:28PM (#15158052) Homepage Journal
    This is really not the case.

    First, with regard to Intel, there is essentially no risk from this, as the math libraries used by everyone involved in such work have test exercises that verify the accuracy of the hardware. It's not uncommon to run every calculation on two physical processors to assure that no single processor malfunction can introduce a significant error.

    Second, with regard to the correct approximation of Einstein's equations: either the approximation is exact, in which case there is no risk, or the error size of the approximation is closely known, in which case when we observe the black hole merger we will have one of three outcomes: confident to some error size that he was right (actual results match the simulation, but we can't rule out his theory being slightly wrong at a finer level), confident that he was wrong (actual results lie outside the error range of the simulation), or no result (actual results indicate the possibility he was wrong, but lie within the error range).
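The three outcomes above can be sketched as a simple decision rule, assuming the comparison reduces to one measured quantity with known error bars. The function name and thresholds are illustrative, not from any actual analysis pipeline:

```python
def verdict(observed, predicted, sim_err, obs_err):
    """Classify an observation against a simulated prediction with error bars."""
    gap = abs(observed - predicted)
    if gap <= obs_err:
        return "right (to within the simulation's error)"
    if gap > sim_err + obs_err:
        return "wrong (outside every error budget)"
    return "no result (disagreement lies within the error range)"

print(verdict(10.1, 10.0, sim_err=0.5, obs_err=0.2))  # right (...)
print(verdict(12.0, 10.0, sim_err=0.5, obs_err=0.2))  # wrong (...)
print(verdict(10.5, 10.0, sim_err=0.5, obs_err=0.2))  # no result (...)
```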
  • Stats.. (Score:2, Informative)

    by modi123 ( 750470 ) on Wednesday April 19, 2006 @12:44PM (#15158200) Homepage Journal

    In case anyone was wondering how Columbia stacks up against their rig, check out:

    http://www.top500.org/ [top500.org]

    Here's the November 2005 list:

    http://www.top500.org/lists/2005/11/TOP10_Nov2005.pdf [top500.org]

    It shows Columbia with an Rmax of 51.87 teraflops. It also states that it moved from the #3 ranking to #4.

  • A bit more detail (Score:1, Informative)

    by Anonymous Coward on Wednesday April 19, 2006 @12:49PM (#15158231)
    Caltech (Kip Thorne) has an NSF grant (the largest ever) to detect gravitational waves in order to confirm GR. The expected source of these waves is the collision and eventual merging of two black holes. The problem is that they couldn't simulate in a computer the process they were going to detect; even a single black hole by itself couldn't be run for 10,000 or so steps without diverging. Two merging black holes were out of the question.

    The issue is that there are 10 equations (for the metric tensor of space-time), four of which are constraints (conservation laws). You step all 10 equations, then check that your four constraints are still satisfied (which they would be in the continuous case, but you've discretized), and they are, more or less, but not exactly. As you move forward, the error grows hugely. The equations only specify what should happen when the constraints are satisfied; once you get off, all bets are off, and you need to start making up ad-hoc procedures to get them back, or, better, to construct your stepping scheme so they never get off in the first place.

    The equations are conservative (no energy loss), extremely non-linear, and complicated. It's akin to simulating EM, except that the equations are non-linear.
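The "step all the equations, then check the constraints" loop can be sketched with a toy system: evolve a point around the unit circle, where C = x² + y² - 1 = 0 plays the role of the four constraint equations. Free stepping drifts off the constraint surface; renormalizing after each step stands in for the ad-hoc repair procedures the parent describes. The system is my toy choice, nothing like the real metric equations:

```python
import math

def constraint_residual(steps, dt=0.05, repair=False):
    """Evolve (x, y) by a discretized rotation; return the constraint violation C."""
    x, y = 1.0, 0.0                      # starts exactly on the circle
    for _ in range(steps):
        x, y = x - dt * y, y + dt * x    # step all the evolution equations
        if repair:                       # ad-hoc projection back onto C = 0
            r = math.hypot(x, y)
            x, y = x / r, y / r
    return x * x + y * y - 1.0           # how badly the constraint is violated

print(constraint_residual(2000) > 1.0)                     # True: huge drift
print(abs(constraint_residual(2000, repair=True)) < 1e-9)  # True: kept in check
```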
  • Get the paper here (Score:1, Informative)

    by Anonymous Coward on Wednesday April 19, 2006 @01:24PM (#15158549)
    I think this [arxiv.org] is the paper which summarizes the results discussed in the article. If so, the "formulation" alluded to in the article is the conformal BSSN formulation; more details of their method here [arxiv.org], and the BS of BSSN (Baumgarte-Shapiro) paper here [arxiv.org] (the SN paper, Shibata-Nakamura, isn't online).
  • by beanyk ( 230597 ) on Wednesday April 19, 2006 @02:37PM (#15159206)

    Once they have some experience with this simulator I'm sure they will move on to spinning black holes.


    True. In fact, some steps have already been taken in this direction by other groups. For instance, my group at U.T. Brownsville -- whose non-spinning simulations were published simultaneously with the NASA results (but we don't have the same PR machine) -- has put up a preprint on the orbits of black-hole binaries where the individual holes have spins parallel (or antiparallel) to the orbital angular momentum. Check it out here:

    http://aps.arxiv.org/abs/gr-qc/0604012 [arxiv.org]

    Basically, right now it seems like adding spins doesn't make the simulations much more difficult per se, but it -does- mean they might take much longer to run: the greater the total angular momentum in the system, the longer the holes will orbit each other before merger, since they need to get rid of more excess angular momentum.
  • by ChenLing ( 20932 ) <slashdot&ilovedancing,org> on Wednesday April 19, 2006 @04:03PM (#15160015) Homepage
    I'm a recent member of this group, so I'd like to put in my 2 cents.

    1) This is a first -- no other group has achieved this before. yay! (after decades of work!)

    2) This is hard for the following reasons:
        a) Since you are doing calculations near (or on/in) a black hole, you tend to get a lot of infinities, which 1) crash your code and 2) exacerbate your errors.
        b) For most simulations, your grid remains fixed. Black holes, though, *deform* the spacetime around them, which means your grid points have to move (in a non-predictable manner)!
        c) What happens when two black holes merge is not well understood (i.e., what should happen?), so this is new science.
        d) Initial data is hard to get and unreliable. If two black holes are far apart, you can write an exact solution (at least within some error), but to get them close to where they are interacting, you pretty much need this kind of simulation anyway. This is such a large problem that there are only a handful (a dozen or two?) of initial data sets currently.

    3) Everything is written in Fortran! :) (Some competing groups use Cactus, which is C++-based, although it also allows C and Fortran.)

    4) It runs on a variety of architectures (x86, Itanium, PA-RISC, Alpha, etc.)...pretty much anything that supports ifc (faster) or gcc.

    5) There are several approaches to some of the issues above, ranging from puncture splitting (using a different spacetime metric, like 1/r vs. r, to remove the singularity) to excision (not evolving inside the event horizon, since that's not "interesting" anyway) to other methods. Our new method actually doesn't need any of those "tricks", which is pretty interesting.
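A toy version of the splitting idea in (5): separate a field with a known 1/r singularity into an analytic singular piece and a smooth remainder, so the numerical grid only ever sees the smooth part. The field psi and its split are my illustrative choices, not the actual evolved variables:

```python
M = 1.0                        # hypothetical puncture mass

def psi_singular(r):
    return M / (2.0 * r)       # known analytic 1/r piece; never put on the grid

def psi_regular(r):
    return 1.0 + r             # smooth remainder the code actually evolves

# The full field psi_singular(r) + psi_regular(r) diverges at r = 0, but the
# numerical variable is perfectly finite there:
print(psi_regular(0.0))        # 1.0
```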

    6) This data helps drive the LISA and LIGO projects from a theoretical standpoint -- basically, knowing what kind of gravitational waves they should be seeing, and correlating what they see with what their data may represent (i.e., if you see a waveform like this, it means two merging black holes, vs. just co-rotating black holes).
    6a) We study black holes b/c they are pretty much the only thing that'll generate detectable gravitational waves.

    so yay!
