Ray Tracing To Debut in DirectX 11

crazyeyes writes "This is breaking news. Microsoft has not only decided to support ray tracing in DirectX 11, but they will also be basing it on Intel's x86 ray-tracing technology and get this ... it will be out by the end of the year! In this article, we will examine what ray tracing is all about and why it would be superior to the current raster-based technology. As for performance, well, let Intel dazzle you with some numbers. Here's a quote from the article: 'You need not worry about your old raster-based DirectX 10 or older games or graphics cards. DirectX 11 will continue to support rasterization. It just includes support for ray-tracing as well. There will be two DirectX 11 modes, based on support by the application and the hardware.'"
  • by Bullseye_blam ( 589856 ) <.bullseye_1. .at. .yahoo.com.> on Monday March 31, 2008 @10:04AM (#22921230) Journal
    But I am really annoyed that April Fool's has now become a multi-day event.
  • by pembo13 ( 770295 ) on Monday March 31, 2008 @10:04AM (#22921234) Homepage
    If one thing's sure. Pity DirectX 11 will work on so few platforms.
    • I haven't really been in the 3D-graphics-API scene for a while, so I'm wondering what's available for OpenGL raytracing. There are a bunch of 3D-rendering tools I remember, such as POV-Ray, but how about realtime?

      Anyone know if there's anything available/in-the-works?
      • Re:OpenGL (Score:5, Informative)

        by Yetihehe ( 971185 ) on Monday March 31, 2008 @10:41AM (#22921678)
        There is currently only OpenRT, which has "Open" in its name only for similarity with OpenGL (it is a fully proprietary implementation, but has an API similar to that of OpenGL).
      • a) OpenGL is an immediate-mode API: it doesn't store a "scene", it just processes a single polygon at a time.

        b) You can't raytrace something unless you have access to the whole scene.

        QED.
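
        To make (b) concrete, here's a minimal C++ sketch (hypothetical Vec3/Ray/Sphere types, spheres standing in for arbitrary geometry) of a ray tracer's innermost question, "what does this ray hit?". Answering it means testing against every object in the scene, or a spatial index built over all of them, whereas an immediate-mode rasterizer can consume one triangle at a time and forget it.

          #include <cmath>
          #include <limits>
          #include <vector>

          struct Vec3 { float x, y, z; };
          static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
          static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

          struct Ray { Vec3 origin, dir; };              // dir assumed normalized
          struct Sphere { Vec3 center; float radius; };

          // Standard ray-sphere test: nearest positive hit distance, or -1 on a miss.
          static float Intersect(const Ray& r, const Sphere& s) {
              Vec3 oc = sub(r.origin, s.center);
              float b = dot(oc, r.dir);
              float c = dot(oc, oc) - s.radius * s.radius;
              float disc = b * b - c;
              if (disc < 0.0f) return -1.0f;
              float t = -b - std::sqrt(disc);
              return (t > 0.0f) ? t : -1.0f;
          }

          // The point of (b): any ray may hit any object, so the whole scene has
          // to be available -- nothing can be streamed through and discarded.
          static int ClosestHit(const Ray& ray, const std::vector<Sphere>& scene) {
              int closest = -1;
              float best = std::numeric_limits<float>::max();
              for (size_t i = 0; i < scene.size(); ++i) {
                  float t = Intersect(ray, scene[i]);
                  if (t > 0.0f && t < best) { best = t; closest = (int)i; }
              }
              return closest;
          }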

    • Perhaps it'll work on X11
  • by Froze ( 398171 ) on Monday March 31, 2008 @10:05AM (#22921236)
    Or maybe just obvious to anyone in the industry. Since clock speeds are bounded and not getting any faster, and you can only lower voltages so much before signals get lost in the noise, the only way forward is parallelism, and ray tracing is wondrously parallelifyable (is that a real word?).
    • Yeah, but you'll have to buy a new post-Vista operating system just to get this nifty new feature.
    • It's probably parallelable.

      Anyway, clock speeds aren't increasing much right now, but I suspect that this is only a limitation of current tech. Someday I hope we can get clock speeds to reach much higher levels... say, a clock period approaching the Planck time [wikipedia.org] in the circuitry of the CPU.

      • This raises a good question (triggered by "parallelable"): if the hardware supports raytracing, is Apple planning to add raytracing support too? How does this mix with OpenGL? I'm assuming that DX11 isn't coming to WINE any time soon.
    • by the_humeister ( 922869 ) on Monday March 31, 2008 @10:24AM (#22921498)

      ...parallelifyable (is that a real word?).


      Yes, it's a very cromulent word.
    • I don't think ray tracing has any inherent advantage over rasterization when it comes to parallelism. Both techniques require that each rendering node have access to all of the data for the scene. Plenty of parallel rasterization cards are on the market; the first one I used was the Voodoo2 SLI [wikipedia.org].

      Software rendering may come back into style with these faster CPUs, but I doubt we are going to see ray tracing gain any serious ground in real-time 3D rendering.
      • I remember a comment from a previous story on raytracing that basically said: since raster processing is faster, you will always be able to get a better image out of a raster graphics processor than from a ray-tracer in the same amount of time. Which means that raytracing is nice if you have time to wait around, but if you wait around the same amount of time with raster processing, you'll get a better image.
        • by 7Prime ( 871679 )
          Define "better":

          I think the comparison on the website says it all. Rasterization is better in a static world, where the creators set every object in place beforehand. Then they can raytrace it to produce the desired reflections, lighting effects, etc., and use those raytraced surfaces as maps in the real-time rasterized world. If that's the case, then a raytraced world isn't going to look much different.

          However, the advantage to virtual worlds over movies and television is that they aren't static, they're i
      • by Froze ( 398171 )
        The benefit is that ray tracing can generate better scenes by evaluating the physics of lighting in a more accurate way. The simple parallelism aspect comes into play when you have 32 GPUs that can each render individual frames at rates higher than one per second, giving you more realistic animation and better lighting physics, not to mention the possibility of intra-scene parallelism. As a reference, the Wikipedia article here has supporting corroboration: http://en.wikipedia.org/wiki/Ray_tracing_hardware [wikipedia.org]
        • That may sound like a good idea, but what about the fact that Carmack says the stuff is a waste [pcper.com] (and isn't planning to program for it)? Let alone that we're talking about Intel integrated graphics, which is not stuff that gamers use. So where is this magically supposed to make a difference, other than breaking compliance?

          Personally, I'd have people pay more attention to textures and making things more efficient rather than concentrating on a new shiny method to improve shadows. Shadows are the first thing to go from non-high-end sy
          • by edwdig ( 47888 )
            Personally, I'd have people pay more attention to textures and make things more efficient than concentrate on a new shiny method to improve shadows.

            Shadows are actually the glaring weakness in ray tracing setups. Ray tracing tends to lead to shadows going instantly from completely solid to non-existent. You don't get the soft edges.

            Ray tracing tends to be great for reflections though.
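
            To illustrate: in a classic ray tracer, each shading point fires a single shadow ray at a point light and gets a binary lit/unlit answer, as in this hedged sketch (it reuses the hypothetical Vec3/Ray/Sphere/Intersect helpers from the OpenGL sub-thread above; a real tracer would also offset the ray origin by a small epsilon to avoid self-shadowing). Soft shadows need many rays aimed at samples spread across an area light, which multiplies the cost.

              // One shadow ray per point light: all-or-nothing, hence razor-sharp edges.
              static bool InShadow(Vec3 p, Vec3 lightPos, const std::vector<Sphere>& scene) {
                  Vec3 toLight = sub(lightPos, p);
                  float dist = std::sqrt(dot(toLight, toLight));
                  Ray shadowRay{p, {toLight.x / dist, toLight.y / dist, toLight.z / dist}};
                  for (const Sphere& s : scene) {
                      float t = Intersect(shadowRay, s);
                      if (t > 0.0f && t < dist) return true;   // occluded: fully dark
                  }
                  return false;                                // clear path: fully lit
              }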
          • Re: (Score:2, Insightful)

            by Hal_Porter ( 817932 )
            Maybe Carmack is wrong?

            Doom and Wolfenstein were clever back in the day, but it wouldn't be the first time that a famous expert was blindsided by a paradigm shift in their field.
    • Why do you assume advances won't let us move to higher clock speeds and lower voltages? I think the word you're looking for is "parallelizable."
    • Ray tracing has been around for decades, and it's been steadily getting faster. Even in the 1980s, it was clear that eventually it would be doable in real-time. And there have been real-time ray tracing demonstrations around for several years now.

      So, there is nothing "forward thinking" about this, Microsoft is simply following industry trends.
    • Surprisingly forward-thinking on Intel's part: they're providing the engine, and they have a business selling faster and faster multicore CPUs.
  • If this isn't an April Fools joke- maybe they could get DX-10 to work first, before worrying about DX-11?(!)
  • by LiquidCoooled ( 634315 ) on Monday March 31, 2008 @10:12AM (#22921322) Homepage Journal
    It says nvidia will be locked out because DirectX 11 raytracing will be based on x86.
    Wasn't DirectX meant to be a generic middleman that allows developers to abstract away from specific implementations?

    Isn't this a backwards step that basically cuts anyone developing for it off from using the code on other systems (and I mean even the Xbox 360)?
    • by GreggBz ( 777373 )
      Now, hopefully I'm explaining this right. I'm sure a developer will set me straight if it's wrong.

      DirectX works by talking right to the driver. Hence the name, DirectX. The hardware vendor is responsible for translating said DirectX function to operations in their hardware.

      It's good in that there's less overhead and the drivers can be optimized per the vendor. It's bad in that some of the features may or may not be supported in hardware, and you are at the mercy of the vendor.

      OpenGL, on the other hand, is a
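
      For flavor, here's roughly what the application's side of that conversation looks like: a minimal, hedged Direct3D 9 initialization sketch (hypothetical hWnd, error handling omitted). Everything from CreateDevice onward is handed by the runtime to the vendor's driver.

        #include <windows.h>
        #include <d3d9.h>

        IDirect3DDevice9* CreateD3D9Device(HWND hWnd) {
            // Entry point into the DirectX runtime.
            IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);

            D3DPRESENT_PARAMETERS pp = {};
            pp.Windowed = TRUE;
            pp.SwapEffect = D3DSWAPEFFECT_DISCARD;

            // The runtime routes this through the vendor's driver (the HAL),
            // which is where hardware support -- or the lack of it -- shows up.
            IDirect3DDevice9* device = nullptr;
            d3d->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                              D3DCREATE_HARDWARE_VERTEXPROCESSING, &pp, &device);
            return device;
        }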
  • Vista only, BTW. (Score:2, Insightful)

    by snarfies ( 115214 )
    DX10 is Vista-only. I'm going to guess DX11 will be the same. Which means I'll never see it in action, as I will switch to Linux before I switch to Vista.
    • by Hatta ( 162192 )
      Are there that many features in DirectX that aren't available in OpenGL? If the hardware can do it, why not make an OpenGL implementation of it too?
  • An interesting read on this very subject here [pcper.com]. Quote:

    "I have my own personal hobby horse in this race and have some fairly firm opinions on the way things are going right now. I think that ray tracing in the classical sense, of analytically intersecting rays with conventionally defined geometry, whether they be triangle meshes or higher order primitives, I'm not really bullish on that taking over for primary rendering tasks which is essentially what Intel is pushing."

    Carmack admits he has his own personal preference, but generally he's pretty sensible about these things. He's usually called it correctly in the past when people have pushed various technologies that were supposed to take over the world, and they've fallen by the wayside.

    Hopefully he'll chime in on this latest article with some further thoughts.

    • That article was also discussed on Slashdot [slashdot.org].
    • by eebra82 ( 907996 )
      PC Perspective also features an article [pcper.com] written in January on the impact of ray tracing in games. It provides many pictures of what it will look like and what the benefits are vs rasterization. It's written by Daniel Pohl, a research scientist at Intel.
      • Re: (Score:3, Informative)

        by Azarael ( 896715 )
        There's this one from PC Perspective as well, which is an interview with NVidia's Tech Director:
        http://www.pcper.com/article.php?aid=530/ [pcper.com]

        His view on ray tracing is pretty much summed up by:

        David Kirk, NVIDIA: I'm not sure which specific advantages you are referring to, but I can cover some common misconceptions that are promulgated by the CPU ray tracing community. Some folks make the argument that rasterization is inherently slower because you must process and attempt to draw every triangle (even invisible

  • ... will be ensured by using ray tracing to render characters in your word processing application! Finally, Vista will get some love.
  • Out by the end of the year in MICROSOFT TIME means OUT BY Q4 2011. Maybe.
  • by argent ( 18001 ) <(peter) (at) (slashdot.2006.taronga.com)> on Monday March 31, 2008 @10:23AM (#22921476) Homepage Journal

    "I'll be interested in discussing a bigger question, though: 'When will hardware graphics pipelines become sufficiently programmable to efficiently implement ray tracing and other global illumination techniques?'. I believe that the answer is now, and more so from now on! As GPUs become increasingly programmable, the variety of algorithms that can be mapped onto the computing substrate of a GPU becomes ever broader.

    As part of this quest, I routinely ask artists and programmers at movie and special effects studios what features and flexibility they will need to do their rendering on GPUs, and they say that they could never render on hardware! What do they use now: crayons? Actually, they use hardware now, in the form of programmable general-purpose CPUs. I believe that the future convergence of realistic and real-time rendering lies in highly programmable special-purpose GPUs."
    Very interesting. A couple of years later he was arguing against special-purpose GPUs for ray tracing, and for the use of "General Purpose GPUs", and the new nVidia 8xxx series seems to be following that path... away from dedicated rendering pipelines and towards a GPU that's more like a highly parallel CPU.

    More comments from David Kirk. [scarydevil.com]

    I would be very interested in what he learned between 2002 and 2004 that led him to argue so eloquently against Phillip Slusallek. I'd also like to know what Professor Slusallek is doing at nVidia, where he's "working with the research group on the future of realtime ray tracing" [linkedin.com].
  • by jmichaelg ( 148257 ) on Monday March 31, 2008 @10:23AM (#22921478) Journal
    Intel has this article [intel.com] about the hardware needed to run at 50 fps at 1920x1080. They're claiming you need 8 cores. In a couple of years, that could well be within reach for most gamers.

    There's also this John Carmack Interview [pcper.com]. Carmack isn't too optimistic about ray tracing replacing rasterized graphics.
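
    (Back-of-envelope on Intel's numbers: 1920x1080 is about 2.07 million pixels, so 50 fps means roughly 104 million primary rays per second, before any secondary rays for shadows or reflections; spread over 8 cores, that's about 13 million primary rays per core per second.)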
    • SaarCOR [uni-sb.de] was getting about 10 FPS for Quake3 with a minimal FPGA-based implementation of a hardware raytracer running at less than 100 MHz with a fraction of the gate budget of a modern GPU... in 2005. Raytracing is highly scalable - it's an "embarrassingly parallelizable" problem - so if nVidia is really working on raytracing hardware they could well be able to beat Intel to the punch.
    • A number of people refer to raytracing as an "embarrassingly parallel" process. The implication of the term is that there's no need for communication between each core or thread or process: each just gets handed a rectangular portion of the off-screen buffer, does its job alone, and when they're all done, the screen can be flipped to show the results.

      I will grant that the actual rendering of pixels is indeed independent, but that's not the proposal. Nobody wants the same geom
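
      (To pin down what that no-communication version looks like, here's a hedged C++ sketch with stand-in Scene and TracePixel types; note that every worker still reads the same shared scene, which is exactly the memory-traffic objection developed in the replies below.)

        #include <cstdint>
        #include <thread>
        #include <vector>

        struct Scene {};  // stand-in; a real one holds all geometry, read-only per frame

        // Stand-in shading function; the threading pattern is the point here.
        static uint32_t TracePixel(const Scene&, int x, int y) {
            return 0xFF000000u | uint32_t((x ^ y) & 0xFF);
        }

        static void RenderFrame(const Scene& scene, std::vector<uint32_t>& fb,
                                int width, int height, int numThreads) {
            std::vector<std::thread> workers;
            for (int t = 0; t < numThreads; ++t) {
                workers.emplace_back([&, t] {
                    // Interleaved rows: no two workers ever write the same pixel,
                    // so no locks are needed -- but all of them read the one
                    // shared scene, which is where the bandwidth pressure lives.
                    for (int y = t; y < height; y += numThreads)
                        for (int x = 0; x < width; ++x)
                            fb[y * width + x] = TracePixel(scene, x, y);
                });
            }
            for (auto& w : workers) w.join();  // only now is the frame flipped
        }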

      • by argent ( 18001 )
        The implication of the term is that there's no need for communication between each core or thread or process: each just gets handed a rectangular portion of the off-screen buffer, does its job alone, and when they're all done, the screen can be flipped to show the results.

        Well, yes, you could do that. You could also render your scenes by having the pipelines re-parse shader programs from text for every frame. But you don't.

        If access to the mesh is a bottleneck, give the pipelines
        • by cnettel ( 836611 )
          You still don't have data locality. A reflection can, and will, take you anywhere in the scene, so each core needs fast access to the complete scene. As you scale this up, you have a memory bottleneck. You don't need inter-core communication, but you still need to access the same memory. Parallel reads are easier to handle than mixed reads and writes, but really good parallelism also implies that the data each worker touches is limited. As memory bandwidth is a very real limitation in GPUs, that's not true here.
          • by argent ( 18001 )
            Each core needs fast access to the complete scene.

            Well, yes, that's just the flip side of the fact that you don't need to do occlusion culling, which you might notice I mentioned. That is to say, I did not suggest splitting up the scene, but rather replicating it.

            What I suggested was giving the pipelines local memory and caching the static part of the mesh. Since that's almost all of the mesh, almost all of the time, they will only need to fetch dynamically changing meshes. With some analog for vertex buffer
  • It also obviates the need for the GPU which has stolen much of the limelight in recent years.

    So that $250 EVGA 8800GTS [newegg.com] I just bought will soon be used as a doorstop? I haven't even checked out DX10 with it; I'm still kicking DX9. I may test out Vista x64 Ultimate and see how Crysis runs there as opposed to XP, which I doubt will be that dramatic. I somehow don't see GPUs disappearing when DX11 premieres, since not many people will actually have 8- or 16-core CPUs.

    As DirectX 11 is a work in progress,

    • by Ancil ( 622971 )

      So that $250 EVGA 8800GTS I just bought soon will be used for a doorstop?

      Count yourself lucky. I already have a Radeon 9700 Pro propping my office door open, so that's out.

      And don't even mention the word "paperweight". A pair of SLI Voodoo2's are filling that role nicely.
  • by Thanshin ( 1188877 ) on Monday March 31, 2008 @10:32AM (#22921584)
    Raytracing allows the implementation of mirrors in 3d environments.

    Finally all business software will have the feature of showing the cause of most problems. (See also "Error Id: 10T" and PEBKAC)
    • Duke Nukem 3d had mirrors. Yet another example of how far ahead of its time it was.

      Perhaps Duke Nukem Forever is going to be ray-traced.

  • DX11, like DX10, will probably be Vista-only. So, will Intel build OpenGL support, roll their own API, or tie the success or failure of their graphics architecture to Vista?
  • povray (www.povray.org) won't be outdated anytime soonish, I guess. Today there is more to it than raytracing, like light-scattering effects etc. Still, if those additional effects are done in hardware too, povray and other renderers may face an uphill battle. Like within just a few years.
    • if those additional effects are done in hardware too, povray and other renderers may face an uphill battle
      What do you mean, battle? They will happily use these hardware effects for faster rendering!
      • "What do you mean battle? They will happily use this hardware effects for faster rendering!"

        How is that? Didn't realize that!

        From: http://tag.povray.org/povQandT/miscQandT.html [povray.org]


        "Will POV-Ray render faster if I buy the latest and fastest 3D videocard?"

        No.

        3D-cards are not designed for raytracing. They read polygon meshes and then scanline-render them. Scanline rendering has very little, if anything, to do with raytracing. 3D-cards can't calculate typical features of raytracing as reflections etc. Th
        • I suppose if POV-Ray were to freeze at its current stage of development, sure, you'd be right.

          That entry in the FAQ comes from an era when those plucky Voodoo cards started coming out. If there ever comes a 3D card that supports the types of raytracing calculations POV-Ray needs, and if it represented a real chance to offload some work from the CPU with measurable time savings, and it worked cross-platform, they'd add support for it.
  • Don't be unfair. It's not that Microsoft intentionally delivers what they promise far later; it's that they measure time on an exponential curve while we measure it on a linear one, so the last month of this wait for them will take several of ours (if it happens in our lifetime, at least).
  • by SomeoneGotMyNick ( 200685 ) on Monday March 31, 2008 @10:49AM (#22921786) Journal
    I'll hold on to Imagine [wikipedia.org] for my Amiga until it's pried from my cold, dead hands.

    34.2 minutes per rendered frame gives me plenty of time to do other things around the house.

    Actually, I would have mentioned Turbo Silver instead if there were any good links for it.
    • Totally off-topic, but is there a FOSS equivalent of the Amiga's VistaPro software for terrain rendering? I loved playing with that program back in the day and I'd love to see it on something faster than a 68030@25MHz.

  • Instead of a reply buried in the RT vs. raster debate that this article generated, I thought a reply to the entire thread would be more appropriate: WHHHOOOOOOOOOSSSSHHHHHHHHH.... as the joke flew over your head. It's an early April Fools bit, people. However, it might serve some of us to step back and examine our need to defend our own prejudices... {nah... what am I thinking... this is Slashdot..... Carry on.}
  • by Cathoderoytube ( 1088737 ) on Monday March 31, 2008 @11:39AM (#22922356)
    I thought PC gaming was in the throes of death. Fortunately now PC game developers will be able to use Ray Tracing instead of implementing the much ballyhooed 'fun' that graphically inferior console games seem to be touting.
  • At first I parsed the headline as "Raytracing in Direct X11". Would be amusing to see, at least.
  • by Jackie_Chan_Fan ( 730745 ) on Monday March 31, 2008 @12:29PM (#22922930)
    All of this talk about raytracing, and we still do not have high-quality anti-aliased renders with existing real-time rendering methods.

    Games still look like shit. NONE of them can even compare to the nice anti-aliased images generated by software renderers.

    Anti-aliasing is all fine and dandy, but when a game looks like shit these days, it's due to aliasing. We can do plenty of visually stunning things in realtime, but no matter what you do, it still looks like a video game, because the damn hardware can't render at high enough resolution with high-quality anti-aliasing enabled.

    How in the hell will raytracing solve that? :) It's just going to eat more pixel ponies for breakfast, lunch and dinner.

    Look at Gran Turismo on PS3. All of their PR videos have anti-aliasing enabled and the game looks photoreal. However, the reality is they're lying in their screenshots. The game itself does not use anti-aliasing, thus making it look like a videogame. With anti-aliasing enabled it's photoreal; without, it looks like shit.

    This is an old problem, which the hardware companies have acknowledged... they just can't deliver on performance.

    But they can on raytracing? No thanks. Anti-aliasing in ray tracing renderers is even slower. I don't care how accurate the reflections are; if it's aliased to shit, it will never look convincing.

    • by Grym ( 725290 ) *

      As I understand it, aliasing in its most obvious form, "jaggies," [wikipedia.org] is more of a problem in lower-resolution renderings than higher-resolution ones, simply because each pixel represents less and less of the overall scene.

      From that, one could infer that if ray-tracing were to take off and dramatically improve performance on large resolution renderings (because of parallelization), the problem of aliasing would eventually solve itself or, at the very least, be highly outweighed by the awesomenes
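
      (Concretely, a ray tracer anti-aliases by brute force: fire several jittered rays per pixel and average them, as in this hedged sketch with a stand-in TraceRay; n = 4 per axis means 16x the rays, which is exactly the cost the grandparent is complaining about.)

        #include <cstdlib>

        struct Color { float r, g, b; };

        // Stand-in for tracing a primary ray through image-plane point (u, v).
        static Color TraceRay(float /*u*/, float /*v*/) { return {0.5f, 0.5f, 0.5f}; }

        // Supersampled pixel: n*n jittered rays, averaged. Cost scales as n*n.
        static Color ShadePixel(int x, int y, int n) {
            Color sum{0.0f, 0.0f, 0.0f};
            for (int sy = 0; sy < n; ++sy)
                for (int sx = 0; sx < n; ++sx) {
                    // Jitter each sample within its sub-pixel cell.
                    float u = x + (sx + rand() / (float)RAND_MAX) / n;
                    float v = y + (sy + rand() / (float)RAND_MAX) / n;
                    Color c = TraceRay(u, v);
                    sum.r += c.r; sum.g += c.g; sum.b += c.b;
                }
            float count = (float)(n * n);
            return {sum.r / count, sum.g / count, sum.b / count};
        }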

  • It's already April 1st in some parts of the world [timeanddate.com].
  • Not seeing what Microsoft has to do with X11. POV-Ray [povray.org], been using it for like a decade now.
  • Can this mean we finally don't have to awkwardly construct things out of polygons that are more naturally described as functions? Down with polygons!
