While the comments here are mostly negative, I'd say this is a big leap forward for rendering technology, mainly because the rendering happens at the hardware level, on the Nvidia processors of a video card, instead of on the CPU via software rendering. They're calling it iray, and it's developed by mental images, not Nvidia.

Video cards are already great at rendering games in real time, but that requires a tremendous amount of shader programming and only works within the context of a game, not a CAD application. Game renderers are also limited in their ability to render GI, area (soft) shadows, and refraction/caustics. By handing the rendering off from a CAD app to iray and onto the video card hardware, you get access to around 200 parallel processors instead of the 2, 4, or 6 cores of a CPU. So in theory, assuming perfectly linear scaling, a 3ds Max/Maya scene that takes 5 hours (300 minutes) to render on a dual-core CPU would take only 3 minutes on your video card's processors. With RealityServer (and enough Nvidia cards all rendering the same frame), that 3 minutes could drop to 3 seconds. Personally I'd settle for the 3 minutes and I'd be damned happy about it.
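For what it's worth, here's the back-of-envelope arithmetic behind those numbers. This is just a sketch of the ideal case: it assumes per-core throughput is equal and scaling is perfectly linear with processor count, which real renderers rarely achieve.

```python
def naive_render_time(cpu_minutes, cpu_cores, gpu_cores):
    """Estimate GPU render time by scaling CPU render time
    linearly with the ratio of processor counts (best case)."""
    speedup = gpu_cores / cpu_cores
    return cpu_minutes / speedup

# 300-minute render on a dual-core CPU vs. ~200 GPU stream processors:
print(naive_render_time(300, 2, 200))  # -> 3.0 (minutes)
```

Of course a GPU core and a CPU core aren't equivalent, so treat this as the shape of the argument, not a benchmark.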