Graphics Software

Blender Adds Raytracing

rastachops writes "Blender, the open source 3D modelling tool, has recently added raytracing to its extensive list of features. 'Believe it or not, Ton has integrated the raytracer from Blender's predecessor, Traces, into Blender. He said "the algorithm has been optimized and is now ten times faster. Combine that with a PC that's forty times faster than in the early 1990s and raytracing is almost usable". For a comparison, check out the before and after screenshots.'"
  • by shfted! ( 600189 ) on Saturday December 13, 2003 @08:47AM (#7709938) Journal
    There's a much easier way to Ray-trace that involves very little processing time: just shoot Ray in the street and the cops will come trace him with chalk.
  • by g_braad ( 105535 ) on Saturday December 13, 2003 @08:48AM (#7709940) Homepage
    Hopefully they'll add even more usable features, like a decent shader... and a better user interface. I still prefer Maya for my overall work, but if Blender keeps evolving at this pace, it may someday be possible to switch to a cheaper solution.
    • speaking of a decent shader --

      What has happened to BMRT (the Blue Moon Rendering Toolkit)? It looks like not only has development stopped, but all trace of it has vanished. What gives?

      Let's pull a "Blender" and make sure it gets OSed!

    • Oren-Nayar, Blinn, Phong, and Toon shaders were added to Blender in version 2.28. There was a massive UI overhaul for version 2.30 (read the release notes at blender3d.org [blender3d.org]) and work is still continuing on this front. There is also work going on to integrate yafray [yafray.org] (a global illumination renderer, under fierce and rapid development) seamlessly into Blender, too.

      Cheers
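      For reference, here's a minimal sketch of the Blinn-Phong model (one of the shaders mentioned above). This is a hand-written illustration in Python, not Blender's actual code, and the material constants are invented:

        import math

        def normalize(v):
            n = math.sqrt(sum(c * c for c in v))
            return tuple(c / n for c in v)

        def blinn_phong(normal, to_light, to_eye, kd=0.7, ks=0.3, shininess=32):
            """Scalar intensity at a point; scale by the light colour afterwards."""
            n, l, e = normalize(normal), normalize(to_light), normalize(to_eye)
            # diffuse term: cosine of the angle between normal and light
            diffuse = kd * max(0.0, sum(a * b for a, b in zip(n, l)))
            # specular term: uses the half-vector between light and eye
            h = normalize(tuple(a + b for a, b in zip(l, e)))
            specular = ks * max(0.0, sum(a * b for a, b in zip(n, h))) ** shininess
            return diffuse + specular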
  • Hah... (Score:5, Funny)

    by The-Bus ( 138060 ) on Saturday December 13, 2003 @08:49AM (#7709946)
    "raytracing is almost usable"


    Well, at least we've got someone that is being truthful about their software's feature set... A bit too refreshing, if you ask me.
    • Ray tracing is nice for quick-and-dirty demos. Not everyone can run a resource-intensive program like Maya. I remember we used POV-Ray a lot in my computer graphics class. Also, weren't Wolfenstein and Doom based on ray tracing?
      • Re:Hah... (Score:3, Informative)

        by Lars T. ( 470328 )
        Nope, you mean ray casting (apart from POV-Ray). Look here. [permadi.com]
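        (To make the distinction concrete, here's a toy Wolfenstein-style ray caster in Python - one ray marched per screen column through a flat 2D grid, producing nothing but wall-slice heights. The map and camera values are made up for illustration.)

          import math

          GRID = ["11111",
                  "1...1",
                  "1.1.1",
                  "1...1",
                  "11111"]           # '1' = wall cell, '.' = empty
          px, py = 2.5, 3.5          # player position inside the grid
          fov, screen_w = math.pi / 3, 40

          def cast(angle, step=0.01, max_dist=10.0):
              # march along the ray until we enter a wall cell
              dx, dy = math.cos(angle), math.sin(angle)
              d = 0.0
              while d < max_dist:
                  d += step
                  if GRID[int(py + dy * d)][int(px + dx * d)] == '1':
                      return d
              return max_dist

          for col in range(screen_w):
              ray_angle = -fov / 2 + fov * col / screen_w    # player faces +x
              dist = cast(ray_angle) * math.cos(ray_angle)   # fisheye correction
              slice_height = int(10 / dist)                  # nearer = taller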
  • by KRYnosemg33 ( 709857 ) on Saturday December 13, 2003 @08:49AM (#7709949)
    They still have a long way to go.
    Compare: www.whitehouse.gov [whitehouse.gov]
    &
    blender.org [blender.org]
    Judge for yourself :)
  • What's the use? (Score:2, Interesting)

    by Anonymous Coward
    Can anyone explain the point of doing raytracing over quicker, better methods? Raytracing uses inordinate amounts of time and processing, wasting CPU cycles to do what is in effect just an emulation of what a human eye or camera might see. The speed advantage of proper pixel shaders can't be ignored, I think; they're several orders of magnitude quicker and to me don't look any different.
    • Re:What's the use? (Score:5, Informative)

      by shamilton ( 619422 ) * on Saturday December 13, 2003 @09:09AM (#7710004)
      Raytracing is *the* elegant solution. Rasterisers use smoke and mirrors to achieve the same effects. Often those tricks are not flawless -- for example, you often see a smoke or explosion texture intersecting with nearby walls, creating an ugly edge. This is not the kind of thing I would want to see in a production movie, but in a game it's not so bad.
      • Re:What's the use? (Score:5, Informative)

        by Spy Hunter ( 317220 ) on Saturday December 13, 2003 @10:29AM (#7710200) Journal
        Raytracing is not THE elegant solution. It is one of many methods of rendering. It is very hard to simulate realistic global lighting effects using raytracing (soft shadows, and light reflecting indirectly off other surfaces in addition to light coming directly from a light source). For example, take this [virgin.net] image. You might think that's a fine raytraced image. But then compare it to this [virgin.net] image, produced with radiosity. I think you'll agree that the second image looks *much* more real. Note the subtle shading across the back wall and ceiling, and also the way it is a little bit darker where the walls and ceiling meet at 90-degree angles. Effects like that would be nearly impossible to reproduce with raytracing, and don't even think about real-time rendering. Pictures like this really show how far real-time rendering has to go before it actually looks like reality.

        Radiosity isn't *the* solution to rendering either. There are a whole range of lighting effects we see in daily life, and even radiosity only simulates some of them. For example, caustics (the funny patterns of light on the bottom of the pool). Even more general approaches to simulating light are being researched, but I don't really know if any of them are in use commercially yet.

        Also, in case you were wondering, Quake/Unreal/etc actually use radiosity rendering as part of the map making process, then store the results in "light maps" which are basically textures that control how light or dark a wall is instead of its color. Pre-computing the lighting allows real-time rendering of nice levels with radiosity effects, but it has several problems. Firstly, light maps take up a lot of memory (there's one for each wall, while most other textures are used on more than one wall and are tiled repeatedly), so they are stored at a pretty low resolution to minimize memory usage. This produces a blocky "stair-step shadow" effect that you've probably seen if you've played Counter-Strike. Secondly, since all the lighting is pre-computed, you can't change it easily. If you want a light to turn on and off, you have to store two light maps for every wall affected by that light: one with it on and one with it off. This is why in most games where you can shoot out lights, there are only a select few that you can shoot out. This approach has even more trouble with moving objects or moving lights (flashlights, car headlights, explosions, muzzle flashes). Real-time OpenGL or DirectX style lighting is usually used for these types of lights and moving objects, but then you don't get the nice shadows and other lighting effects that radiosity gives you.
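        A tiny illustration of the light-map idea (everything here - the 4x4 brightness grid, the shading function - is invented for the example; real engines store and blend these on the rasteriser side):

          def sample_bilinear(lightmap, u, v):
              """Bilinearly sample a small 2D brightness grid at (u, v) in [0, 1]."""
              h, w = len(lightmap), len(lightmap[0])
              x, y = u * (w - 1), v * (h - 1)
              x0, y0 = int(x), int(y)
              x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
              fx, fy = x - x0, y - y0
              top = lightmap[y0][x0] * (1 - fx) + lightmap[y0][x1] * fx
              bot = lightmap[y1][x0] * (1 - fx) + lightmap[y1][x1] * fx
              return top * (1 - fy) + bot * fy

          lightmap = [[0.1, 0.2, 0.2, 0.1],   # one low-res map per wall;
                      [0.2, 0.9, 0.8, 0.2],   # the bright patch is where a
                      [0.2, 0.8, 0.7, 0.2],   # lamp was during precomputation
                      [0.1, 0.2, 0.2, 0.1]]

          def shade(texel_rgb, u, v):
              """Final wall colour = tiled texture colour * precomputed brightness."""
              b = sample_bilinear(lightmap, u, v)
              return tuple(c * b for c in texel_rgb)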

        • Re:What's the use? (Score:5, Interesting)

          by adrianbaugh ( 696007 ) on Saturday December 13, 2003 @11:09AM (#7710387) Homepage Journal
          Your raytracing example appears to be using a point light source and not modelling any atmospheric effects, so naturally it looks rather basic. If you use a proper extended light source and set up your raytracer to model the atmosphere properly then it will look much better. If your raytracing program can't do that it's a limitation of that particular program, not raytracing in general.
          Of course, proper accurate raytracing as I've described will take a /lot/ more CPU time...
          • Re:What's the use? (Score:3, Informative)

            by olethrosdc ( 584207 )
            Perhaps, but what radiosity achieves is to model the light diffusely reflected from other surfaces. For example, suppose you have a red-coloured surface next to a white one. The red colour spills onto the white surface, because some of the light reflected from the red surface reaches the white one. This is quite hard to model accurately with ray-tracing, because for every point that you trace back to a surface you must calculate not only the effect of the light coming directly from the light source…
            • Re:What's the use? (Score:4, Informative)

              by hawkstone ( 233083 ) on Saturday December 13, 2003 @12:15PM (#7710759)
              So, it can't work.

              Sure it can! Raytracing can absolutely model diffuse interreflections. However, while radiosity is an analytic (though approximated through a mesh) solution, raytracing typically uses a Monte Carlo sampling technique to achieve this. You can imagine how painfully slow this is, but it works just fine. (And it doesn't require meshing your objects beforehand, either.)
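              To sketch what that Monte Carlo gather looks like (a hand-rolled illustration, not any particular renderer's code; `trace` is assumed to return the RGB radiance arriving along a ray):

                import math, random

                def cross(a, b):
                    return (a[1] * b[2] - a[2] * b[1],
                            a[2] * b[0] - a[0] * b[2],
                            a[0] * b[1] - a[1] * b[0])

                def normalize(v):
                    n = math.sqrt(sum(c * c for c in v))
                    return tuple(c / n for c in v)

                def cosine_sample_hemisphere(normal):
                    # random direction about `normal`, weighted by the cosine term
                    r1, r2 = random.random(), random.random()
                    phi = 2 * math.pi * r1
                    x, y, z = (math.cos(phi) * math.sqrt(r2),
                               math.sin(phi) * math.sqrt(r2),
                               math.sqrt(1 - r2))
                    a = (1, 0, 0) if abs(normal[0]) < 0.9 else (0, 1, 0)
                    t = normalize(cross(a, normal))
                    b = cross(normal, t)
                    return tuple(x * t[i] + y * b[i] + z * normal[i] for i in range(3))

                def indirect_light(point, normal, trace, samples=256):
                    # average the radiance brought back by many random hemisphere
                    # rays -- the "painfully slow but correct" part
                    total = [0.0, 0.0, 0.0]
                    for _ in range(samples):
                        li = trace(point, cosine_sample_hemisphere(normal))
                        total = [t + l for t, l in zip(total, li)]
                    return tuple(t / samples for t in total)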

              • How many samples do you usually have to take to get a good approximation? Imagine the case where you have a small, highly reflective object - for example, a watch that shines reflected light onto a small circle somewhere far away. Hm... I guess one could just sample a few times on every object, but in a complex scene something like that could be completely lost.

                Also, it all must get quite a bit more complicated when the medium itself is diffracting. In this case you'd need to sample on all possible dir…
                • Re:What's the use? (Score:4, Insightful)

                  by hawkstone ( 233083 ) on Saturday December 13, 2003 @02:12PM (#7711347)
                  Oh, sure it's painful! Extremely, tremendously painful. But mathematically it all works out.

                  Take caustics, for instance, like a magnifying glass focusing a light source onto a small point on a surface. This is, and has been, done using raytracing.

                  I've not actually implemented it, but I'm slightly familiar with the techniques. I imagine some intelligent sampling of incident rays, and maybe adaptive supersampling of these rays, would help a lot with the phenomenal costs.

                  Photon maps are another solution, and (if I remember correctly) they implement forward ray tracing instead of the usual backward kind. Since you can then cast rays from the light source outward, this can be much, much cheaper.
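                  A rough sketch of that forward pass (hypothetical code: `intersect(origin, direction)` stands in for a real scene query, and the bounce is simplified to a diffuse surface with albedo 0.5):

                    import math, random

                    def dot(a, b):
                        return sum(x * y for x, y in zip(a, b))

                    def random_direction():
                        # uniform random direction on the unit sphere
                        z = random.uniform(-1.0, 1.0)
                        phi = random.uniform(0.0, 2.0 * math.pi)
                        r = math.sqrt(1.0 - z * z)
                        return (r * math.cos(phi), r * math.sin(phi), z)

                    def shoot_photons(light_pos, light_power, intersect, n=100000):
                        photon_map = []                     # (hit_point, power) records
                        for _ in range(n):
                            origin, d = light_pos, random_direction()
                            power = light_power / n
                            for _bounce in range(5):        # cap the path length
                                hit = intersect(origin, d)  # assumed scene query
                                if hit is None:
                                    break
                                photon_map.append((hit.point, power))
                                if random.random() > 0.5:   # Russian roulette:
                                    break                   # photon absorbed
                                d = random_direction()      # survivor scatters on,
                                if dot(d, hit.normal) < 0:  # staying above surface
                                    d = tuple(-c for c in d)
                                origin = hit.point
                        return photon_map

                  The render pass then estimates brightness from the photon density around each shaded point.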
          • The raytraced image could be made a little nicer by using soft shadows. But the kinds of things that the radiosity image is doing really are impossible to simulate with raytracing. You can't get the kind of thing that shows on the ceiling of that image, or the windowsills, using raytracing. The problem isn't a lack of "atmospheric effects"; the problem is that to accurately model light, you have to take into account the fact that lights are not the only light sources in a scene; surfaces are lit by other…
        • Re:What's the use? (Score:5, Informative)

          by Animaether ( 411575 ) on Saturday December 13, 2003 @11:20AM (#7710442) Journal
          Even more general approaches to simulating light are being researched, but I don't really know if any of them are in use commercially yet.

          We make a commercial renderer (Brazil r/s), and I can safely say "yes" to that one.

          Most of the commercial renderers, whether standalone or bundled with an application, support global illumination through the use of quasi-Monte Carlo sampling.
          In essence, it calculates everything accurately - as long as you set the scene up that way.
          It's also stupid-slow :)

          That's why there are photon maps, irradiance mapping, Metropolis light transport, and simpler constructions such as highly optimized skylighting, area lights/shadows, and so forth.
        • For instance, the next step after radiosity, as I understand it, is Metropolis Light Transport. It can render some rather nice caustics:

          http://graphics.stanford.edu/papers/metro/fig6.jpg [stanford.edu]

          http://graphics.stanford.edu/papers/metro/ [stanford.edu]
        • I took a look at the radiosity tutorial [virgin.net] from which those pictures come, and it's really well written. It's clear that raytracing is better than nothing, but that other techniques will yield even more realistic images.
        • Look at this: http://web.axelero.hu/kcsi/ [axelero.hu].

          Szo

        • For example, take this image. You might think that's a fine raytraced image.

          That's about the lousiest raytraced image I've ever seen. Now this is a fine raytraced image [povray.org]!
      • Re:What's the use? (Score:2, Insightful)

        by PixelSlut ( 620954 )
        Raytracing is *the* elegant solution. Rasterisers use smoke and mirrors to achieve the same effects. Often those tricks are not flawless -- for example, you often see a smoke or explosion texture intersecting with nearby walls, creating an ugly edge. This is not the kind of thing I would want to see in a production movie, but in a game it's not so bad.

        Keep in mind that all the Pixar movies use rasterization techniques, not raytracing or radiosity. The reason you see those problems you stated above h…

        • Re:What's the use? (Score:3, Informative)

          by kbielefe ( 606566 )
          Keep in mind that all the Pixar movies use rasterization techniques, not raytracing or radiosity.
          Raytracing is prominent in the renderman feature list [pixar.com] as being available since release 11 and used when the shot merits it. I'd be very surprised if raytracing wasn't used in "Finding Nemo" at least.
          • Re:What's the use? (Score:2, Interesting)

            by rascal1182 ( 729393 )
            IIRC, not a lick of ray tracing was used in "Finding Nemo." Only a handful of scenes in all of the Pixar films have employed ray tracing. It just takes too long to get an image that's good enough. On the other hand, Blue Sky's "Ice Age" was entirely ray traced.

            The REYES algorithm (the scanline algorithm that has been the "backbone" of RenderMan since its creation) is incredibly fast, yet produces great results. A G5 at Pixar's booth at the SIGGRAPH conference last summer was rendering a frame of "Fi…
  • Ewwwww (Score:2, Funny)

    by r00zky ( 622648 )
    Those before and after shots look like my sister
    *bleah*
  • Comparison (Score:3, Insightful)

    by Psychic Burrito ( 611532 ) on Saturday December 13, 2003 @09:06AM (#7709990)
    Hmm... wouldn't it be much easier to compare the two renderers if they rendered the same picture, at the same size, and with the same lighting?
  • OK, this is cool, now for a mirror of the screenshots...
  • by Anonymous Coward on Saturday December 13, 2003 @09:31AM (#7710055)
    Bite my shiny metal ass.

    Oh, you said Blender...
  • So if someone can explain it to me, what's the difference? I actually thought the 'before' image looked more realistic.
      If you raytrace you actually follow the true path that light would take (or reversed, I am not sure which they use); it is pretty much the best method of rendering realistic objects. This is just the first step towards a very powerful renderer and... *big eyes* Caustics?!?!
      • (or reversed, I am not sure which they use)

        I think it has to be done backwards, or you end up shooting rays all over the place, 99.99% of which are completely redundant. Whereas if you trace them from the screen back to their origin you don't end up burning processor cycles in vain.

        But I may have that wrong - I'm a real-time graphics guy ; )
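        Concretely, the "backwards" loop is just one ray per pixel, from the camera out into the scene - a minimal hand-written sketch, where `trace(origin, direction)` stands in for the actual intersection and shading code:

          import math

          WIDTH, HEIGHT, FOV = 320, 240, math.pi / 3

          def normalize(v):
              n = math.sqrt(sum(c * c for c in v))
              return tuple(c / n for c in v)

          def render(trace, eye=(0.0, 0.0, 0.0)):
              aspect, half = WIDTH / HEIGHT, math.tan(FOV / 2)
              image = []
              for j in range(HEIGHT):
                  row = []
                  for i in range(WIDTH):
                      # map pixel (i, j) to a point on an image plane at z = -1
                      x = (2 * (i + 0.5) / WIDTH - 1) * half * aspect
                      y = (1 - 2 * (j + 0.5) / HEIGHT) * half
                      d = normalize((x, y, -1.0))
                      row.append(trace(eye, d))  # only rays that can reach the
                  image.append(row)              # screen ever get traced
              return image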
    • by samhalliday ( 653858 ) on Saturday December 13, 2003 @10:30AM (#7710207) Homepage Journal
      Are you serious? Well... the "before" looks more realistic if you imagine it as a chocolate monkey... but it's supposed to be bronze, in which case you must be mad if you think it is more realistic than the "after" image!
  • Raytracing on Linux (Score:3, Informative)

    by Anonymous Coward on Saturday December 13, 2003 @09:41AM (#7710082)
    Raytracing on Linux is already usable. Apart from POV, which AFAIK can do raytracing, Realsoft 3D [realsoft.fi], a new version of the old Amiga Real3D (which also did raytracing, not just scanlining... WAY back in the 1990s...), is available for Linux.
    • "Apart from POV , which AFAIK can do raytracing"

      LOL! Yes, POV can do raytracing. It's also a Monte Carlo raytracer (i.e., photon mapping) and a radiosity renderer as well.
    • That's got a very nice rendering engine as well as rasterising, and a user interface that 3D Studio Max aficionados will easily understand. It'll export to POV, which is handy sometimes, as you can model a tricky object in the GUI, then import it into a POV script later.

      Oh yeah, runs on Windows and Mac too. Guess someone might find that useful :)

      http://www.artofillusion.org

      Vik :v)
  • by greepoman ( 720528 ) on Saturday December 13, 2003 @09:49AM (#7710100)
    If they were really serious they wouldn't use a monkey; who's impressed by a monkey? Use a dragon or a hot chick like that butterfly girl NVIDIA uses. I mean, come on!
  • Mirror (Score:5, Informative)

    by linux_warp ( 187395 ) on Saturday December 13, 2003 @09:51AM (#7710106) Homepage
    Blender.org was being hit hard so I mirrored the before and after pictures on my website, mindwarp.net [mindwarp.net]

    Before [mindwarp.net]
    After [mindwarp.net]
    • Is it just me... or do the monkeys in those two pictures look like they're made out of different materials? The first image is completely dull, while the second is shiny enough to see reflections in.
      • Re:Mirror (Score:3, Informative)

        by visgoth ( 613861 )
        That's the whole point of using raytracing. The first image looks dull because it's not reflecting anything. One could generate a reflection map and use that to approximate environmental reflections for the first image, but it wouldn't be very accurate.

        My one issue with the second image is the lack of soft shadows. The shadow maps in the first image look nicer than the overly crisp raytraced shadows in the second. If they could implement area lights or some other soft-shadowing technique for raytraced shadows it…
  • by Qbertino ( 265505 ) <moiraNO@SPAMmodparlor.com> on Saturday December 13, 2003 @09:57AM (#7710118)
    FYI: the monkey in the pictures is called "Suzanne" - she's a girl - and is the mascot of Blender. This year the Blender "Suzanne" awards were handed out as small bronze statuettes of the very same shape you see rendered in the pictures.

    Further down somebody talks about more features (shaders, etc.).
    This was a big issue in the 'future development' talks at the Blender conference this year. Software design issues were discussed and different approaches were evaluated. This year's Suzanne animation award winner has designed a shader tool that will be integrated into, and act as an interface/usability reference for, the big Blender 3.0 redo, which will have a shading environment integrated. Some other major parts of the new stuff will probably make extensive use of the Yafray raytracer and the basic design that went into it.
    Far-out dreaming about the future led to considering a solid interface to the OSS Crystal Space 3D engine, to bring the closed-source realtime stuff back into Blender and provide a professional editor and design tool for Crystal Space.
    The problem with that is that CS has a totally different structure than recent and current realtime solutions in Blender, so this is only an option after the big Blender 3.0 redo, which will shed all the dirty hacks and establish a solid software design for the Blender codebase.
    So goes the plan for Blender's future.
    Can't say no to this karma-whoring, can you? :-)
    • This year the Blender "Suzanne" awards were handed out as small bronze statuettes of the very same shape you see rendered in the pictures.

      Out of curiosity, how was the statuette produced? Was the output of blender used in the making of a mold or numerically-controlled machine instruction file?

      I ask because I've begun wondering if blender could be used for modeling things that would ultimately be made in the real world, like prototypes of enclosures, mechanical gears, etc.

      • Suzanne is modelled in Blender - that's for sure. Probably somebody from the core Blender team exported the model to AutoCAD or something 'prototyping industry compatible' to make the mold.

        Ton showed a picture of the box that came with the statuettes, all broken off at the standing leg. Apparently the Dutch company that had made them had forgotten to fill the box with stuffing material. Suzanne's standing leg was thickened and the statuettes were to be remade.
        Apart from the bad packaging they were defi…
      • I'm almost 100% sure it was with these guys:
        3D Art to Part [3darttopart.com]
  • by Pope Raymond Lama ( 57277 ) <<gwidion> <at> <mpc.com.br>> on Saturday December 13, 2003 @09:57AM (#7710120) Homepage
    I just had a Blender workshop this week, and I can say: what a piece of software!

    It is just great... definitely not easy to learn by oneself from the ground up. At least not like the myriad other multimedia packages that come with any modern distro.

    I have always been a POV-Ray fan, and I'd say some kinds of work I could still do faster in POV-Ray than in Blender, but it is great to see even more features in it.

    Anyway, Blender is welcome to the team.
  • This is a pretty funny headline for me, considering I just came out of a week's worth of 16-hour-a-day straight programming to implement an advanced raytracer for my graphics course. What did they use *before* the raytracer? Ray tracing gives you all kinds of gorgeous real-world effects, like wavelength-dependent refraction, shadows, and lots of lens effects.
  • by Pig Hogger ( 10379 ) <(moc.liamg) (ta) (reggoh.gip)> on Saturday December 13, 2003 @10:37AM (#7710236) Journal
    ... is a way of outputting POV files.
  • by Adam_Trask ( 694692 ) on Saturday December 13, 2003 @10:37AM (#7710241)
    Comparing the before and after pics is an apples-to-oranges comparison. It is technically not "before and after"; it should instead be called xyz rendering vs. ray-tracing. Depending on what the xyz rendering method is, the picture quality will vary. The reflections etc. that can be seen in the ray-tracing image can be reproduced (not exactly, though) using methods like environment maps.

    I am not trying to put down the quality of ray-tracing, though; it is the best, and others try to simulate it. But folks rarely use ray-tracing in interactive settings (like gaming). Unless you can play at less than 1 fps.
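    For illustration, the environment-map trick looks roughly like this (a hypothetical sketch: `env` is assumed to be a prerendered latitude-longitude panorama stored as a 2D grid of RGB triples):

      import math

      def reflect(view, normal):
          # mirror the (surface-bound) view vector about the surface normal
          d = sum(v * n for v, n in zip(view, normal))
          return tuple(v - 2 * d * n for v, n in zip(view, normal))

      def env_lookup(env, direction):
          # map a unit direction to (row, col) in a latitude-longitude image
          x, y, z = direction
          u = 0.5 + math.atan2(z, x) / (2 * math.pi)       # longitude -> [0, 1)
          v = math.acos(max(-1.0, min(1.0, y))) / math.pi  # latitude  -> [0, 1]
          h, w = len(env), len(env[0])
          return env[min(int(v * h), h - 1)][min(int(u * w), w - 1)]

      def fake_reflection(env, view_dir, surface_normal):
          # no rays traced into the scene: just one texture lookup
          return env_lookup(env, reflect(view_dir, surface_normal))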

  • here [elysiun.com]
  • by Picass0 ( 147474 ) on Saturday December 13, 2003 @11:41AM (#7710583) Homepage Journal
    As a Blender user and foundation member I've watched the features, innovations, and rate of development go up since it went open source and GPL last year. We have tons of new Python scripts, a new GUI that continues to evolve, better rendering...

    It's not Maya, but it's on par with anything in the low-to-mid range for Windows, and it's getting better by leaps and bounds.

    Give Blender a couple of years and we might see Hollywood contributing code. Hollywood loves GIMP, and I could see this becoming a real player in 3D.
    • by digitalhermit ( 113459 ) on Saturday December 13, 2003 @12:07PM (#7710713) Homepage
      It's not Maya, but it's on par with anything in the low-to-mid range for Windows, and it's getting better by leaps and bounds.

      Completely agree. Blender is a real workhorse now. It's stable, fast, and I can do a lot of things very quickly. In one demo I created an airplane complete with rivets, a rusty tail section, and bullet holes in under 15 minutes. It can easily replace many of the $15,000 setups used in news stations or even some low-end studios.

  • by PixelSlut ( 620954 ) on Saturday December 13, 2003 @01:08PM (#7711018)
    Someone recently posted on NeoEngine [sf.net]'s forum a link to developments, both software and hardware, towards real-time raytracing [openrt.de]. It's not yet a reality, but think about where it may be in only three years or so.

    Of course, people also already have photon mapping working on the most recent generations of NVIDIA and ATI hardware offerings, and I think I recall someone from NVIDIA saying at some point that they expected this to be able to work at interactive framerates sometime during the NV4x cycle of GPUs.

  • ... raytracing is almost usable

    Yes, all we need now is a complete rewrite of Blender's UI-from-hell(TM) so people can finally say the same thing about Blender.
    • You're a couple of months late there, pal. Go and look at the release notes for version 2.30.
  • I have been using Blender since 1.8. At the time I was working as a system admin for an architecture and GA firm. They were using a special package designed for Alpha processors, and 3D Studio for everything else.

    Blender had that year of non-development and was stuck at 2.23 until NaN was able to get the donations to free the code.

    Since then Blender has come a long way, adding Quicktime export, a new interface, and NTSC (16:9 HD) rendering sizes. Granted, the game engine had been removed, but still it was commin…

    • Out of curiosity, why did your company consider it a bad thing that Blender exported to a 3rd party app for raytracing?
      To me, it would seem like this is an advantage, because then you aren't locked in to one raytracer.
      Plus, POV-Ray is AWESOME. I was hopelessly addicted to making a recreation of a Dr. Seuss scene. It was my first real project using POV-Ray (i.e., not just a chrome ball over green and white checkers) and it turned out VERY decently!
      POV-Ray is VERY professional-grade software. Check out these [povray.org] for what…
  • by Afrosheen ( 42464 ) on Saturday December 13, 2003 @03:11PM (#7711678)
    Where are the before and after pictures of the server pre-and-post slashdotting?

    Figure 1. A normal webserver
    Figure 2. A molten, smoking mass.
  • by lowkster ( 546516 ) on Saturday December 13, 2003 @03:16PM (#7711698)
    Ray traced monkeys? No way! Where's the chrome sphere floating over the black and white checkerboard floor?
  • Hybrid Renderer (Score:5, Informative)

    by dcuny ( 613699 ) on Saturday December 13, 2003 @03:19PM (#7711708)
    Actually, the beta that Ton posted is a hybrid renderer. It's still primarily a scanline renderer, but you now have the option of using raytracing for shadows and reflections.

    Ton's had the raytracer written for some time now, but it never got incorporated into Blender. The preview is the first to incorporate the code.

    You could already do shadows and reflections in Blender, but they were simulated with shadow maps and reflection maps, the same way Pixar's RenderMan renderer did it.
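    In sketch form the hybrid split looks something like this (made-up code, not Blender's: the visible point comes from the scanline pass as before, and only the shadow question is answered by a ray, via a placeholder `occluded(point, light_pos)` scene query):

      import math

      def normalize(v):
          n = math.sqrt(sum(c * c for c in v))
          return tuple(c / n for c in v)

      def shade_point(point, normal, light_pos, base_color, occluded,
                      ray_shadows=True):
          # plain diffuse term from the scanline pass's geometry
          to_light = normalize(tuple(l - p for l, p in zip(light_pos, point)))
          ndotl = max(0.0, sum(n * d for n, d in zip(normal, to_light)))
          # the new, optional part: one ray toward the light answers "in shadow?"
          if ray_shadows and occluded(point, light_pos):
              return (0.0, 0.0, 0.0)
          return tuple(c * ndotl for c in base_color)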

    • Worthless trivia: RenderMan only acquired raytracing recently - for the few times it was actually needed, they used BMRT (the Blue Moon Rendering Toolkit), a raytracer developed by Larry Gritz. Larry quit Pixar and formed ExLuna, which marketed another RenderMan-compliant renderer. Pixar sued ExLuna for IP infringement (the exact details are hazy, since they came to an out-of-court agreement), ExLuna was bought out, and all its renderers (including BMRT) disappeared. Soon thereafter, Pixar's RenderMan added raytracing support. Still, full raytracing is used in RenderMan quite sparingly.

    Yafray [yafray.org] (Yet Another Free Raytracer) is a stand-alone full raytracer with a lot of features and nice integration (thanks to Python scripting) with Blender. Future versions of Blender promise to integrate it more tightly, and it seems more likely that's where a "full raytrace" option for Blender will come from.

  • Art of Illusion (Score:5, Interesting)

    by UpLateDrinkingCoffee ( 605179 ) on Saturday December 13, 2003 @03:24PM (#7711727)
    If you guys like Blender, you might be interested in another project called Art of Illusion [artofillusion.org]. It is a poly-based modeller and renderer, and I have seen some amazing [housepixels.com] results. It's completely open source (GPL, I think) and achieves great performance despite being written completely in Java. Check it out, and also the other work Nate [housepixels.com] has done.
    • Yup, I use ArtOfIllusion [artofillusion.org] for commercial artwork. It is indeed most excellent and powerful. Not only that, but there is some great documentation available for it too.

      It would be great if Blender and ArtOfIllusion could share a decent file format. It'd save everyone a lot of heartache in the long term.

      It has rendering and raytracing options, so both camps can be kept happy. Oh, and don't be put off by its use of Java. This is by far the speediest Java graphics app I've come across anywhere.

      Vik :v)
  • by Billly Gates ( 198444 ) on Saturday December 13, 2003 @03:57PM (#7711901) Journal
    Emacs has supported raytracing for years.

  • by UglyMike ( 639031 ) on Saturday December 13, 2003 @06:42PM (#7712695)
    Don't forget... there is a new 600-page book [blender3d.org] coming out at the end of this month. It covers the new GUI but not the raytracing part. There is also a Japanese book [ohmsha.co.jp] currently in print. Details on Blender.org [blender.org].

    Copies of the 2.0 Blender book can still be found in some shops, or simply downloaded as a PDF [blender.org] (of course, this one doesn't cover armatures and has the "old" interface). There is also a newer documentation project [blender.org] using the 2.0 guide as a base but completely reworking the obsolete content. And of course there is a truckload [nuance9.com] of tutorials available on the Net.

    Since the move to open source, Blender has gained, among other things:

    • internationalisation
    • way better metaball implementation
    • knife tool
    • raytracer (reflections & shadows)
    • completely reworked GUI (and still changing)
    • a newer, better Python API and plenty of great scripts (Fiber2, MakeHuman, Tesselate, ...)

    These are just my favorites. There is tons of other stuff as well.
    In the coming weeks/months, we'll see

    • Beast script (including card-based fur just like Ice Age)
    • better NURBS based on Nurbana
    • integrated bevel tool (script-based bevel already exists.)
    • Integrated REAL raytracer (YafRay)
    • further tuning of the new GUI
    • ???

    And the whole thing runs on most of today's OSes.
    As you can see, there's lots of stuff to go around. It might not be Maya or SFX or Houdini, but it sure is a lot more fun!!!
    If your first encounter with Blender's non-standard GUI made you throw up your hands in disgust, you should consider trying it again.
