Sorry, but pointy jabs at OSes are well deserved in this case (ex-film-industry guy here). Linux is used extensively within the film industry, but each studio requires a small army of Linux gurus to patch and modify the OS and kernel just to keep it from constantly falling over. Whilst none of the gurus complain (they get paid a healthy salary), it's a real shame that an artist simply cannot perform these tweaks themselves (recompiling a kernel is not for the faint of heart!).

You'll also find a few Mac pros knocking about, but there the problems are just plain ridiculous. The lag between new OpenGL versions and GPU features appearing, and their adoption into OS X, is just insane. If you're predominantly Linux, with a few hundred Mac OS X boxes, it's kinda nice to be able to provide the same toolset to users on both platforms (as an R&D programmer, my role was to help improve the performance of art tools). Sadly, if you have OS X in the mix, this becomes extremely unpleasant. You end up with the high-performance version on Linux (leveraging every GPU feature available to give artists the ability to work on scenes with hundreds of millions of polygons), and then the crippled OS X version that craps out after 10 million (even though the GPU in both machines is identical).

Windows isn't without its problems (it's effectively stuck in a single-user, single-computer mindset), but at least you can still exploit the underlying hardware.

The reality is, if you're a creative professional, working with computers is still a massive ball-ache. It's a shame that the people who write the OSes haven't put much consideration into figuring out how their users actually use them.