"doesn't this really just suggest that Windows servers need regular replacing to keep doing their job, while old UNIX hardware keeps doing its job just fine?"
If you're making a living in IT, you know that you're still replacing servers as they roll off warranty and become fully depreciated. I'd no sooner put one of my Oracle databases on an old Linux machine than on an old Windows machine. Requirements always go up, not down, so saying you can run Linux on older hardware is a misleading statement.
I suppose if a company is using Linux because it was free, or using UNIX in some form because it "runs on older hardware," they get what they deserve anyway - that's no way to run an IT shop.
The change is likely due to the rise of blade-type systems, which are well suited to a Windows environment. You can use a UNIX server environment and still get interoperability with the end users' desktops and the domain security model, but when you can just plug in another cheap blade and not have to worry about a third-party authentication scheme, Windows becomes a pretty easy choice. Some of the arguments posted about not being able to run more than one app point to a shortcoming in the developers, not the OS. Besides, who cares if you need 5 x $1000 blades to run 5 apps on Windows? It would cost more than $5000 to get the same horsepower in a UNIX box.
Tools, my friends - these are just tools. They don't know or care if you religiously defend them. Your IT careers will be more successful if you learn to use a variety of tools, applying each where it's appropriate for the job.