I know it seems like a simple question, but the answer is a bit complicated.
First, the easy part: hardware. You'll want to upgrade or replace hardware for a few different reasons, one being improved performance or new features. Also, all hardware eventually breaks, so you'll need to replace it at some point or just cope with its loss. Often, if you're dealing with important hardware, you want to replace it before it actually breaks.
I know, you're thinking, "Why?! That's stupid. If performance is fine and it's still working, why replace it?" Well, in short, it comes down to warranty/support issues. First, if you have a brand new server, the chances of some hardware component failing are a bit slimmer than with a 12-year-old server. There hasn't been any wear and tear on it yet, so outside of a straight-up manufacturing fault, you'll probably be fine for a while.
But if it does fail, you often have some kind of warranty in place with an appropriate response time. So if I have a brand new Dell server, I can have a warranty with Dell that says if some hardware component fails, I have a replacement part in my hands in under 4 hours. With a 12-year-old Dell, Dell might not even carry a replacement part anymore. I have to call up and find out, and I'm going to pay for whatever limited support I get.
So if you have a computer and you're thinking, "Well, if it goes offline and takes a few days or a week to get it running again, that's fine," then by all means, run it until it breaks. If you don't want downtime, plan to replace the hardware every 3-7 years. For a lot of businesses, the potential productivity loss from an outage isn't worth the money saved by not replacing hardware.
Beyond all that, keep in mind that I'm saying "plan to replace hardware every 3-7 years". That doesn't mean that you must absolutely replace all hardware on that timetable, but you should sure as hell budget for hardware replacements. If you're running a business and you have an old out-of-warranty business-critical server that you can't afford to replace if it breaks, then you're in a bad place.
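To put rough numbers on that trade-off, here's a back-of-the-envelope sketch in Python. Every figure in it is made up for illustration, so swap in your own costs; the point is just that a single multi-day outage can dwarf years of replacement budget.

```python
# Back-of-the-envelope math: budgeting for replacement vs. eating
# an unplanned outage. All numbers below are hypothetical -- plug
# in your own.

server_cost = 6000          # replacement server (made-up price)
lifecycle_years = 5         # replace every 5 years (middle of 3-7)
annual_budget = server_cost / lifecycle_years

# Cost of an unplanned outage while you hunt down out-of-warranty parts
employees_idled = 20
hourly_cost_per_employee = 40   # wages + lost revenue, hypothetical
outage_hours = 3 * 8            # three business days offline

outage_cost = employees_idled * hourly_cost_per_employee * outage_hours

print(f"Annual replacement budget: ${annual_budget:,.0f}")
print(f"Cost of one multi-day outage: ${outage_cost:,.0f}")
# -> Annual replacement budget: $1,200
# -> Cost of one multi-day outage: $19,200
```

Run it with numbers from your own shop and the 3-7 year budget line usually stops looking expensive.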
Software is less obvious and potentially harder to explain, but the easiest part of the explanation is, again, "support". Windows 7 and Windows 8 get security patches when a new exploit is discovered. Windows XP doesn't. Why doesn't Microsoft just continue to support XP? I'm no fan of Microsoft, but I'd suggest that the reason isn't some kind of nefarious manipulation. It's simply that they don't want to keep supporting all the quirks and bugs of an operating system that was built over a decade ago, filled with legacy code and bad decisions.
But aside from the simple issue of security patches, there's a more subtle issue that people don't talk much about, but every IT guy has in the back of his head: there's all kinds of crap being built for Windows 8 and Windows Server 2012 right now. If someone is writing new drivers, they're writing them for the new OS. If someone is testing a new version of their software, they're testing against the new OS. And if Microsoft developers are looking at a piece of code and thinking, "This is kind of buggy and unreliable, but fixing it would mean overhauling a lot of code..." then that overhaul is going to happen in the new OS, not get backported to the old one.
So if you want things to be reliable and work well, you generally don't want to be on the bleeding edge (where things aren't tested well yet), but you also don't want to fall too far behind (where nobody is bothering to test anymore). And you know this too, I'm sure. If you're running Linux, you probably don't want to be running production servers on the kernel released yesterday, but you also probably don't want to be running them on the kernel released 12 years ago.
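As a toy illustration of that "middle of the road" policy, here's a minimal Python sketch that checks whether the running kernel sits inside a supported window. The window boundaries are hypothetical placeholders (not anything a distro actually publishes in this form), and the parsing assumes a Linux-style release string.

```python
# A minimal sketch of the "not too new, not too old" idea: flag a
# kernel that's outside a supported window. The boundaries below
# are hypothetical -- in practice you'd take them from your
# distro's support lifecycle.

import platform

OLDEST_SUPPORTED = (3, 2)   # hypothetical: older than this, nobody tests anymore
NEWEST_TRUSTED = (3, 10)    # hypothetical: newer than this is "bleeding edge"

def kernel_version():
    """Return the running kernel's (major, minor) as a tuple of ints.

    Assumes a Linux-style release string like "3.8.0-35-generic".
    """
    release = platform.release()
    major, minor = release.split(".")[:2]
    return int(major), int(minor.split("-")[0])

version = kernel_version()
if version < OLDEST_SUPPORTED:
    print(f"Kernel {version} is too old: nobody is testing against it anymore.")
elif version > NEWEST_TRUSTED:
    print(f"Kernel {version} is bleeding edge: not widely tested yet.")
else:
    print(f"Kernel {version} is in the well-trodden middle.")
```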
There are more reasons than these, but these are good enough.