Overclocking by 1-5% gives you how much of a performance benefit, really? You're far more likely to get a decent speed increase simply from better drivers and more efficiently written games.
As for the car analogy: just because your car's tachometer goes up to 10,000 RPM doesn't mean you should hammer it there constantly, or even ever. A racing driver would never keep his car running at full tilt, or even try to get there, because the simple fact is that when you do, the whole thing becomes unstable and the engine generally starts to burn. Formula 1 engines are put through rigorous testing to make sure they can actually deliver the performance they need to, but once they meet that level, the engineers push further to see when the engine will die. Then they work out the safe level, and the driver knows it exactly.
What you never see, though, are the results of this, either for the car engine (trade secret) or from the tests at the fab. So YOU have no clue whatsoever what that safe limit really is.
Being in the electronics industry, I fully understand WHY these chips go through bin sorts and so on, and WHY you shouldn't push them past their ratings. A good example is the chips Apple used in their late-model PowerBooks. The MPC7447A chip they were using NEVER shipped at higher than a 1.4GHz clock speed. Freescale actually specially sorted out chips which would run at 20% greater voltage and 1.66GHz, at the cost of only being specified for a 5-year (instead of the standard 10-year) running life at a junction temperature of 85C (standard 105C). They're still laser marked on the die as 1.4GHz chips... this is a case where overclocking is okay, in fact even sanctioned. But they draw a ton more power, generate a ridiculous amount of extra heat, and require a cooling system to match. That's the price of 233MHz of extra processor performance... barely 10% in the real world.
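To put those numbers in rough perspective, here's a quick back-of-the-envelope sketch. The 1.4GHz and 1.66GHz figures come from the MPC7447A example above; the fraction of runtime that actually scales with core clock is a made-up assumption (memory and bus speeds don't change when you raise the core clock, which is why an ~18% clock bump lands near 10% in practice):

```python
# Rough illustration: clock-speed increase vs. real-world speedup.
# 1.40 / 1.66 GHz are from the MPC7447A example; the core-bound
# fraction below is a purely hypothetical number for illustration.

base_clock = 1.40   # GHz, standard rated speed
oc_clock   = 1.66   # GHz, specially binned speed

clock_gain = (oc_clock - base_clock) / base_clock
print(f"Clock-speed increase: {clock_gain:.1%}")   # ~18.6%

# Amdahl's-law-style estimate: assume only ~55% of runtime scales
# with the core clock; the rest is bound by memory and bus speed.
core_bound_fraction = 0.55
speedup = 1 / ((1 - core_bound_fraction)
               + core_bound_fraction / (1 + clock_gain))
print(f"Estimated real-world speedup: {speedup - 1:.1%}")  # ~9-10%
```

Tweak the core-bound fraction and you move the result a little, but the point stands: a big clock bump translates into a much smaller real-world gain.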
Get a CPU or GPU that says "1.0GHz" on it, and yes, it may run at 1.1GHz with a bit of tweaking; it may even run at 1.5GHz if you are clever with cooling. It may be that your graphics card is one of the lucky ones that can take a good 5% or even 10% clock speed increase, or your RAM may overclock FAR past the JEDEC spec it was designed to meet (though the vast majority of RAM is designed to meet that spec within very strict limits, to the point that manufacturers use older processes and cheaper production techniques and still hit it within 1% tolerance, IF THAT, because that is exactly what the JEDEC standard requires). But chances are your chip has been validated, tested, and badged at that clock speed and voltage for the simple reason that it will not be reliable at anything higher, as confirmed by the burn-in at the fab.
Since you simply cannot tell what the actual tolerance is (and those "render something real fast until it starts corrupting the display" tests actually serve to damage the chip), it may well be that your chip is only capable of running 0.5% past its rated value, or it could be the lucky 10% chip. You're taking a big risk in even trying, and in the end, gaining half a frame per second in some game isn't worth losing a $300 graphics card over. So you want $3 of extra bang? $30? Come on...
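The dollar figures above ($3 or $30 of "extra bang" from a $300 card) make for a simple expected-value sketch. The failure probability here is entirely hypothetical, just to show the shape of the trade-off:

```python
# Back-of-the-envelope risk/reward for overclocking a graphics card.
# The $300 card price and 1% gain are from the post above; the
# chance of killing the card is a made-up number for illustration.

card_price = 300.0      # USD
perf_gain = 0.01        # a 1% overclock: the "$3 extra bang" case
value_of_gain = card_price * perf_gain

p_kill_card = 0.05      # hypothetical: 5% chance you eventually fry it
expected_loss = card_price * p_kill_card

print(f"Value of the extra performance: ${value_of_gain:.2f}")  # $3.00
print(f"Expected cost of the risk:      ${expected_loss:.2f}")  # $15.00
```

Under these (admittedly invented) numbers, the expected loss dwarfs the value of the gain; the bet only breaks even if the chance of killing the card is smaller than the fractional performance gain itself.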
As for what computer I own: it's a stock Asus P4P800, which is perfectly good for overclocking, a Pentium 4 HT 2.4GHz which is fine at 2.4GHz, and an ATI Radeon X800 with OverDrive turned *off*. I also have a VIA EPIA, a PowerPC G4 (not a Mac, just a board with a G4 on it), a PowerPC MPC8641D (which has all the switches to let me configure the entire gamut of bus ratios and core clock speeds), a bunch of other embedded chips (PPC, ARM), and a Vaio laptop. I've never had a Mac in my life...