I have developed scripts that manage my backups. Because I'm always experimenting with computer systems and apps, I make full (100%) backups, every day, for every computer. Each computer makes its own scheduled backup, copies it to one central system, then shuts down.
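To make that per-machine step concrete, here is a minimal CMD-style sketch of such a nightly job. The staging path, the CENTRAL share name, and the make_full_backup.cmd imaging step are placeholders I've assumed for illustration; the actual scripts aren't shown here.

    @echo off
    rem Hypothetical sketch only -- the share name, staging path, and the
    rem imaging step are assumptions, not the author's actual script.
    set NAME=%COMPUTERNAME%
    set STAGING=D:\Staging\%NAME%

    rem Make tonight's full backup into the staging folder.
    rem (Placeholder: call whatever imaging/backup tool is actually in use.)
    call make_full_backup.cmd "%STAGING%"

    rem Copy the finished backup to the central system, then power off.
    xcopy "%STAGING%" "\\CENTRAL\Incoming\%NAME%\" /E /I /Y
    shutdown /s /t 60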
In the wee hours, the central system (an old, low-power XP box) makes its own backup, and then copies ALL of that day's backups to an attached external 1 TB drive.
The central external drive holds a hierarchy of backups (e.g., P:\Backup\Backup\Backup). When each computer makes its backup, it starts a copying process. That process makes sure that any older backups for that specific, named system are pushed down one level in the queue and the oldest one is discarded; THEN I copy this evening's backup to that drive.
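A minimal sketch of that rotation step might look like the following. The drive letter and the three-deep layout come from the example above, but the folder names and the xcopy source are assumptions, not the actual script.

    @echo off
    rem Hypothetical rotation sketch -- paths and the source share are assumed.
    set ROOT=P:\Backup
    set NAME=%COMPUTERNAME%

    rem Make sure the nested hierarchy exists (creates intermediate folders).
    if not exist "%ROOT%\Backup\Backup" mkdir "%ROOT%\Backup\Backup"

    rem Discard the oldest copy for this named system...
    if exist "%ROOT%\Backup\Backup\%NAME%" rmdir /S /Q "%ROOT%\Backup\Backup\%NAME%"

    rem ...push the newer copies down one level in the queue...
    if exist "%ROOT%\Backup\%NAME%" move "%ROOT%\Backup\%NAME%" "%ROOT%\Backup\Backup\%NAME%"
    if exist "%ROOT%\%NAME%" move "%ROOT%\%NAME%" "%ROOT%\Backup\%NAME%"

    rem ...THEN copy this evening's backup into the newest slot.
    xcopy "\\CENTRAL\Incoming\%NAME%" "%ROOT%\%NAME%\" /E /I /Y

Because the older generations are just renamed in place, only tonight's backup is actually copied onto the drive; the rotation itself costs almost nothing.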
I have three 1 TB drives: one is connected to the central system and holds "this week's backups" (depending on how often I decide to change it); the next is the one most recently retired from service, kept nearby in case I have to go back several days or a week to find something; the third is stored in a safe place, off-site, so even if my building burned down, I'd still have a lot of backups I could use to restore new computers from scratch.
When last month's MS Windows Update fiasco struck, all I had to do was restore the C: partition on the affected machines from last night's backup, and I was back in business without a hitch.
Finally, the reason I wrote these scripts for commercial backup software is that if backups aren't completely automatic, they'll never get made, and then you won't have the critical data to recover when you need it. I've been thinking about reprogramming the CMD scripts in another language so I can commercialize them, because loss of critical business data (or even personal videos, photos, etc.) is still a problem for those who choose not to use up all their bandwidth on a "cloud" service (although that could easily be added). It may sound like overkill to some, but I almost NEVER lose my O.S., configurations, apps, or data.