
I back up personal files...

Displaying poll results.
  5341 votes / 18%
  2684 votes / 9%
  1873 votes / 6%
  660 votes / 2%
Whenever I remember
  5982 votes / 20%
When I save something important
  2960 votes / 10%
  2007 votes / 6%
Shortly after my computer breaks
  7875 votes / 26%
29382 total votes.
  • Don't complain about lack of options. You've got to pick a few when you do multiple choice. Those are the breaks.
  • Feel free to suggest poll ideas if you're feeling creative. I'd strongly suggest reading the past polls first.
  • This whole thing is wildly inaccurate. Rounding errors, ballot stuffers, dynamic IPs, firewalls. If you're using these numbers to do anything important, you're insane.
This discussion has been archived. No new comments can be posted.

I back up personal files...

  • by captainpanic (1173915) on Friday April 08, 2011 @07:03AM (#35755700)

    I don't have many personal files, but emails get backed up automatically. Photographs get backed up once I download them from my camera (and sometimes I even burn them onto a DVD). And other personal files often aren't important, and will only be backed up if I find the time.

    -- Of course I will complain about lack of options.

    • by xaxa (988988)

      When I turn on my personal PC it's backed up to my "server" using rsync, and the server backs up my PC (these computers are 150km from each other).

      Rsync's hard links option makes it easy to take a backup every day of only changed files. I keep any backup that is (of the most recent five) | (less than two weeks old) | (taken on the 5th, 15th or 25th).
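      The retention rule above can be sketched as a small shell function (a hypothetical helper, not the poster's actual script; GNU date's -d flag is assumed):

      ```shell
      # Decide whether a dated backup directory should be kept, following the
      # rule above: one of the most recent five | less than two weeks old |
      # taken on the 5th, 15th or 25th. Arguments: recency rank (1 = newest),
      # backup date (YYYY-MM-DD), today's date. Relies on GNU date -d.
      keep_backup() {
          local rank=$1 backup=$2 today=$3
          [ "$rank" -le 5 ] && return 0                       # one of the newest five
          local age=$(( ( $(date -d "$today" +%s) - $(date -d "$backup" +%s) ) / 86400 ))
          [ "$age" -lt 14 ] && return 0                       # under two weeks old
          case "${backup##*-}" in 05|15|25) return 0 ;; esac  # 5th/15th/25th of the month
          return 1
      }
      ```

      A nightly cron job could then loop over the dated directories and delete any for which keep_backup returns non-zero.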

      I also have a PC in the garage (hidden in the corner) that has its BIOS set to turn itself on on the 1st of the month and back up everything from the server (this uses the same scripts as the other two PCs). This provides an off-line backup.

      • by xaxa (988988)

        In case anyone finds it useful, here's the most important bit. It's written in Zsh; you'll probably need a few extra lines in something else.

        It takes a backup using rsync, creating hard links to any files that haven't changed.

        # array of directories, ordered by modification time (most recent first), if none found don't error
        older=( $backups/$user/*(N/om) )

        rsync --archive --recursive --fuzzy --link-dest=${^older[1,20]} $user@$remotehost:/ $backups/$user/$date/

        The --link-dest=${^older[1,20]} expands to --link-dest=DIR for each of the (up to 20) most recent backup directories, so rsync can hard-link unchanged files against any of them.
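        For anyone porting this away from zsh, the per-element expansion can be rebuilt by hand. A rough bash equivalent (a sketch; it assumes the backup directories have clean, space-free names such as dates):

        ```shell
        # Emit one --link-dest=DIR flag per backup directory, newest first,
        # capped at 20 (rsync accepts at most 20 --link-dest directories).
        build_link_dests() {
            local backups=$1 n=0 d
            for d in $(ls -1td "$backups"/*/ 2>/dev/null); do  # -t: newest first
                printf -- '--link-dest=%s\n' "$d"
                n=$((n + 1))
                [ "$n" -ge 20 ] && break
            done
        }
        ```

        The flags would then be spliced into the rsync command line, e.g. rsync --archive --fuzzy $(build_link_dests "$backups/$user") ...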

        • by Anrego (830717) *

          Or for those who don't want to roll their own... there is also rsnapshot.

          I have multiple layers of backup. The primary machines involved are my desktop (which has a small, barely filled SSD), my 12TB file server, and a secondary "backup" file server.

          A full backup of my desktop is done weekly to the file server. Key directories (containing "irreplaceable" stuff) on my file server are backed up at various intervals to one of two backup drives. Basically I leave one backup drive connected, and have the other els

          • by Pieroxy (222434)

            I have set up an offline automated backup with a couple of friends. Here is how it works:

            First off, locally I have a server that will back up on a USB drive critical stuff out of various machines on my network (using rsync). This includes SVN repo, docs such as my resume, pictures, mp3s and a snapshot of my web server HDD including a mysql dump. Videos are out for obvious reasons. I don't feel like rsyncing 100TB over my DSL line.

            I have two friends that have each a linux machine on which I have an account.

        • Here's mine, in bash/sh. Use it by creating an rsync include/exclude file named fqdn.rsync, then run fqdn.rsync. GPL v2


          PROG="$( basename "$0" )"

          run_backup() {
              DATE="$( date +%Y%m%d_%H%M%S )"
              rm -rf -- "$BACKUP_DIR/$HOST/tmp/"
              mkdir -p -- "$BACKUP_DIR/$HOST/tmp/"

        • by 1s44c (552956)

          I used much the same until I discovered BackupPC. It uses rsync and hard links, and it compresses identical files.

          It's pretty easy and flexible.

      • by operagost (62405)

        I also have a PC in the garage (hidden in the corner) that has its BIOS set to turn itself on on the 1st of the month and back up everything from the server (this uses the same scripts as the other two PCs). This provides an off-line backup.

        This is a clever use of that BIOS feature!

      • by rcpitt (711863)
        My "PC" (Linux Workstation) doesn't control the backup - the backup machine does - just like it controls the backup for my public-facing server at a colo and several servers of friends and customers (those who don't have their own). All the systems are on 24x7 anyway.

        rsync and at least 10 iterations retained (up to 999 for some directory trees) on RAID 10 arrays - with offline backups (snapshots) on older drives rotated out of service from time to time.

        With Terabyte drives at $50 raw, it simply does not

      • That remote backup trick is quite clever. I do the same rsync setup as you (using rsnapshot), but that's mostly to protect me from myself, since the most likely need for a backup is me stupidly overwriting something. I figure most anything that takes that out will probably kill both machines, so for my offline backup I make a copy to an external hard drive and put it in the fire safe every 6 months (when I change my clocks, and check my smoke alarms). One of these days I'll have to scrounge up a spare ma

    • by zrbyte (1666979)

      Multiple answers?

      You mean you usually only vote once on a /. poll?

    • by DamienRBlack (1165691) on Friday April 08, 2011 @02:29PM (#35761786)
      I back up my files every time this poll shows up on Slashdot. Keeps me fairly safe. I guess it is time for another round of backing up.
  • Most of my files are backed up instantly with an encrypted Dropbox folder...
  • by jhigh (657789)
    I think that the concept of personal backups is far different than what it used to be. I can remember backing up all of my personal files, photos, etc. to DVDs about once a month a few years ago. It was something that I had to make time for, remember to do, etc. Today, though, everything that I care about is on a 1TB external hard drive. While I realize that this doesn't technically count as a backup, I think that for most home users it fits the bill. I am able to reinstall my operating system or replac
    • While I realize that this doesn't technically count as a backup

      I think even non-technically, this does not count as a backup as the data is only stored once!

      • by hedwards (940851)

        Indeed. What I've found works well is using Crashplan to back my data up to an eSATA disk on my desk and make an identical copy online. It takes quite a bit of time to do that, but it means that I've got a backup locally and if something should happen to that, I've got one online which can either be shipped to me or I can just download.

        It's not perfect, but the amount of important files likely to be lost is basically zero; it would take something unimaginably significant to cause me to

    • I use older external hard drives as a backup. So I've got my current 1TB external data drive, and use my older 500GB to back up the important stuff off that. Once I get around to getting a bigger drive, the 1TB becomes the backup drive, and the 500GB goes idle.

      • I do the reverse. My newest, largest drive is the backup. Next biggest is /media which has my music and video, then the smallest is /home. As I get new larger-capacity drives, I rotate them in as the new backup and the old backup becomes /media, old /media becomes /home.

        That way I can back up all of my data plus system stuff (/usr/local and /etc).

        I buy the new drives in pairs and use one as a weekly off-line.

        Also, rsync FTW!

    • by Pieroxy (222434)

      If everything important is on only ONE drive, you will invariably lose everything one day. The fact that the drive is external is actually an aggravating factor, because it can be dropped.

      The principle of a backup is to have stuff stored in at least two different places. This accounts for hardware failures and user mistakes. To be safe from bigger disasters (electrical surge, fire, robbery) you need an off site backup. This is another copy of your data stored in another place (like in your parent's home for
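            The two-places principle comes down to a single rsync invocation. A self-contained sketch (paths invented; a local directory stands in for what would really be an offsite host reachable over ssh):

            ```shell
            src=$(mktemp -d)    # stand-in for the data being protected
            dest=$(mktemp -d)   # stand-in for the second location; offsite in
                                # real use, e.g. friend-host:backups/ over ssh
            echo "resume contents" > "$src/resume.txt"

            # -a preserves times/permissions, -z compresses in transit,
            # --delete makes the copy mirror removals as well as additions
            rsync -az --delete "$src/" "$dest/"
            ```

            Pointing dest at an ssh target instead of a local path turns the same command into the offsite copy.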

      • My ISP offers a 500GB external hard drive for ~$10/month, which includes full continuous backup to their servers. Anything you put on that disk can be mirrored off-site automatically and can be accessed from anywhere through a secure website if you enable it.

        I got it and haven't had to worry about backups since, all my stuff is backed up on the external drive and on my ISPs servers. That's at least three levels of backup if you include the nightly tape backups at their datacenter. Pretty sweet service for w

    • by rcpitt (711863)
      backup = 2nd copy (at least)

      having your only copy on an external - no matter what it is - is silly. Today's drive prices are so low that unless your data has 0 value, you should have a second copy on a second drive.

      local company near here is selling USB->SATA external bays for under $10 each, and a 1 TB drive is $49.95 at local computer store (7200RPM Seagate)

      bottom line - today it is only sloth and/or ignorance that prevents anyone/everyone from having a backup solution that is offline and portable.

      • by 1s44c (552956)

        bottom line - today it is only sloth and/or ignorance that prevents anyone/everyone from having a backup solution that is offline and portable.

        Or stupidity. Some people blindly refuse to take backups no matter what. I think these are the same kind of people who take pride in their ignorance.

        I'm not referring to the poster above BTW.

    • by 1s44c (552956)

      Today, though, everything that I care about is on a 1TB external hard drive.

      I had a 1TB external USB drive, the disk broke.
      I also had a 1.5TB external USB drive, that disk broke as well.
      And I had a 60GB internal SSD; guess what? It failed.
      I also have a box of 40 or so failed disks from various workstations and servers at work.

      You can't trust a single disk with anything you want to keep. Sooner or later they fail.

        • The moral of the story is: if you want to keep your data safe, for God's sake don't give it to 1s44c.

  • Daily (Score:5, Informative)

    by XxtraLarGe (551297) on Friday April 08, 2011 @07:30AM (#35755852) Journal
    It's so easy these days, why wouldn't you? Time Machine on the Mac makes this type of thing pretty simple, and external storage is starting to get pretty cheap: 2-terabyte external drives for under $100. I imagine similar ease of use backup solutions exist for Linux & Windows.
    • by mlts (1038732) *

      Windows backup abilities vary; Windows Server 2008 and Windows Server 2008 R2 have decent backup utilities. Windows 7 can back up both images and files in the top-tier versions; the lower tiers only save files off.

      Linux, you can use one of many solutions.

      Because Windows backup programs vary, I highly recommend a third-party utility. Acronis is decent; however, I always push Retrospect because it has the best functionality in a backup program this side of NetBackup, TSM, or Networker. Only downsid

      • by afidel (530433)
        I'm a big fan of cloud based backup solutions like Carbonite/Crashplan/Mozy/Windows Live Mesh. They all have some drawbacks but for not a whole lot of cash you can backup your important stuff automatically to something that's not going to burn up if you have a house fire or get trashed in a flood or blown away by a twister, etc.
        • by mlts (1038732) *

          They are decent solutions, but they are not magical. For most home users, having an external drive with Time Machine coupled with a cloud backup solution for documents [1] is good enough.

          [1]: I would recommend using the encryption feature the cloud provider has. For example, I use a keyfile with Mozy, and the keyfile is stored encrypted on a private Linux VPN server. This way, if I have a complete disaster, I fetch the Mozy keyfile, decrypt it with gpg, then use it to decrypt all the rest of my

        • by rcpitt (711863)
          This is what friends/relatives (and encryption) are for.

          the cloud is a good place to put stuff you want Big Brother to dig through - or lose because the cloud owner goes tits-up.

          If you really want to store your stuff "in the cloud" then encrypt it into a RAR archive, call it some sexy name and post it to an XXX newsgroup - it will cycle for a while at least - re-post weekly - no storage charges ;)

      • by 1s44c (552956)

        Have you actually used Symantec NetBackup or EMC NetWorker?

        I have and they both caused no end of pain. They are old technology at a premium price.

        Rsync and a load of cheap disk is better in just about every respect. BackupPC is better at everything but you will end up with very many instances.

    • Time Machine is awesome. My always on mac mini media server backs up to a hard drive attached to the airport router. It's nice knowing all my video/photos/music is backed up every hour with zero intervention.
    • by Kulaid982 (704089)
      I agree that Time Machine is very simple, straightforward and handy. I've used it to both restore files and settings after a hard drive replacement, as well as find an older version of a current file.

      I currently have a Mac Mini Server with a 2TB external drive. Time Machine backups of the Mac Mini Server go on the drive. Additionally, my wife and I have user accounts on the server, so our Macbook Air/Pro laptops are also "Time Machined" to the external drive thru the server. The only worries I h
      • Time Machine probably would not work over the internet (I guess it might if you set up a VPN; it just needs to be able to see the other system via broadcast over the network).

        For what you have though, I think instead I'd set up a simple rsync script between your home computer and the remote one that backed up your TM drive. Or possibly Super Duper or Carbon Copy Cloner would work over a drive you had mounted over the internet or pointed at a remote IP; both have free trials.

        What I do is somewhat more manu

    • by Phat_Tony (661117)
      I agree. And it has to be easy, or most people won't do it. Anything that requires manual input is likely to fail. And I think most people radically underestimate the probability and consequences of losing all their data if they don't back up.

      I have a script nightly mirror my entire drive to a bootable copy, and then I ALSO have Mozy for all my files, because my mirror won't do me much good in case of fire / flood / break-in where they steal the computer and backup drive.

      And doing those two things is
    • It's funny. I have my home machine set up to back up nightly (rotated into weekly and monthly sets) using rsnapshot. It operates on a similar principle to Time Machine: it uses hard links to perform incremental backups while keeping a full, live directory structure, but unlike OS X, Linux doesn't support directory hardlinks, so it is somewhat less efficient. I just spent a few hours researching my options and buying a big drive on newegg, then a few more hours learning how rsnapshot worked and setting up the c

    • Time Machine cost me around 3 months of work around 2 years ago.

      Seemed that it decided that going back 3 months and forgetting everything that had happened was a great idea.

      No way I'd even use Time Machine again, let alone trust it for backups.

  • Direct work from the laptop/desktop is backed up on an infrequent basis (about once a week) so that I can manually trim the fat from the home directory before backing up each time. Once backups are taken from the laptop/desktop they go into a RAID 5 array on a stable, local server (yes, yes, redundancy is not a backup). A second server with a RAID 5 array then performs rdiff-backup snapshots of the data each night, allowing rollbacks to day one. Personal data which would be difficult or impossible to repro
    • Redundancy most certainly IS a backup! The question is what kind of disaster do you want to be able to recover from? Copying a file to another directory will likely protect you against accidental deletions or typos. Copying to another hard drive will protect you against drive failure (and RAID 5 effectively copies your data across multiple drives). Copying to another machine will protect you from a motherboard or power supply failure or viral attack. Offsite backup will protect you against fire, theft,
      • by mlts (1038732) *

        Redundancy means your MTTR is almost 0 (assuming a hot pluggable drive). A RAID array will happily copy malware, delete requests, or corrupted files across the drive array.

        Here is one recommended approach:

        First, your RAID array, mirrored volumes for your boot partition, RAID 5 or 6 for everything else. This protects you against hard disk failure.

        Second, an external drive or array which data gets copied to every so often. This protects you against controller failure, deleted files, corrupted directories,

  • by The Grim Reefer2 (1195989) on Friday April 08, 2011 @08:06AM (#35756146)

    Who needs to be able to check more than one of these options? I have an external HD that backs up every night. Then two others that are done on a monthly basis, one of which gets swapped with a drive in a safe deposit box. As well as one that also goes in the safe deposit box on a more or less yearly basis.

    • *raises hand*

      Although, I consider the one in the fire safe to be the most important, and I can't even answer for that since it's every 6 months.

  • All my files are under version control, you insensitive clod!

    For a while now, I've been using a private git repository on a VPS (~$20/month). The only things I could possibly lose are only a few days old - the latest revisions. Worst possible case: I have to reinstall OS and a few free programs and lose ~2-3 days work. Since it's a private repo I can be sloppy and commit half-working branches, which I do when a patch is growing unwieldy.
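    The setup described takes only a few commands. A sketch with a local path standing in for the VPS (in real use the remote would be an ssh URL such as me@vps.example.com:backup.git; all names here are invented):

    ```shell
    work=$(mktemp -d)               # stand-in for the real working tree
    remote=$(mktemp -d)/backup.git  # on the VPS: ssh vps 'git init --bare backup.git'

    git init --bare "$remote"

    cd "$work"
    git init
    git config user.email "me@example.com"  # a fresh repo needs an identity to commit
    git config "me"
    echo "half-working branch" > notes.txt
    git add -A
    git commit -q -m "snapshot $(date +%F)"

    git remote add offsite "$remote"
    git push -q offsite HEAD                # push the current branch, whatever its name
    ```

    Committing and pushing every few days is what bounds the worst case to a few days' work.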

  • Important files are saved directly into my Dropbox. Backed up in the cloud and on my other systems the moment I connect to the Internet. Everything else gets backed up once a week.
  • Missing Option (Score:5, Insightful)

    by Even on Slashdot FOE (1870208) on Friday April 08, 2011 @08:31AM (#35756414)


    Missing Option: Whenever I get a new hard drive.

    Over the years (usually when something breaks or when I build a new machine) I'd go get what was then considered a large drive and copy all my stuff to it. Depending on circumstances that drive would then become my main drive and the others would go offline or get mounted in a removable tray. The last drive I bought was 1TB, and I'm looking at multiple drive external boxes.

    The only problem is now I have trees of stuff like /storage/oldhdc/home/camperdav
    • by mlts (1038732) *

      This makes me wonder why there are no filesystems that dedupe on a block basis (unless one runs Solaris, or has an EMC SAN at home), or even by file. For stuff like that (which I tend to do as well), it would be nice to be able to shovel the junk onto the drive, and if there is more than one copy of a file, who cares.

      • Well, you could use zfs, which gets you part way there with copy-on-write (and a variety of other awesomeness). For one good example, look at Nexenta - which is the Ubuntu userspace on a Solaris kernel with zfs support and some zfs-enhanced utils. To get all the way there, you almost need a periodic cleanup process - as just a slight offset at the beginning of a file screws up block-based detection of data which is otherwise duplicated but just not aligned to blocks the same way.

        Or, not at the block level

  • by DarthVain (724186) on Friday April 08, 2011 @08:32AM (#35756426)

    I don't need to, it's securely on the Cloud!

  • Having worked in IT for 11 years and now as an ISP admin, I preach backups to my customers and fellow co-workers. However, at home, I have trouble practicing what I preach.

    • by hedwards (940851)

      Depending upon your OS, I'd go with Backblaze or CrashPlan; both are quite affordable and do pretty much all the work for you. Apart from periodically verifying that it's working, there's nothing to do. I give CrashPlan the edge just because it allows you to back up locally as well as online.

  • So far, I would say I don't really have a single piece of data important enough that I need a backup.

  • Tape Changer (Score:4, Interesting)

    by DaveAtFraud (460127) on Friday April 08, 2011 @09:48AM (#35757428) Homepage Journal

    I picked up a used 6 x 12GB tape changer (HP C1557A) when one of the local "dot coms" became a "dot bomb." I think the tape changer cost me all of about $100. I bought enough tapes to set up a six week tape cycle plus a few spares. I think the tapes cost more than the tape changer. Amanda takes care of the rest by backing up the server plus my desktop and my wife's system five days a week (not quite daily). Since 12GB isn't a whole lot of space any more, I had to set up exclusions for things like copies of DVDs, downloaded ISOs, etc.

    Oh yeah, I also run RAID 1 on the server but RAID doesn't protect you from intentionally deleting the wrong file(s).


    • by mlts (1038732) *

      The best of all worlds would be to have a dedicated machine with a RAID array that all your machines are dumped to in toto. The backup server would then locally copy whatever data should go onto tapes from its partitions to the tape drives.

      This provides not just tape protection of critical data, but also protects the ISOs and such as well with a pseudo D2D2T setup.

    • +1 obscure
  • by e9th (652576) <e9th.tupodex@com> on Friday April 08, 2011 @10:13AM (#35757896)
    Thanks, poll, for reminding me!
  • by Tom (822)

    Time Machine :-)
    Automatic incremental backup every hour. I've not yet had a total system crash, but I've used it quite a few times to restore accidentally deleted files, or even as a revision control system to revert a file to a previous state.

    Backups should be fully automated and trivially easy. I never did backups for my personal stuff before due to the hassle factor.

    • After putting in a new drive and installing OS X, the installer asked if I would like to restore my Time Machine backup. With just a click it restored everything (user accounts, data and applications) exactly as it was before.

      My primary drive dying literally only cost me about four hours of downtime. 20 min to replace the drive (iMac, many Torx screws, first time), about an hour to install the OS, and the rest waiting for a few hundred gigabytes to travel over USB from the backup drive.

    • Time Machine is great. I too had trouble with doing backups before I got a macbook pro, and I have lost things due to hard drive failure (nothing extremely important, but things I wish I still had) - even that wasn't enough to make me particularly diligent. I'd try one solution or another, and never end up doing more than one backup because of the hassle.

      I'm not sure how well it handles backing up external hard drives, though. I store a lot of data (mainly photos, as a prolific amateur photographer) on exte

  • Or as continuously as Time Machine runs the incrementals.

  • Incremental backups every hour for a week, every week for 6 months, and every month for a year and a half.

    Love the Time Machine. I had much worse etiquette before I hooked up an eSATA drive and it "just works". Sure, I'm perfectly capable of scripting incremental backups using rsync or something, but Time Machine actually does it about as elegantly as I would anyway because it hard-links identical files, so each backup is a full root directory. And the interface is top-notch.

  • daily ... (Score:4, Informative)

    by Tux2000 (523259) <(alexander) (at) (> on Friday April 08, 2011 @12:06PM (#35759790) Homepage Journal

    My backup runs every night, using cron, rsync and some shell scripts. It writes all data on my server (mails, home directories, SVN repository, database, ...) to an external disk (eSATA). Hard links across 10 file trees let me restore deleted or modified files for up to 10 days. Average backup time is less than 1 hour after the initial backup (which runs for several hours).

    Workstations and laptops contain the operating systems (can be re-installed from CD/DVD/PXE), files copied from the server (local copies of mails, SVN sandboxes, ...), and some temporary files not worth backing up. So, I don't need to back up those machines.
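    The nightly schedule could be wired up with a crontab line along these lines (time, script name and log path are illustrative, not the poster's):

    ```shell
    # run the backup script nightly at 02:00; cron fields are
    # minute, hour, day-of-month, month, day-of-week
    0 2 * * *  /usr/local/bin/ >> /var/log/backup.log 2>&1
    ```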


  • I went to update my wife's iPhone to 4.3.1 from 4.1 the other day. It locked up into 'recovery mode' and she had not backed up pictures/videos since October. Thankfully I was able to find a tool, iRecovery, which allowed me to get it to boot the old firmware again and mount the USB partition to pull all the pictures of the kids from Christmas, etc. Backing up is now more important to her.
    • That's a shame. Backing up and restoring the iPhone is so easy it's a joke. With the exception of jailbroken apps, the entire phone can be brought back from a clean slate in the space of about 20 minutes - to the point where there's no reason to not wipe the phone and 'restore' to the new software instead of upgrading it, which seems to improve the performance. All there is to it is to just plug the phone into the computer a few minutes a week...

      No, I'm not a fanboi, but the iPhone is the only smartphone I'

  • Only when I get a new computer and I need to transfer everything over.

    I like to use things as efficiently as possible and I just feel like backing up is a waste of space. So I tend to back stuff up and delete the original, so it ends up being one copy anyway.

    I mean what are the chances that something would brea

  • I keep my personal files encrypted on Dropbox, which then shares the archive with several other machines. It updates on every change and Dropbox journals the file.

    If you want 2GB free space on Dropbox plus an extra 250 MB for using my referral link, here it is: []
    I don't have any other relationship to Dropbox, I'm just a happy customer.

  • I have a 'personal documents' folder which fits within my Dropbox limit, so those files are on the cloud and multiple computers. The only place I use files other than from there is on my main desktop at home, which has an external drive holding a Time Machine backup (one update/day so as not to age the drive as fast) and a nightly SuperDuper backup (SuperDuper is utter win). Finally, I have a 4TB RAID6 NAS at home where my non-critical files (ripped DVDs, ripped music) are stored, as well as an automatic we

  • Weekly and automatically. The latter is by far the most important part. Designing a system without taking human nature into account guarantees eventual failure.

  • Since "whenever I remember" is the most popular option, would anyone like a 99-cent app for their smartphone that reminds them to do backups?

    • Wouldn't the built-in alarm system do the job just fine? Still, automated systems are far preferable, though checking from time to time to make sure they still work is wise.

  • has a pretty nice piece of software (Win, Mac, Linux) that does a fine job of regular backups to locally available storage or (even better) remote storage on another machine running the software. They make their money trying to sell you space on their remote server farm, but the free features work great for getting data from the home to the office and back again, or backing up your parents' machines onto your machine and your machine to theirs. I think the free version is limited to a daily b

  • I found IdleBackup a while ago and fell in love with it. It's no longer being actively developed, but it works OK on Win7 x64. I tell it which directories to back up, and it saves copies of any new or updated files to my NAS automatically. My next step is to set up a remote solution with my parents' and/or friends' houses to get offsite backup as well, but this at least protects my important files from anything that could happen to them on my PC.

  • I use Mozy Home. It's about $.17/day for my 50 GB plan with a biennial subscription. Advantages are that even if my house were burnt to cinders, I'd still be able to recover, and most backups are quick, unless I've dumped a few hundred new photos or something. Disadvantages are cost and recovery speed over a slow DSL connection.
  • Only wimps use tape backup: real men just upload their important stuff on ftp, and let the rest of the world mirror it ;) Torvalds, Linus (1996-07-20). Post to newsgroup
  • My main data HD is mirrored, so I back things up the moment I change something.

  • Network block devices, RAID1 with the "write-mostly" option, an stunnel pipe, and a loopback-device-mounted single file on a remote server. :)

    And daily backups to that RAID-1 device.

  • Meh, missing option : Continuously.

    Everything is mirrored. The mirror has a snapshot every hour on the hour. The snapshots are ALL backed up to external drives.


  • Since I reformat or re-install hardware relatively often, I don't have any fancy scripted backup software.

    About once a month, I fire up Ztree (a modern Xtree), expand the branches of about 5 different important folders, tag all the files and copy them to an eSATA drive. It's about 150GB of stuff which I class as important.

    When it's finished, I then copy that to another external drive and basically have 2 backups. 1TB / 2TB USB or ESATA drives are very cheap, it's worth it.

    I do actually run an Acroni

  • Do things like Dropbox count as backup? I have automatic processes like that running, but a real, thorough backup with offsite copies is a lot rarer...

  • by smash (1351)
    ... via time machine :)
  • by gig (78408)

    Macs back up hourly, automatically, if you just give them access to a backup disk.

    I also back up daily to an offsite service.

  • With btrfs, I'm taking snapshots every 5 minutes and keeping them for 2 hours, and hourly snapshots for 2 days. This was because I noticed that probably half my data recovery needs were because Firefox or Chromium got wedged and lost all my open tabs, so going back to the previous daily was ok, but it was better if I had one more recent than that. Of course, I also rsync off to a remote server daily, but the 5 minute backups solve 99% of my recovery needs.
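    The rotation described above (5-minute snapshots kept for 2 hours, hourly ones for 2 days) boils down to a keep/prune decision. A sketch of that decision only — the real snapshot and cleanup steps would be btrfs subvolume snapshot -r and btrfs subvolume delete, which need an actual btrfs filesystem, so they are left out:

    ```shell
    # Arguments: snapshot creation time and current time, both in epoch seconds.
    # Returns success (0) if the snapshot should be kept.
    keep_snapshot() {
        local snap=$1 now=$2
        local age=$(( now - snap ))
        [ "$age" -le $(( 2 * 3600 )) ] && return 0   # every snapshot lives 2 hours
        # on-the-hour snapshots live 2 days
        [ $(( snap % 3600 )) -eq 0 ] && [ "$age" -le $(( 2 * 86400 )) ] && return 0
        return 1
    }
    ```

    A cron job could run this over the snapshot directory every few minutes, deleting whatever the predicate rejects.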
  • Three people in the house and three laptops. One macbook and two ubuntu machines. My server runs netbsd and has a disk big enough to store our library and all three backups. My backup script runs at 0800 UT every day and usually completes within five minutes or so. I have a separate backup script which uses rsync to copy the entire /home directory tree from the server on to a portable hard disk attached to one of the laptops. The portable disk is usually stored off site.

  • Time Machine and a Drobo are the best combination. Why? Because they work to prevent data corruption and loss of disks.

    Just the other day my iPhoto library got corrupted somehow (765GB of pictures). I rolled back with the Drobo, and now it's fine. If I had a normal backup program the backup would be just as corrupted as the iPhoto library and I'd be SOL. Thanks to Time Machine and 5TB worth of Drobo I only had to deal with 12 hours of restore instead of weeks worth of library reconstruction (I keep backups
