Multiple answers (Score:3)
I don't have many personal files, but emails get backed up automatically. Photographs get backed up once I download them from my camera (and sometimes I even burn them onto a DVD). Other personal files often aren't important, and will only be backed up if I find the time.
-- Of course I will complain about lack of options.
Re:Multiple answers (Score:2)
When I turn on my personal PC it's backed up to my "server" using rsync, and the server backs up my PC (these computers are 150km from each other).
Rsync's hard-links option makes it easy to take a daily backup of only the changed files. I keep any backup that is among the most recent five, or less than two weeks old, or taken on the 5th, 15th or 25th.
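For the curious, the keep/prune rule above can be sketched as a small shell predicate; a pruning loop would call it for each backup directory and delete the ones it rejects. The directory naming (YYYY-MM-DD) and arguments are my assumptions, not the poster's actual setup:

```shell
# Sketch of the retention rule: keep a backup if it is one of the five
# most recent, OR less than two weeks old, OR taken on the 5th/15th/25th.
keep_backup() {
    dir=$1    # directory name, e.g. 2024-03-25 (assumed zero-padded)
    rank=$2   # 1 = most recent backup, 2 = next, ...
    age=$3    # age in days
    day=${dir##*-}                          # day-of-month from the name
    [ "$rank" -le 5 ] && return 0           # one of the most recent five
    [ "$age" -lt 14 ] && return 0           # less than two weeks old
    case $day in 05|15|25) return 0 ;; esac # taken on the 5th, 15th or 25th
    return 1                                # otherwise prune it
}
```

A cron job would then sort the backup directories by date, number them newest-first, and remove anything `keep_backup` rejects.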
I also have a PC in the garage (hidden in the corner) that has its BIOS set to turn itself on on the 1st of the month and back up everything from the server (this uses the same scripts as the other two PCs). This provides an off-line backup.
Re:Multiple answers (Score:2)
In case anyone finds it useful, here's the most important bit. It's written in zsh; in another shell you'll probably need a few extra lines.
It takes a backup using rsync, creating hard links to any files that haven't changed:
# array of directories, ordered by modification time (most recent first), if none found don't error
older=( $backups/$user/*(N/om) )
rsync --archive --recursive --fuzzy --link-dest=${^older[1,20]} $user@$remotehost:/ $backups/$user/$date/
The --link-dest=${^older[1,20]} expands to one --link-dest=DIR flag for each of the (up to twenty) most recent entries in the $older array.
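Since the poster notes other shells need a few extra lines, here is a rough bash translation of that expansion trick. The helper name and the usage paths are made up for illustration; 20 is the most --link-dest directories rsync accepts:

```shell
# Print one --link-dest=DIR flag per argument, capped at rsync's limit
# of 20 basis directories (bash/POSIX sh stand-in for zsh's ${^older[1,20]}).
build_link_dests() {
    i=0
    for d in "$@"; do
        [ "$i" -ge 20 ] && break
        printf -- '--link-dest=%s\n' "$d"
        i=$((i + 1))
    done
}
# Hypothetical usage (paths and host are placeholders, and this simple
# form assumes no whitespace in the directory names):
#   older=$(ls -1dt "$backups/$user"/*/)
#   rsync --archive --fuzzy $(build_link_dests $older) \
#       "$user@$remotehost:/" "$backups/$user/$date/"
```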
Re:Multiple answers (Score:3)
Or, for those who don't want to roll their own, there is also rsnapshot.
I have multiple layers of backup. The primary machines involved are my desktop (which has a small, barely filled SSD), my 12TB file server, and a secondary "backup" file server.
A full backup of my desktop is done weekly to the file server. Key directories (containing "irreplaceable" stuff) on my file server are backed up at various intervals to one of two backup drives. Basically I leave one backup drive connected, and have the other elsewhere. I periodically switch them (probably about once a month or so)... so I always have a fairly recent backup, and an offsite backup that's probably good enough.
For a long time this was good enough, and relatively simple, as the total of my irreplaceable stuff easily fit on a drive (most of my replaceable stuff is rips of DVDs I own, so they could be re-ripped, but that would be a pain). As my collection of "replaceable but would be a major drag" stuff increased, however, I came around to wanting a backup of everything. I achieve this with an actual second server with the same storage capacity, which is synced up twice a month.
It's not as crazy as it sounds! Seriously!
Because it's not actually used for serving files, only for holding a backup, performance doesn't really matter. Reliability doesn't really matter either: as long as it doesn't crash while recovering from a crash of my main file server, there is no problem. As a result, I was able to cheap out on the hardware for my backup server (those new green drives are great for this), and it ended up not costing as much as one would think (especially contrasted with the amount I spent on my actual file server). And it's awesome knowing you have a full backup of everything; it takes the edge off when doing, say, software updates on the main file server.
Re:Multiple answers (Score:2)
I have set up an offline automated backup with a couple of friends. Here is how it works:
First off, locally I have a server that backs up critical stuff from various machines on my network onto a USB drive (using rsync). This includes an SVN repo, docs such as my resume, pictures, MP3s and a snapshot of my web server's HDD including a MySQL dump. Videos are out for obvious reasons: I don't feel like rsyncing 100TB over my DSL line.
I have two friends who each have a Linux machine on which I have an account. I have set up an rsync job in the crontab of each of these machines to pull my stuff off my USB drive and copy it to their machine. I have a dedicated 1TB partition on each.
Now, are they just being nice to me? Well, not really, since they each have an account on my server and each a dedicated 1TB filesystem.
That way, all three of us maintain our own server (always up, online 24/7 save for the occasional DSL or power failure) and we all have two offsite backups. Maintaining the server isn't an issue since we each maintained one before this setup, so there is no additional work involved. Just a $60 2TB HDD on all three ends.
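The cron side of an arrangement like this might look something like the following sketch; the hostnames, paths and schedule here are all assumptions for illustration, not the poster's actual setup:

```shell
# Hypothetical crontab entries for the three-way backup exchange.
#
# On my server: stage the critical stuff onto the USB drive at 02:00
#   0 2 * * * rsync -a --delete /srv/critical/ /mnt/usb/backup/
#
# In each friend's crontab: pull the staged copy over SSH at 04:00
# (assumes passwordless SSH keys are already set up between the machines)
#   0 4 * * * rsync -a -e ssh me@myserver.example.org:/mnt/usb/backup/ /backup/me/
```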
Re:Multiple answers (Score:2)
Here's mine, in bash. Use it by creating an rsync include/exclude file named fqdn.rsync, then run rsync_backup.sh fqdn.rsync. GPLv2.
#!/bin/bash
PROG="$( basename "$0" )"
VER=20101022001
run_backup() {
HOST="$1"
EXCLUDE_FILE="$2"
BACKUP_DIR="$3"
DATE="$( date +%Y%m%d_%H%M%S )"
rm -rf -- "$BACKUP_DIR/$HOST/tmp/"
mkdir -p -- "$BACKUP_DIR/$HOST/tmp/" &&
rsync -avhSPW --stats -e ssh --delete --delete-excluded -b \
--compress-level=9 \
--link-dest="$BACKUP_DIR/$HOST/cur/" \
--exclude-from="$EXCLUDE_FILE" "$HOST:/" "$BACKUP_DIR/$HOST/tmp/" 2>&1 | tee "$BACKUP_DIR/$HOST/$DATE.log" &&
echo "Committing new backup..." &&
mv -v -- "$BACKUP_DIR/$HOST/tmp/" "$BACKUP_DIR/$HOST/$DATE/" &&
echo "Setting backup as current..." &&
ln -sfnv -- "$DATE" "$BACKUP_DIR/$HOST/cur" &&
return 0
echo "Error encountered!"
mv -v "$BACKUP_DIR/$HOST/$DATE.log" "$BACKUP_DIR/$HOST/$DATE.err"
return 1
}
if [ "$1" = "-chkupd" ]; then
chkupd.sh $0
exit $?
fi
while [ -n "$1" ]; do
FILE="$1"
echo "Reading config file '$FILE'"
echo "$FILE" | grep '[.]rsync$' > /dev/null 2>&1
if [ $? = 0 ]; then
HOST="$( basename "$FILE" )"
HOST="${HOST/%.rsync/}"
if [ "${FILE:0:1}" != "/" ]; then
FILE="$( pwd )/$FILE"
fi
BACKUP_DIR="$( dirname "$FILE" )"
echo "Backing up from host '$HOST' to '$BACKUP_DIR/$HOST/' ($FILE)"
chkhost.sh "$HOST" 22
if [ $? -eq 0 ]; then
run_backup "$HOST" "$FILE" "$BACKUP_DIR"
fi
echo
fi
shift
done
Re:Multiple answers (Score:2)
I used much the same until I discovered BackupPC. It uses rsync and hard links, and pools and compresses identical files.
It's pretty easy and flexible.
Re:Multiple answers (Score:2)
This is a clever use of that BIOS feature!
Re:Multiple answers (Score:2)
rsync and at least 10 iterations retained (up to 999 for some directory trees) on RAID 10 arrays - with offline backups (snapshots) on older drives rotated out of service from time to time.
With terabyte drives at $50 raw, it simply does not make sense not to, and one old P4 system can do it all in about an hour each evening, at 4AM or so.
Re:Multiple answers (Score:2)
That remote backup trick is quite clever. I do the same rsync setup as you (using rsnapshot), but that's mostly to protect me from myself, since the most likely need for a backup is me stupidly overwriting something. I figure most anything that takes that out will probably kill both machines, so for my offline backup I make a copy to an external hard drive and put it in the fire safe every 6 months (when I change my clocks, and check my smoke alarms). One of these days I'll have to scrounge up a spare machine and wire the garage...
Re:Multiple answers (Score:3)
Multiple answers?
You mean you usually only vote once on a /. poll?
Re:Multiple answers (Score:4, Funny)
Instant? (Score:2)
Re:Instant? (Score:2)
encfs might be a way to do it. The filenames and contents are encrypted, but each file is still independent and file sizes are unchanged, so Dropbox has enough transparency to update the files efficiently. Some would argue that this divulges too much information (someone now knows how many files you have, and how big each one is), but it is a compromise between security and giving Dropbox enough information to sync efficiently.
Of course, I've never tried that. I have GPG-encrypted some files and put them on Dropbox. Before anyone points out that Dropbox encrypts content for you on the backend: I'd rather have it encrypted so that none of Dropbox's software ever has access to the plaintext.
Backup? (Score:2)
Re:Backup? (Score:2)
While I realize that this doesn't technically count as a backup
I think even non-technically this does not count as a backup, as the data is only stored once!
Re:Backup? (Score:2)
Indeed. What I've found works well is using Crashplan [crashplan.com] to back my data up to an eSATA disk on my desk and make an identical copy online. It takes quite a bit of time to do that, but it means I've got a backup locally, and if something should happen to that, I've got one online which can either be shipped to me or simply downloaded.
It's not perfect, but the number of important files likely to be lost is basically zero; it would take something unimaginably significant for me to lose those files.
Re:Backup? (Score:2)
I use older external hard drives as a backup. So I've got my current 1TB external data drive, and use my older 500GB to back up the important stuff off that. Once I get around to getting a bigger drive, the 1TB becomes the backup drive, and the 500GB goes idle.
Re:Backup? (Score:2)
I do the reverse. My newest, largest drive is the backup. Next biggest is /media which has my music and video, then the smallest is /home. As I get new larger-capacity drives, I rotate them in as the new backup and the old backup becomes /media, old /media becomes /home.
That way I can back up all of my data plus system stuff (/usr/local and /etc).
I buy the new drives in pairs and use one as a weekly off-line.
Also, rsync FTW!
Re:Backup? (Score:2)
If everything important is on only ONE drive, you will invariably lose everything one day. The fact that the drive is external is actually an aggravating factor, because it can be dropped.
The principle of a backup is to have stuff stored in at least two different places. This accounts for hardware failures and user mistakes. To be safe from bigger disasters (electrical surge, fire, robbery) you need an off-site backup. This is another copy of your data stored somewhere else (like at your parents' home, for example, assuming you don't live in their basement).
Re:Backup? (Score:2)
My ISP offers a 500GB external hard drive for ~$10/month, which includes full continuous backup to their servers. Anything you put on that disk can be mirrored off-site automatically and can be accessed from anywhere through a secure website if you enable it.
I got it and haven't had to worry about backups since; all my stuff is backed up on the external drive and on my ISP's servers. That's at least three levels of backup if you include the nightly tape backups at their datacenter. Pretty sweet service for what you pay.
Re:Backup? (Score:2)
Having your only copy on an external drive, no matter what it is, is silly. Today's drive prices are so low that unless your data has zero value, you should have a second copy on a second drive.
A local company near here is selling USB-to-SATA external bays for under $10 each, and a 1TB drive (7200RPM Seagate) is $49.95 at the local computer store.
bottom line - today it is only sloth and/or ignorance that prevents anyone/everyone from having a backup solution that is offline and portable.
Re:Backup? (Score:2)
bottom line - today it is only sloth and/or ignorance that prevents anyone/everyone from having a backup solution that is offline and portable.
Or stupidity. Some people blindly refuse to take backups no matter what. I think these are the same kind of people who take pride in their ignorance.
I'm not referring to the poster above BTW.
Re:Backup? (Score:2)
Today, though, everything that I care about is on a 1TB external hard drive.
I had a 1TB external USB drive, the disk broke.
I also had a 1.5TB external USB drive, that disk broke as well.
And I had a 60GB internal SSD. Guess what? It failed.
I also have a box of 40 or so failed disks from various workstations and servers at work.
You can't trust a single disk with anything you want to keep. Sooner or later they fail.
Re:Backup? (Score:3)
The moral of the story is: if you want to keep your data safe, for God's sake don't give it to 1s44c.
Re:Backup? (Score:2)
Daily (Score:5, Informative)
Re:Daily (Score:2)
Windows' backup abilities vary; Windows Server 2008 and Windows Server 2008 R2 have decent backup utilities. Windows 7 can back up both images and files in the top-tier editions; the lower tiers only save files.
On Linux, you can use one of many solutions.
Because Windows backup programs vary, I highly recommend a third-party utility. Acronis is decent; however, I always push Retrospect because it has the best functionality in a backup program this side of NetBackup, TSM, or Networker. The only downside is that its backup format and transport mechanism are proprietary. The nice thing about Retrospect is that bare metal recoveries are easy: pop in the Windows PE-based recovery disk, partition, enter passwords (if needed, but I highly recommend encrypting backups), restore.
Re:Daily (Score:2)
Re:Daily (Score:2)
They are decent solutions, but they are not magical. For most home users, having an external drive with Time Machine coupled with a cloud backup solution for documents [1] is good enough.
[1]: I would recommend using the encryption feature the cloud provider has. For example, I use a keyfile with Mozy, and the keyfile is stored encrypted on a private Linux VPN server. This way, if I have a complete disaster, I fetch the Mozy keyfile, decrypt it with gpg, then use it to decrypt all the rest of my stuff. Since nothing resides unencrypted remotely, (in theory) someone compromising either my VPN server or Mozy would never have access to plaintext.
Re:Daily (Score:2)
The cloud is a good place to put stuff you want Big Brother to dig through, or to lose when the cloud owner goes tits-up.
If you really want to store your stuff "in the cloud", then encrypt it into a RAR archive, give it some sexy name and post it to an XXX newsgroup. It will cycle for a while at least; re-post weekly. No storage charges ;)
Re:Daily (Score:2)
Have you actually used Symantec NetBackup or EMC NetWorker?
I have and they both caused no end of pain. They are old technology at a premium price.
Rsync and a load of cheap disk are better in just about every respect. BackupPC is better at everything, but you will end up with very many instances.
Re:Daily (Score:2)
Re:Daily (Score:2)
I currently have a Mac Mini Server with a 2TB external drive. Time Machine backups of the Mac Mini Server go on the drive. Additionally, my wife and I have user accounts on the server, so our MacBook Air/Pro laptops are also "Time Machined" to the external drive through the server. The only worries I have are: 1) the external drive failing simultaneously with one of the Time Machined Macs (not too likely?), and 2) the house catching fire (also not very likely?).
I simply don't have an off-site backup. Is that foolish? I've thought about putting an identical Mac Mini Server / 2TB setup off-site about 10 miles away at my in-laws' house, but is that really necessary? Does Time Machine work over the internet?
Re:Daily (Score:2)
Time Machine probably would not work over the internet (I guess it might if you set up a VPN; it just needs to be able to see the other system via broadcast over the network).
For what you have, though, I think I'd instead set up a simple rsync script between your home computer and the remote one to back up your TM drive. Or possibly SuperDuper or Carbon Copy Cloner would work with a drive mounted over the internet or pointed at a remote IP; both have free trials.
What I do is somewhat more manual. I have a TM setup exactly like yours, then I also have backups of that drive in the garage (less likely to be destroyed by a fire and/or theft) which I update once a month, and then an offsite copy I swap with the garage drive now and then, every few months.
I have been meaning to switch to a service like Crashplan, though; I'm just not sure of the best choice yet. There are many opinions, which is stymying forward motion on the plan. For sure I need one that can start by sending in a drive, though...
Also, most code is on somewhere like GitHub, so at least that is pretty safe.
In your case, if you have a fire, make sure to throw the laptops out the window as you flee. The laptop will probably be damaged, but the HD should be fine. Between fire and theft, though, I would for sure set up something that's at least outside your home.
Re:Daily (Score:2)
I have a script that nightly mirrors my entire drive to a bootable copy, and then I ALSO have Mozy for all my files, because my mirror won't do me much good in case of fire, flood, or a break-in where they steal the computer and backup drive.
And doing those two things is not at all difficult or expensive, and combined I believe they provide me with a high level of protection.
Hard drives are so cheap these days I may also tack on a huge Time Machine volume to add a versioned backup solution.
Daily at Home, Unevenly at work (Score:2)
It's funny. I have my home machine set up to back up nightly (rotated into weekly and monthly snapshots) using rsnapshot. It operates on a similar principle to Time Machine: it uses hardlinks to perform incremental backups while presenting full, live directory structures. But unlike OS X, Linux doesn't support directory hardlinks, so it is somewhat less efficient. I just spent a few hours researching my options and buying a big drive on Newegg, then a few more hours learning how rsnapshot worked and setting up the cron jobs the way I wanted.
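For anyone wanting to replicate this, a minimal rsnapshot setup along these lines might look like the following; the intervals, paths and times are illustrative guesses, not the poster's actual config:

```shell
# /etc/rsnapshot.conf fragment -- note rsnapshot requires TABs, not
# spaces, between the fields:
#   snapshot_root   /backup/snapshots/
#   retain          daily   7
#   retain          weekly  4
#   retain          monthly 6
#   backup          /home/  localhost/
#
# cron jobs: larger intervals run first, since only the smallest one
# ("daily") actually does the rsync; the others just rotate snapshots.
#   30 2 1 * *  /usr/bin/rsnapshot monthly
#   0  3 * * 0  /usr/bin/rsnapshot weekly
#   30 3 * * *  /usr/bin/rsnapshot daily
```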
But I don't systematically back up my computer at work. All the code I write gets pushed to SVN, which is backed up, and any other files I generate that are used by others are stored on a shared network drive, which is also backed up. So the most important things are taken care of; however, there are still a fair number of files on my computer's hard drive that it would suck to lose.
IT has a network backup system I could use, but I have to provide a charge number, and the task I'm charging to changes every week (and old ones are promptly closed), so that isn't practical. I have been meaning to get an external drive to do it myself, but I'd have to go through corporate procurement to get the drive, figure out what software I am allowed to use, and figure out how to manage encrypting an external drive (required). Oh, and I would have to convince someone to let me charge their project for all this (to which they respond: just keep our files on the network drive). None of these are insurmountable tasks, but they are enough to make me keep pushing it onto the back burner and move on to other tasks.
Time Machine - LOL (Score:2)
Time Machine cost me around 3 months of work about 2 years ago.
It seemed to decide that going back 3 months and forgetting everything that had happened was a great idea.
No way I'd even use Time Machine again, let alone trust it for backups.
Multi-layered approach (Score:2)
Re:Multi-layered approach (Score:2)
I'm waiting for cheap off-planet backup.
Re:Multi-layered approach (Score:2)
Redundancy means your MTTR is almost zero (assuming a hot-pluggable drive), but it is not a backup: a RAID array will happily propagate malware, delete requests, or corrupted files across the drive array.
Here is one recommended approach:
First, your RAID array, mirrored volumes for your boot partition, RAID 5 or 6 for everything else. This protects you against hard disk failure.
Second, an external drive or array which data gets copied to every so often. This protects you against controller failure, deleted files, corrupted directories, or an accidental erase. Great for bare metal restores. Not good in case of compromise or infection because the malware can zap the external drive too.
Third, a dedicated backup server (secured and locked down on the network) which initiates the backups. This protects you against malware (viruses can end up stored inside a backup, but they can't actively attack already-existing snapshots). One downside is that it is way too slow for bare metal restores in general unless you have a SAN fabric. A second downside is that this box needs to be both physically secured (BitLocker, TrueCrypt, LUKS, etc.) and secured against remote attacks.
Fourth, some sort of offsite backup rotation with media that can be set read-only. This protects you against loss of everything should someone be lighting farts in bed. This also protects against malware. The ideal are tapes, but a decent drive with modern day capacity will set you back almost five digits, not to mention having a dedicated machine to drive its I/O needs.
Fifth, offsite backups for external documents. In the past month, every single offsite company has started charging by disk space; for what the price increases on the amount of data I keep in use, I could buy a 2-3TB external drive each month. So use the "cloud" storage providers for vital documents, and definitely not for a bare metal restore of the MP3 stash machine.
Sixth, offsite services do not provide true archiving, so burn documents to optical media. (I like Nero and its SecurDisc, as it offers easy-to-use ECC writing; dvdisaster is a free/open solution with similar functionality.)
Seventh, consider hard copies of the really valuable stuff (if possible). Yes, paper costs money and takes up a lot of space, but for the stuff that is really, really valuable, it is harder for random bit rot to destroy a paper document stored properly in climate-controlled conditions than it is for a DVD to oxidize to the point of unreadability.
Of course, don't forget encryption of backups (and key management.) If someone compromises a backup server, they essentially have compromised every single box the server backs up.
I can't be the only one... (Score:3)
Who needs to be able to check more than one of these options? I have an external HD that backs up every night, then two others that are done on a monthly basis, one of which gets swapped with a drive in a safe deposit box, as well as one that also goes into the safe deposit box on a more or less yearly basis.
Re:I can't be the only one... (Score:2)
*raises hand*
Although, I consider the one in the fire safe to be the most important, and I can't even answer for that since it's every 6 months.
Missing option (Score:2)
All my files are under version control, you insensitive clod!
For a while now, I've been using a private Git repository on a VPS (~$20/month). The only things I could possibly lose are at most a few days old: the latest revisions. Worst possible case: I have to reinstall the OS and a few free programs and lose 2-3 days of work. Since it's a private repo I can be sloppy and commit half-working branches, which I do when a patch is growing unwieldy.
Re:Missing option (Score:2)
Do you also keep your digital pictures, MP3s and various non-code digital documents in your Git repo?
Re:Missing option (Score:2)
Yet another missing option...
"When it's too late."
Automatically (Score:2)
Missing Option (Score:5, Insightful)
Ineffectively
Re:Missing Option (Score:2)
Missing Option (Score:2)
Over the years (usually when something breaks or when I build a new machine) I'd go get what was then considered a large drive and copy all my stuff to it. Depending on circumstances that drive would then become my main drive and the others would go offline or get mounted in a removable tray. The last drive I bought was 1TB, and I'm looking at multiple drive external boxes.
The only problem is now I have trees of stuff like
Re:Missing Option (Score:2)
This makes me wonder why there are no filesystems that dedupe on a block basis (unless one runs Solaris, or has an EMC SAN at home), or even by file. For stuff like that (which I tend to accumulate as well), it would be nice to be able to shovel the junk onto the drive and, if there's more than one copy of a file, who cares.
Re:Missing Option (Score:2)
Well, you could use ZFS, which gets you part of the way there with copy-on-write (and a variety of other awesomeness). For one good example, look at Nexenta, which is the Ubuntu userspace on a Solaris kernel with ZFS support and some ZFS-enhanced utils. To get all the way there, you almost need a periodic cleanup process, as just a slight offset at the beginning of a file defeats block-based detection of data that is otherwise duplicated but not aligned to blocks the same way.
Or, not at the block level but relevant to the discussion of backups, you could use BackupPC, which does file-based deduplication: it basically stores each file under a name based on its checksum, and the actual file becomes a hardlink to that checksum entry, with the data stored compressed. So you need a good underlying filesystem that copes well with large numbers of hardlinks, like XFS. :)
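The pool-by-checksum idea can be sketched in a few lines of shell. This shows the concept only (no compression, and not BackupPC's actual on-disk layout); all names here are made up:

```shell
# Store each unique file content once in a pool directory named by its
# hash, and make every backup's copy a hardlink into the pool.
pool_store() {
    src=$1; pool=$2; dest=$3
    sum=$(sha256sum "$src" | cut -d' ' -f1)        # name by content hash
    mkdir -p "$pool" "$(dirname "$dest")"
    [ -e "$pool/$sum" ] || cp "$src" "$pool/$sum"  # first copy fills the pool
    ln "$pool/$sum" "$dest"                        # duplicates cost one link
}
```

Two backups of identical files end up sharing a single inode, which is why the underlying filesystem's hardlink handling matters.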
Personally, I use BackupPC on Ubuntu at home. It backs up a variety of Linux boxes, a NeXT system (which honestly doesn't change enough to really justify backups; it's not like I'm upgrading NCSA Mosaic that often), Windows boxes, and laptops of both varieties that are intermittently connected to the network. It's very cool, and very recommended by me. ;)
Re:Missing Option (Score:2)
...and I skipped right over the note about Solaris. :)
Missing Option (Score:5, Funny)
I don't need to, it's securely on the Cloud!
Practice what you Preach (Score:2)
Re:Practice what you Preach (Score:2)
Depending on your OS, I'd go with Backblaze or CrashPlan; both are quite affordable and do pretty much all the work for you. Apart from a periodic verification that it's working, there's nothing to do. I give CrashPlan the edge just because it allows you to back up locally as well as online.
Nothing is that important... (Score:2)
So far, I would say I don't have a single piece of data important enough to need a backup.
Re:Nothing is that important... (Score:2)
There is nothing you cannot lose, I agree. However, seeing 20 years of family pictures vanish in an electromagnetic surge is a bit of a pain.
Music? I can live without.
Videos? Even more so.
The poems I wrote 10 years ago for my mom's birthday? Well, I'd like to keep those.
The novel I am currently writing? Ha - that one's a bummer. Lots of hard work.
The code I wrote last week? - Again, a bit of hard work.
All my digital pictures? With my kids, sister, family, vacations, etc. I'll back them up. Anytime.
Tape Changer (Score:4, Interesting)
I picked up a used 6 x 12GB tape changer (HP C1557A) when one of the local "dot coms" became a "dot bomb." I think the tape changer cost me all of about $100. I bought enough tapes to set up a six week tape cycle plus a few spares. I think the tapes cost more than the tape changer. Amanda takes care of the rest by backing up the server plus my desktop and my wife's system five days a week (not quite daily). Since 12GB isn't a whole lot of space any more, I had to set up exclusions for things like copies of DVDs, downloaded ISOs, etc.
Oh yeah, I also run RAID 1 on the server but RAID doesn't protect you from intentionally deleting the wrong file(s).
Cheers,
Dave
Re:Tape Changer (Score:2)
The best of all worlds would be a dedicated machine with a RAID array that all your machines are dumped to in toto. Then the backup server would just copy locally, from its partitions to the tape drives, whatever data should go onto tape.
This provides not just tape protection of critical data, but also protects the ISOs and such, with a pseudo D2D2T setup.
Re:Tape Changer (Score:2)
Re:Tape Changer (Score:2)
But if I back up to an external hard drive, I have to script the backup on each client plus come up with a scheme for handling incrementals vs. full backups. Amanda does all of that for me. Plus, I've been using the tape drive for about 8 years. When I bought it, 100GB was a big hard drive. So far the only maintenance has been to vacuum the outside every few weeks.
Rationale for backups: several times my wife has accidentally deleted things that she then wanted back (I have too, but I would just kick myself if I couldn't get something back; she really wants stuff back).
Cheers,
Dave
Re:Tape Changer (Score:2)
Personally, I don't get backups. Never do them, never needed them.
Backups are there to save your butt when you get a disk crash. Yes, that's a rare event but it does happen sometimes and it's catastrophic when it does. (Recovering a few hours of lost data is Not (usually) A Big Deal. Recovering a few years though, that's really worth it.)
Whenever I remember (Score:5, Funny)
hourly (Score:2)
Time Machine :-)
Automatic incremental backup every hour. I've not yet had a total system crash, but I've used it quite a few times to restore accidentally deleted files, or even as a revision control system to revert a file to a previous state.
Backups should be fully automated and trivially easy. I never did backups for my personal stuff before due to the hassle factor.
I had a hard drive die (Score:3)
After putting in a new one and installing OS X, the installer asked if I would like to restore my Time Machine backup. With just a click it restored everything (user accounts, data and applications) exactly as it was before.
My primary drive dying literally only cost me about four hours of downtime: 20 minutes to replace the drive (iMac, many Torx screws, first time), about an hour to install the OS, and the rest waiting for a few hundred gigabytes to travel over USB from the backup drive.
+1 for time machine - not daily, hourly (Score:3)
Time Machine is great. I too had trouble doing backups before I got a MacBook Pro, and I have lost things to hard drive failure (nothing extremely important, but things I wish I still had); even that wasn't enough to make me particularly diligent. I'd try one solution or another, and never end up doing more than one backup because of the hassle.
I'm not sure how well it handles backing up external hard drives, though. I store a lot of data (mainly photos; I'm a prolific amateur photographer) on external drives and back those up manually. Time Machine excludes external drives by default, but I think it will back them up if you tell it to. Just not sure how well that works.
In other words, in typical Apple fashion, there's one "right" way to use Time Machine, and if you fall outside that use pattern it can be difficult. However, to be fair the "right" way works quite well; I just supplement it a bit due to having a lot of stuff that has to go to external drives.
I initially store everything (including photos) on the MacBook Pro hard drive, which is then automatically backed up. This is convenient while traveling since I don't have to plug in an external drive to transfer photos. I use one of those small USB-powered hard drives as the TM drive, so I bring it and do backups while traveling; I just don't have to plug it in every time. I have a second identical one at home, and do a backup to it before traveling in case my bag with computer and external drive is lost or stolen, a distinct possibility. When I fill up the MBP drive (which can happen quickly with RAW photos and HD video), I eventually do an rsync merge of everything (which doesn't delete anything on the backup drive that's missing from the original) to my larger external HDD backup drives at home (two separate drives with identical contents). Time Machine keeps everything from the original drive until the space is needed by new stuff.
In all I feel I'm quite well covered (though my only off-site backup is a carefully curated selection of jpgs on Flickr), but it did require more than Time Machine alone for me to be fully satisfied.
Also, I have a laptop running Linux acting as a server for terabytes of music and other media that's still in the state it was before TM, meaning quite limited backups.
Also, I am quite aware that solutions similar to TM are possible on Linux and even on Windows, but Apple got it right and made it incredibly painless for the typical user, especially since it's built into the OS and requires no installation or even configuration at all.
Continuously. (Score:2)
Or as continuously as Time Machine runs the incrementals.
Hourly (Score:2)
Incremental backups every hour for a week, every week for 6 months, and every month for a year and a half.
Love the Time Machine. I had much worse backup habits before I hooked up an eSATA drive and it "just works". Sure, I'm perfectly capable of scripting incremental backups using rsync or something, but Time Machine actually does it about as elegantly as I would anyway, because it hard-links identical files so each backup appears as a full root directory. And the interface is top-notch.
daily ... (Score:4, Informative)
My backup runs every night, using cron, rsync and some shell scripts, based on http://www.mikerubel.org/computers/rsync_snapshots/ [mikerubel.org]. It writes all data on my server (mails, home directories, SVN repository, database, ...) to an external disk (eSATA). Hard links across 10 file trees allow me to restore deleted or modified files from up to 10 days back. Average backup time is less than 1 hour after the initial backup (which ran for several hours).
Workstations and laptops contain only the operating systems (which can be re-installed from CD/DVD/PXE), files copied from the server (local copies of mails, SVN sandboxes, ...), and some temporary files not worth backing up. So I don't need to back up those machines.
Tux2000
Wife's iPhone (Score:2)
Re:Wife's iPhone (Score:2)
That's a shame. Backing up and restoring the iPhone is so easy it's a joke. With the exception of jailbroken apps, the entire phone can be brought back from a clean slate in about 20 minutes - to the point where there's no reason not to wipe the phone and 'restore' to the new software instead of upgrading it, which seems to improve performance. All it takes is plugging the phone into the computer for a few minutes a week...
No, I'm not a fanboi, but the iPhone is the only smartphone I've had experience with (other than old Windows Mobile... ugh). I'm sure Android phones are the same way.
Only when I get a new computer (Score:2)
Only when I get a new computer and I need to transfer everything over.
I like to use things as efficiently as possible, and I just feel like backing up is a waste of space. So I tend to back stuff up and delete the original, so it ends up being one copy anyway.
I mean what are the chances that something would brea
Missing Option: Constantly (Journaled) (Score:2)
I keep my personal files in an encrypted archive on Dropbox, which then syncs it to several other machines. It updates on every change, and Dropbox journals the file.
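One way to do the encryption half of this (a sketch, not any particular tool's mechanism - the passphrase-file handling and paths are just for illustration) is to tar the files and pipe them through openssl before they ever touch the synced folder:

```shell
# Encrypt a directory into a single archive the sync client can see,
# and decrypt it again elsewhere. AES-256-CBC with PBKDF2 key
# derivation; the passphrase lives in a file outside the synced folder.
encrypt_for_sync() {
    srcdir=$1 out=$2 passfile=$3
    tar -cf - -C "$srcdir" . |
        openssl enc -aes-256-cbc -pbkdf2 -salt -pass "file:$passfile" -out "$out"
}

decrypt_archive() {
    archive=$1 destdir=$2 passfile=$3
    mkdir -p "$destdir"
    openssl enc -d -aes-256-cbc -pbkdf2 -pass "file:$passfile" -in "$archive" |
        tar -xf - -C "$destdir"
}
# e.g. encrypt_for_sync ~/private ~/Dropbox/private.enc ~/.backup-pass
```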
If you want 2GB free space on Dropbox plus an extra 250 MB for using my referral link, here it is: http://db.tt/cKHP3d5 [db.tt]
I don't have any other relationship to Dropbox, I'm just a happy customer.
Multiple methods are helpful (Score:2)
I have a 'personal documents' folder which fits within my Dropbox limit, so those files are in the cloud and on multiple computers. The only place I work with files outside that folder is my main desktop at home, which has an external drive holding a Time Machine backup (one update/day, so as not to age the drive as fast) and a nightly SuperDuper backup (SuperDuper is utter win). Finally, I have a 4TB RAID-6 NAS at home where my non-critical files (ripped DVDs, ripped music) are stored, as well as an automatic weekly copy of the SuperDuper backup from my desktop.
Weekly automatic backups (Score:2)
Weekly and automatically. The latter is by far the most important part. Designing a system without taking human nature into account guarantees eventual failure.
There's an app for that (Score:2)
Since "whenever I remember" is the most popular option, would anyone like a 99-cent app for their smartphone that reminds them to do backups?
Re:There's an app for that (Score:2)
Wouldn't the built-in alarm do the job just fine? Still, automated systems are far preferable - though checking from time to time to make sure they still work is wise.
crashplan (Score:2)
crashplan.com has a pretty nice piece of software (Win, Mac, Linux) that does a fine job of regular backups to locally available storage or (even better) remote storage on another machine running the software. They make their money trying to sell you space on their remote server farm, but the free features work great for getting data from the home to the office and back again, or backing up your parents' machines onto your machine and your machine onto theirs. I think the free version is limited to a daily backup, but for a few bucks you can get it to do more frequent incremental runs. Mostly we use Time Machine on Apple hardware, but for non-Apple stuff and for remote storage, CrashPlan is pretty nifty.
Continuously with IdleBackup (Score:2)
I found IdleBackup [idlebackup.nl] a while ago and fell in love with it. It's no longer being actively developed, but it works ok on Win7 x64. I tell it which directories to back up, and it saves copies of any new or updated files to my NAS automatically. My next step is to set up a remote solution with my parents' and/or friends' houses to get offsite backup as well, but this at least protects my important files from anything that could happen to them on my PC.
Twice a day, actually (Score:2)
How Linus does Backups (Score:2)
Every second (Score:2)
My main data HD is mirrored, so I back things up the moment I change something.
Re:Every second (Score:3)
RAID is not a backup.
instant (Score:2)
Network block devices, RAID-1 with the "write-mostly" option, an stunnel pipe, and a loopback-mounted single file on a remote server. :)
And daily backups to that RAID-1 device using http://backuppc.sourceforge.net/ [sourceforge.net].
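For the curious, the block-device half of that setup looks roughly like the following. Hostnames, ports, device names and sizes are all made up, the stunnel wrapping is omitted, and this is an outline rather than a drop-in recipe:

```shell
# Remote server: export a single sparse file as a network block device.
# /srv/backing.img, 500G and port 10809 are illustrative.
dd if=/dev/zero of=/srv/backing.img bs=1 count=0 seek=500G
nbd-server 10809 /srv/backing.img

# Local machine: attach the remote file as /dev/nbd0, then mirror the
# real disk with it. --write-mostly makes md serve reads from the fast
# local half and only send writes over the (slow) network leg.
nbd-client remote.example.org 10809 /dev/nbd0
mdadm --create /dev/md0 --level=1 --raid-devices=2 \
      /dev/sda1 --write-mostly /dev/nbd0
```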
Missing option: I let my ISP take care of backups (Score:2)
http://it.slashdot.org/story/11/04/01/0354232/Zodiac-Island-Makers-Say-ISP-Worker-Wiped-an-Entire-Season [slashdot.org]
Re:Missing option: I let my ISP take care of backu (Score:2)
http://it.slashdot.org/story/11/04/01/0354232/Zodiac-Island-Makers-Say-ISP-Worker-Wiped-an-Entire-Season [slashdot.org]
That appears to be a part of the April 1st lying tradition.
Missing option (Score:2)
Meh, missing option : Continuously.
Everything is mirrored. The mirror has a snapshot every hour on the hour. The snapshots are ALL backed up to external drives.
gus
I am but a simple Windows user. (Score:2)
Since I reformat or re-install relatively often, I don't have any fancy scripted backup software.
About once a month, I fire up ZTree (a modern XTree), expand the branches of about 5 important folders, tag all the files and copy them to an eSATA drive. It's about 150GB of stuff which I class as important.
When it's finished, I then copy that to another external drive, so I basically have 2 backups. 1TB / 2TB USB or eSATA drives are very cheap; it's worth it.
I do actually run an Acronis True Image backup of just the system partition (100GB) as well; that runs weekly and I copy it on too. I'm backing up some of the same data twice, but not all of it, and it lets me re-clone my machine quickly if something goes wrong.
I would gladly use "the cloud" as a backup as well, but most of the online services are still a little pricey for the amount of data I need, and in Australia most uploads count towards your monthly quota. I only get 200GB of combined downloads/uploads a month and don't want to pay more for my internet access just to allow for online backups. Plus we're mostly on ADSL2+ in Australia, so 8 to 20Mbit down but only 1Mbit up - 100KB/s isn't good for uploading 100GB a month. The data has damn near changed entirely by the end of that month, incremental backups or not.
It all depends... (Score:2)
Do things like Dropbox count as backup? I have automatic processes like that running, but a real, thorough backup with offsite copies is a lot rarer...
hourly (Score:2)
Hourly (Score:2)
Macs back up hourly, automatically, if you just give them access to a backup disk.
I also back up daily to an offsite service.
Only daily?!? (Score:2)
Cheap HDDs make it easy (Score:2)
Three people in the house and three laptops. One macbook and two ubuntu machines. My server runs netbsd and has a disk big enough to store our library and all three backups. My backup script runs at 0800 UT every day and usually completes within five minutes or so. I have a separate backup script which uses rsync to copy the entire /home directory tree from the server on to a portable hard disk attached to one of the laptops. The portable disk is usually stored off site.
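For reference, a daily 08:00 run like that is a one-line crontab entry; the script path and log file here are hypothetical:

```shell
# minute hour day-of-month month day-of-week  command
0 8 * * *  /usr/local/sbin/backup-home.sh >> /var/log/backup-home.log 2>&1
```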
Time Machine + Drobo: perfect protection (Score:2)
Time Machine and a Drobo are the best combination. Why? Because together they protect against both data corruption and disk failure.
Just the other day my iPhoto library (765GB of pictures) got corrupted somehow. I rolled back with the Drobo, and now it's fine. If I had a normal (non-versioned) backup program, the backup would be just as corrupted as the iPhoto library and I'd be SOL. Thanks to Time Machine and 5TB worth of Drobo, I only had to deal with 12 hours of restoring instead of weeks' worth of library reconstruction (I keep backups of the individual pictures as well).
Thanks Apple and Drobo!
Re:Email... (Score:2)
I thought that would be the best way... until a glitch in a mail syncing program erased everything remotely.
If you need a place to put stuff, I'd probably use box.net or Dropbox, then store the data in an encrypted archive format that has some error correction ability (WinRAR and Stuffit's .sitx formats come to mind) with a very good passphrase (20+ characters, preferably 32+).
Re:Speaking of backups.... (Score:2)
I use http://www.livedrive.com/ [livedrive.com]. Unlimited backup storage for 3 PCs. You can browse the folders and open the backed-up files via the web or iPhone. So even without using their "briefcase" service, you can have every file and folder from your main PC available to you wherever you are.
You can even build a streaming playlist for the iPhone or laptop based on the music or movies you've backed up.
Re:Speaking of backups.... (Score:2)
So, if I make one of those "PCs" the backup server for my home network...
Re:Does RAID qualify here? (Score:2)
RAID isn't a backup solution, it's a High Availability solution.
Re:Backing up (Score:2)
Did they ever fix the stability problems? I've got a lot of data, and I found Carbonite crashed regularly - and when it wasn't crashing, it was very slow.