Since a few people showed interest, I'll spam the list with my solution for incremental backups of the Macs in my life. The major problem I had was that most of the Macs in my life are laptops. That means two primary things:

1) They go to sleep when I do. ;)
2) They roam off-network.

Since I already have a server network set up at home (secure-computing.net), I simply made use of my current backup server's hard disk space, and rsync.

The first step was to create an easy backup script the user could click to back up her files manually. *side note: when I say user, I mean my beloved fiancee ;)* This script contained a single line:

    rsync -av --delete ~/ --exclude="Caches" --exclude="*.cdr" \
        --exclude="*.iso" --exclude=".Trash" --exclude=".Spotlight-V100" \
        --exclude=".Trashes" backup-server:/usr/backups/`hostname`

All the user needed to do was click on the script and it would back up any changes to her files.

At this point, her hard drive crashed. Easy enough, we had backups. Well, the original backups from my testing. Six months old. Damn. So, we needed to automate. I tried iCal, but if the laptop was asleep for three days, it would try to back up three times, and we got an iCal alert for each day. My solution was to use the oft-unused (on Macs) cron daemon. I added the following entry to her crontab:

    00 * * * * /Users/<username>/backup.command

This, coupled with the backup.command file I'll paste below, results in her laptop backing itself up every hour, incrementally. This works great, as she's often (2-3 times a day, on average) using her laptop as the hour rolls over. Since she mostly surfs the web and checks email, the backup actually takes about 60 seconds to 2 minutes.

As I have found, this script works great on my laptop as well, even though my system roams with me to Starbucks, the office, and elsewhere. That said, my backups take about 1 to 5 minutes. The big thing when you're remote from the backup server is the --exclude="Caches" option; the caches, especially Safari's, can get rather large.

On my system, I've coupled GeekTool (http://projects.tynsoe.org/en/geektool/) to tail the log file from my backups, so I can keep track of them throughout the day. Without going further into GeekTool, here's the script I've got for that:

    <CONSOLE> echo "System Backup Log:"; tail -n 20 /Users/ecrist/.logs/`date "+%m"`/`date "+%d-%H"`.log ; echo "\nUpdated at `date`."

So, here's my minimal backup.command script:

    #!/bin/sh
    month=`date "+%m"`
    file=`date "+%d-%H"`

    # don't run if it's already running!
    if [ -e "/tmp/back.pid" ]
    then
      echo "Backups already running as PID `cat /tmp/back.pid`. Aborting..."
    else
      echo $$ > /tmp/back.pid
      # make sure this month's log directory exists
      mkdir -p /Users/ecrist/.logs/$month
      rsync -av --delete ~/ --exclude="Caches" --exclude="*.cdr" \
          --exclude="*.iso" --exclude=".Trash" --exclude=".Spotlight-V100" \
          --exclude=".Trashes" backup-server:/usr/backups/`hostname` \
          > /Users/ecrist/.logs/$month/$file.log 2>&1
      rm /tmp/back.pid
    fi
    echo "Finished at `date`" >> /Users/ecrist/.logs/$month/$file.log

One final note: my backup server has a user account for everyone who's going to be backing up their system, and each backup directory has 0700 permissions for security's sake. Since we're backing up laptops, mostly, the `hostname` part of the rsync command works fine.
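One thing the above glosses over: for the cron job to run unattended, rsync has to reach the backup server without prompting for a password. Here's a minimal sketch of the one-time setup, assuming rsync is riding over ssh and you're comfortable with a passphrase-less key; "jane" and "powerbook" below are just placeholder names for the user account and the laptop's hostname:

    ## On the laptop: create a passphrase-less key and append the
    ## public half to the account's authorized_keys on the server.
    ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
    ssh backup-server 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys' \
        < ~/.ssh/id_rsa.pub

    ## On the backup server: create the per-laptop directory and lock
    ## it down to the owner, per the 0700 note above.
    mkdir -p /usr/backups/powerbook
    chown jane /usr/backups/powerbook
    chmod 0700 /usr/backups/powerbook

With that in place, the first cron run should go through silently; you can sanity-check it by watching the log file for that hour.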
This could be adapted for multiuser systems by running the cron job as root on the Macs, or simply by creating a new rsync command such as:

    rsync -av --delete ~/ <excludes> backup-server:/usr/backups/`hostname`/`whoami`

Just make sure, in this case, that each user belongs to a group that has rwx access to the `hostname` directory (sketched in the p.s. below).

Glad I could contribute something! If this is confusing or you need help, please don't hesitate to ask!

-----
Eric F Crist
Secure Computing Networks
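p.s. For the multiuser variant, the group setup amounts to something like the following. This is only a sketch: "backups", "jane", and "powerbook" are placeholder names, and the pw(8) lines are FreeBSD-flavored, so substitute groupadd/usermod on Linux.

    # On the backup server, as root: make a shared group, add each
    # backup user to it, and give the group rwx on the machine's
    # directory (0770 instead of the single-user 0700).
    pw groupadd backups
    pw groupmod backups -m jane
    chgrp backups /usr/backups/powerbook
    chmod 0770 /usr/backups/powerbook

rsync will create each user's `whoami` subdirectory on the first run, since the parent directory is group-writable.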