On Mon, 2005-08-29 at 23:10 -0500, Rad Craig wrote:
> I am trying to set up an SFTP server to download copies of our website
> backups. We make a full backup once a week, just over 2GB right now.
>
> My webhost builds the backup archive and will send it to a remote FTP/
> SFTP server. I need to do this because when I make a backup, it
> nearly pushes me over our quota on the website and makes my mail
> server act up. I have to use SFTP because of the 2GB limit of FTP.
>
> I would also be open to using SSH and some form of secure copy. I
> know very little about UNIX, so I'm looking for some help. It
> doesn't have to be completely automated, but that would be very nice,
> to write a script. I have SSH access to my web host.

I'm not sure I understand your problem correctly, but you can set up a
cron job on your webhost to scp the backup file (I'm assuming it's just
a tar.gz file) to your OS X box:

    man crontab; crontab -e

Then use a variant of the command below to transfer the file to your
OS X box:

    scp backup.tar.gz user@osxbox:/backups

You'll have to keep ssh running on your OS X box, though. You can
enable it under System Preferences -> Sharing (Remote Login).

HTH

Regards,
Sharninder
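
P.S. If you want this fully automated, a crontab entry on the webhost
along the lines of the sketch below should do it. The schedule, paths,
and hostname here are placeholders for illustration, so adjust them to
match your setup. It also assumes you've set up SSH public-key
authentication (run ssh-keygen on the webhost, then append the public
key to ~/.ssh/authorized_keys on your OS X box) so scp can run from
cron without prompting for a password:

    # minute hour day-of-month month day-of-week  command
    # Copy the weekly backup every Monday at 03:00 (placeholder schedule and paths)
    0 3 * * 1  scp /home/user/backup.tar.gz user@osxbox:/backups/ >> /home/user/scp.log 2>&1

If a transfer ever fails, the scp.log file should tell you why.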