Hi Rad,

If you are nearly over your quota, you could use sftp the first time via the Terminal (do some tests first):

    sftp yourserverIP
    sftp> cd /backup/
    sftp> ls -l
    sftp> get yourfile.tgz
    sftp> quit

This works very well.

Afterwards you could use rsync to update only the files that have been modified, and even better, keep a copy of the file you got by sftp for each day (Monday, Tuesday...) and create a cron job for each day (do some tests first). First check that you can log in over ssh, then run rsync from a root shell on your Mac:

    ssh user@yourserverIP
    sudo -s
    rsync -e ssh -avz --stats --progress --delete user@yourserverIP:/Applications/webserver/ /Applications/webserver/

(The whole rsync command goes on one line.)

This way you can keep several backups and transfer only the files that have changed. "man rsync" in the Terminal will help you with the options.

FF

On 30/08/05 06:10, "Rad Craig" <rad at inductionconcepts.com> wrote:

> I am trying to set up an SFTP server to download copies of our website
> backups. We make a full backup once a week, just over 2GB right now.
>
> My webhost builds the backup archive and will send it to a remote FTP/
> SFTP server. I need to do this because when I make a backup, it
> nearly pushes me over our quota on the website and makes my mail
> server act up. I have to use SFTP because of the 2GB limit of FTP.
>
> I would also be open to using SSH and some form of secure copy. I
> know very little about UNIX, so I'm looking for some help. It
> doesn't have to be completely automated, but that would be very nice,
> to write a script. I have SSH access to my web host.
>
> So I'm looking for some help from the UNIX gurus out there to help me
> accomplish this. I at least need a way to have the download sent
> manually to an SFTP server running on my OSX box, or better yet, some
> type of automated script to do this once a week, to fetch a backup
> off of my webhost. I can probably talk to them about helping me with
> a script and cron job to build the backup archive once a week.
>
> I have read that there is a built-in SFTP server in OSX, but I have
> no idea how to configure it. I tried CrushFTP today and got it to
> connect and log in, but when the download tried to start, I got a "550
> Access not allowed" error. Then I remembered the 2GB limit for FTP,
> so I am guessing that was the cause, because everything else seemed to
> be set up correctly: it got through my firewall, logged in
> successfully, etc. So I have that part done right.
>
> Can someone help me please? Any help will be greatly appreciated.
>
>
> Rad...
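
P.S. A minimal sketch of what the daily cron entry could look like, based on the rsync command above. The user name, server address and paths are just the placeholders from the example, not real values, and for an unattended run you will also need passwordless ssh keys (see ssh-keygen and the authorized_keys file on the server). Test it by hand before putting it in cron:

    # edit root's crontab on the Mac
    sudo crontab -e

    # run the pull every day at 03:30 (minute hour day month weekday command),
    # all on one line; --progress/--stats are dropped since nobody watches cron output
    30 3 * * * rsync -e ssh -az --delete user@yourserverIP:/Applications/webserver/ /Applications/webserver/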