I am trying to set up an SFTP server so I can download copies of our website backups. We make a full backup once a week, currently just over 2 GB. My webhost builds the backup archive and can send it to a remote FTP/SFTP server. I need to do this because keeping the backup on the host nearly pushes me over our quota and makes my mail server act up. I have to use SFTP because of the 2 GB file-size limit with FTP, but I would also be open to using SSH and some form of secure copy (scp).

I know very little about UNIX, so I'm looking for some help from the UNIX gurus out there. It doesn't have to be completely automated, but a script would be very nice. I do have SSH access to my webhost. At a minimum I need a way to have the download sent manually to an SFTP server running on my OS X box, or better yet, some kind of automated script that fetches the backup from my webhost once a week (I've pasted a rough guess at what I mean at the end of this post). I can probably talk to my host about a script and cron job on their end to build the backup archive once a week.

I have read that OS X has a built-in SFTP server, but I have no idea how to configure it. I tried CrushFTP today and got it to connect and log in, but when the download tried to start I got a "550 Access not allowed" error. Then I remembered the 2 GB limit with FTP, so I'm guessing that was the cause, since everything else seemed to be set up correctly: it got through my firewall, logged in successfully, etc. So at least that part is working.

Can someone help me, please? Any help will be greatly appreciated.

Rad...
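
P.S. To make the question more concrete, this is the kind of weekly fetch script I had in mind after some googling. I have no idea if it's right, and the hostname, username, and paths are all placeholders I made up:

    #!/bin/sh
    # Pull this week's backup archive from the webhost over SSH.
    # "myuser", "example.com", and both paths are made-up placeholders.
    scp myuser@example.com:backups/site-backup.tar.gz \
        "$HOME/Backups/site-backup-$(date +%Y-%m-%d).tar.gz"

and then something like this line in my crontab to run it every Sunday at 3am (again, just a guess, with a made-up path):

    0 3 * * 0 /Users/rad/bin/fetch-backup.sh

I gather I would also need to set up an SSH key so it doesn't prompt for a password when cron runs it, but I haven't gotten that far yet.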
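
Also, on the built-in SFTP server: from what I've read it's just part of Remote Login (sshd), so I'm guessing either of these would turn it on, but please correct me if I'm wrong:

    System Preferences > Sharing > check "Remote Login"

or from Terminal:

    sudo systemsetup -setremotelogin on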