I want to back up all my sites on a remote server (both Windows and Linux). I'm looking for a script that can zip each site on the server and then break the archive into small pieces (50 to 100 MB) so that I can copy them down with wget.
You can use rsync to skip a few steps and compress, transfer, and uncompress all your data in one go with minimal hassle. Or, if you prefer a full backup, you can use tar and pipe the data through SSH to the destination: tar cz <my_folder> | ssh <backup_server> "cd <backup_dir>; tar xz"
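If you still need the fixed-size pieces the question asks for (so each chunk can be fetched with wget), a minimal sketch using tar and split, assuming /var/www/mysite is the site root and /tmp is the staging area (both placeholder paths):

    # create a compressed archive of the site (placeholder paths)
    tar czf /tmp/mysite.tar.gz /var/www/mysite

    # break it into 50 MB pieces: mysite.tar.gz.part-aa, -ab, ...
    split -b 50M /tmp/mysite.tar.gz /tmp/mysite.tar.gz.part-

    # after fetching the pieces with wget, reassemble and verify
    cat mysite.tar.gz.part-* > mysite.tar.gz
    tar tzf mysite.tar.gz > /dev/null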
It's easier to perform the backup from Linux. As boltok suggests, you can create a tar.gz archive on the remote server and download it. Doing the same on Windows is somewhat harder.
I'd vote for rsync too. One thing to watch when zipping things up is that it's very processor-intensive, and if 100 MB counts as a 'small piece' you might find your server crawling along for a while. Much better to rsync only the changes over.
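For example, a minimal sketch of an incremental sync; the paths and host name are placeholders:

    # copy only files that changed since the last run; -a preserves
    # permissions, ownership and timestamps; the trailing slash on the
    # source means "contents of" rather than the directory itself
    rsync -a /var/www/mysite/ user@backupserver:/backups/mysite/

Matt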
Not sure when it was introduced, but rsync 3 can certainly compress files during transfer: -z, --compress (compress file data during the transfer). Obviously you may or may not want to do that, since the compression costs CPU time. Some backup solutions allow the data to be compressed on the target server whether or not it's compressed during transfer.
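A sketch of the same transfer with on-the-wire compression (same placeholder paths as above):

    # -z compresses data only while it crosses the network; the files
    # arrive uncompressed on the target
    rsync -az /var/www/mysite/ user@backupserver:/backups/mysite/

Matt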