I have a huge site with a folder that contains around 40 GB of files. I'm moving servers, so I need to put that whole folder into an archive so I can hopefully just wget the archive over to the new server via SSH. But for some reason, whenever the archive gets to about 4 GB it seems to start all over again: it goes back to zero, climbs to 4 GB, and then drops back to 0. Is it timing out or something?
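In case it helps, this is roughly what I'm doing (the paths and hostname are just placeholders, not my real ones):

    # on the old server: pack the folder into a compressed archive
    tar czf site-backup.tar.gz /home/user/public_html

    # then, logged into the new server over SSH, pull the archive down and unpack it
    wget http://oldserver.example.com/site-backup.tar.gz
    tar xzf site-backup.tar.gz

It's while creating that site-backup.tar.gz that the size keeps wrapping back to zero.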
Since you have 40 GB of data, I suggest you use rsync instead. That way, if the connection times out, rsync will be able to pick up from the last file that was being transferred.
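A minimal sketch, assuming you have SSH access to the new server (the paths and hostname are placeholders):

    # --partial keeps partially-transferred files so an interrupted run can resume,
    # --progress shows per-file progress, -avz = archive mode + verbose + compress
    rsync -avz --partial --progress /home/user/public_html/ user@newserver:/home/user/public_html/

Re-running the same command after a dropped connection just carries on with whatever is left.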
As tanfwc says, rsync is your man, especially if you're having weird problems with tar (I've never heard of tar doing that, so it is weird). Here's a command line:

    rsync -avrtz <localdir> user@remotehost:/remote/dir

That will transfer the files recursively and compress them on the fly.
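If you want to see what it would transfer before committing to it, the same command with -n (dry run) added just lists the files without copying anything (same placeholders as above):

    rsync -avrtzn <localdir> user@remotehost:/remote/dir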
Hey turiel, how can I use rsync to move a large host? Let's say I have lots of folders inside host A?
Well, you can use wildcards like dir* or simply * for everything. Or you can specify multiple directories on one line, e.g.:

    rsync -avrtz <localdir1> <localdir2> <localdir3> user@remotehost:/remote/dir
Yes, that should work, assuming the remote host has the same user and directory structure (i.e. /home/user/public_html exists there). To be a little more careful:

    cd /home/user/public_html
    rsync -avrtz * user@ip:/home/user/public_html

But it's pretty much the same, probably.
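One rsync detail worth knowing: a trailing slash on the source means "the contents of this directory", so you can skip the cd step and write it as (same placeholder paths):

    rsync -avrtz /home/user/public_html/ user@ip:/home/user/public_html

One caveat versus the * version above: the shell's * won't match hidden dotfiles in the directory, while the trailing-slash form copies everything.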