Anyone know a fast way to transfer 300 GB worth of files to a new server? I'm currently using:

scp -rv user1@xx.xx.xx.xx:/home/user1/public_html/myfiles/ /home/user1/public_html/myfiles/

Does scp support resume? There's a high chance that my connection will drop.
Use rsync instead of scp. rsync can continue a broken transfer. Syntax as follows:

rsync --partial --progress --rsh=ssh local_file_or_directory user@host:remote_file_or_directory
Is there a faster way to transfer? I have about 1.5 million files and each file takes around 1 second to transfer, so if I calculated correctly it's going to take roughly 17 days.
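With that many small files, the per-file overhead is what kills you, not bandwidth. One common workaround is to stream the whole tree as a single tar archive over ssh instead of copying file by file. A sketch, reusing the host and paths from the earlier posts as placeholders:

```shell
# Run on the destination server. tar writes one continuous stream to
# stdout on the source, ssh carries it, and the local tar unpacks it.
# No archive file is written on either side, so no extra disk space
# is needed.
ssh user1@xx.xx.xx.xx "tar -C /home/user1/public_html -cf - myfiles" \
  | tar -C /home/user1/public_html -xf -
```

The trade-off: a tar stream can't resume, so it suits a first bulk copy, with an rsync pass afterwards to pick up anything that was missed or changed.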
An option would be -z with --compress-level=1 in rsync. This compresses the data as it goes over the network (using zlib), so the transfer itself is faster on a slow link. Syntax as follows:

rsync --partial --progress -z --compress-level=1 --rsh=ssh local_file_or_directory user@host:remote_file_or_directory

Note that rsync compresses the stream in transit only; nothing compressed is written to disk. You could compress manually with gzip first, but I suggest just using rsync with the -z option.
If I'm compressing, wouldn't I need to double my storage space? I only have 10 GB of space left. Also, I tried that command earlier and it didn't work; it just says "skipping directory /home/deliciousmanga/public_html/wp-content/manga/."
The best way is to use rsync, as it will resume copying from where it left off if the transfer gets disconnected. Run the following command on the destination server:

rsync -aHv --partial --progress -e "ssh -p 22" user1@xx.xx.xx.xx:/home/user1/public_html/myfiles/ /home/user1/public_html/myfiles/

You can run the command under "screen", so even if the SSH session gets disconnected the command will continue to run, and you can reattach to the screen to view its current status.