Could someone please spend the time and explain to me how to download and extract a large database to my PC from the server via SSH? I have used SSH before (I use Tunnelier as my client) and I have worked with php.ini files before with vi, etc. But how do I download a large database with SSH? I assume I go su - and log in as admin, right? Then what? Also, I am doing this for a client who has shared hosting; is this even going to be allowed by the host? I know some hosts don't allow SSH. Thanks in advance.
You need to copy the .sql file from the server using FTP and SSH: just copy the .sql file to some folder in the FTP account, then use FTP to download it to your local system. Regards, Alex
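In case it helps to see that spelled out, here's a rough sketch of that approach with a command-line FTP client (the paths, hostname, and filename are placeholders, so adjust them to the actual account):

# On the server (over SSH): copy the dump into a folder the FTP account can see
cp /path/to/database.sql /home/ftpuser/public_ftp/

# On your local machine: fetch it with a command-line FTP client
ftp ftp.example.com
ftp> cd public_ftp
ftp> binary              # make sure the transfer is binary, not ASCII
ftp> get database.sql
ftp> bye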
Do you have ANY host that you can FTP the file to? Once it's on FTP and web-accessible you can grab it using the wget or fetch command. wget http://www.domain.com/database.sql That will grab it immediately. Oh, you can also gzip the file, upload it to a large dump site, and then grab it with a direct link.
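To flesh that out a little, a sketch of the gzip-then-wget route (the filename, docroot path, and domain are placeholders):

# On the server (over SSH): compress the dump and drop it somewhere web-accessible
gzip database.sql                          # produces database.sql.gz
mv database.sql.gz /home/user/public_html/

# On your local machine: grab it over HTTP and unpack it
wget http://www.domain.com/database.sql.gz
gunzip database.sql.gz

Just remember to delete the .gz from the docroot once you've downloaded it, since it's publicly reachable while it sits there.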
Hey - thanks for responding, will check that out for sure. Awesome sauce, going to the big G now. Thanks for the response as well.
Yeah, I think scp will be the best choice: scp username@ipaddress:/full/path/of/file . This will copy the file to the directory you are currently in (the trailing dot does that). I hope that's the right syntax; I have used this command a lot of times on my college LAN.
Use scp and change the cipher used to get a slight performance boost: scp -c blowfish username@server:/path/to/file/on/server localpath Blowfish is a faster cipher and can be used when transferring large files.
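For completeness, here's how a full scp pull might look; the path, username, and hostname below are placeholders, and -C (stream compression) is an extra option not mentioned above:

scp -C username@server.example.com:/home/user/backups/database.sql .
# -C compresses the data in transit, which helps a lot with plain-text .sql files
# the trailing dot saves the file into your current local directory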
If I were you, this is how I would do it. First, assume you have the database file "database.sql" somewhere on the server. 1) SSH into the server to access the file. 2) Locate that database file and compress it (important) (see here for more detailed instructions). 3) Once done, log out from the server. 4) Use Secure File Transfer Protocol (SFTP) to download the file: once connected, change to the appropriate directory and download the file using the "get" command, e.g. as in the sketch below. That should do it! The trick is to compress the database file first to keep bandwidth usage to a minimum.
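A minimal sketch of steps 2 and 4, assuming the dump is called database.sql and lives in /home/user/backups on the server (both are placeholders):

# Step 2, on the server after SSHing in: compress the dump
gzip /home/user/backups/database.sql       # produces database.sql.gz

# Step 4, from your local machine: open an SFTP session and pull the file
sftp username@server.example.com
sftp> cd /home/user/backups
sftp> get database.sql.gz                  # lands in your current local directory
sftp> bye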
I would assume that you do NOT have an .sql file generated, and just need to make a fresh dump for download. For that I always just use (assuming *nix): mysqldump -uUsername -p dbname | gzip > dbname.gz (replace the parts in italics with the appropriate values), then download the .gz via FTP. I mean, you DO have FTP access as well, right? (SSH without FTP access would be quite... odd.) If not, nks had it right: use 'get'. When it comes time to restore it, you just reverse the process: gunzip < dbname.gz | mysql -uUsername -p dbname Oh, and it will prompt you for the password for the appropriate MySQL user name.
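Putting that together with the scp suggestions earlier in the thread, a rough end-to-end sketch (Username, dbname, and the hostname are placeholders; each mysqldump/mysql call will prompt for the MySQL password):

# 1) On the server (over SSH): dump and compress in one go
mysqldump -uUsername -p dbname | gzip > dbname.gz

# 2) On your local machine: pull the compressed dump down
scp username@server.example.com:dbname.gz .
# (the remote path is relative to your home directory on the server)

# 3) Restore it into a local database (create the empty database first)
gunzip < dbname.gz | mysql -uUsername -p dbname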
Hey guys, thanks for all the advice. What I wound up doing was exporting the SQL via cPanel on a direct-link download (1.7 gigs). Then I uploaded the 1.6 gigs to a directory. Then I used BigDump.php with the settings at 100 rows per session, and that worked awesomely. THANK YOU to everyone in this thread for helping me.