How to Put a Huge Database Dump into MySQL? (250 MB gzipped)

Discussion in 'MySQL' started by Nathan Malone, Mar 29, 2006.

  1. #1
    I have a MySQL database dump that is 250 MB gzipped, and I need to load it into a database. The problem is that my server only has 1 GB of memory and a Celeron 2.0 GHz processor. It runs Red Hat Linux, Apache, and WHM/cPanel, and it is a dedicated server, so if I crash it in the attempt, I won't be in trouble. :)

    Any suggestions for how to do this? My first inclination is to split the dump into several chunks and import one chunk at a time, but I don't know how to split it up. I have copies that are both gzipped (250 MB) and uncompressed (1 GB).

    If I had a more powerful server, I think I could do it just fine, but does anyone have suggestions for how to do it with the resources I have available?

    Thank you!
     
    Nathan Malone, Mar 29, 2006 IP
  2. inerte

    inerte Peon

    #2
    Calling mysql from the command line should do it.

    If not, get a file splitter and divide the dump into two 500 MB files. Remember to check both (the end of the 1st and the beginning of the 2nd file) to make sure a query hasn't been split across the boundary.

    If two 500 MB files still don't work, split them again into four 250 MB files. Check the starts and ends again, and keep splitting until the import goes through...
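
    If you're on Linux, the split command can do the chopping for you. Something like this should work (the line count and file names are just examples). Splitting by lines rather than bytes means a line never gets cut in half, although a multi-line INSERT still can be, so check the seams before importing:

    
    # split the uncompressed dump into chunks of 200,000 lines each
    split -l 200000 database_dump.sql dump_part_
    
    # then import each piece in order
    mysql -u username -p database_name < dump_part_aa
    mysql -u username -p database_name < dump_part_ab
    
    Code (markup):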
     
    inerte, Mar 29, 2006 IP
  3. Caydel

    Caydel Peon

    #3
    mysql reads the file sequentially and performs each command as it encounters it. It does not load the entire file into memory before importing it, so I doubt it will take too much memory. And because it works one statement at a time, it won't need a huge amount of processing power either.

    I think you should be OK - nothing there suggests trouble to me. To be on the safe side, though, attempt it during the slowest part of the day for your server, when the load is at a minimum.
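
    If you want to be extra careful, you can also run the import at a low CPU priority so the rest of the server stays responsive. Something along these lines should do (the user, database, and file names are placeholders):

    
    # run the import at the lowest CPU priority
    nice -n 19 mysql -u username -p database_name < database_dump.sql
    
    Code (markup):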

    Good Luck!
     
    Caydel, Mar 29, 2006 IP
  4. Nathan Malone

    Nathan Malone Well-Known Member

    #4
    Thank you both for the help!

    I tried uncompressing and reading the file using the following code:

    
    $gz_file = gzopen('./database_dump.sql.gz', "r");
    // this tries to read up to ~1 GB of the uncompressed dump into memory in one go
    $file_read = gzread($gz_file, 1000000000);
    $file_contents = explode("\n", $file_read);
    
    // Loop through the array, running mysql_query() on the lines if they were valid queries.
    
    Code (markup):
    And the following code:

    
    // gzfile() also pulls the entire uncompressed dump into memory, as an array of lines
    $file_content = gzfile('./database_dump.sql.gz');
    
    // Loop through the array, running mysql_query() on the lines if they were valid queries.
    
    Code (markup):
    But they both crashed the server.

    I am currently uploading an uncompressed copy of the file, and I will try importing that to see if it works.
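
    One idea I haven't tried yet is to read the gzipped file line by line with gzgets() instead of pulling the whole thing into memory. Roughly like this (untested - it assumes a mysql_connect() connection is already open, that each query ends with a ';' at the end of a line, and that comment lines start with '--'):

    
    $gz_file = gzopen('./database_dump.sql.gz', 'r');
    $query = '';
    
    // read one line at a time instead of loading the whole dump at once
    while (!gzeof($gz_file)) {
        $line = gzgets($gz_file, 1048576);
    
        // skip blank lines and comment lines
        $trimmed = trim($line);
        if ($trimmed === '' || substr($trimmed, 0, 2) === '--') {
            continue;
        }
    
        // collect lines until the statement's closing ';' and then run it
        $query .= $line;
        if (substr($trimmed, -1) === ';') {
            mysql_query($query) or die(mysql_error());
            $query = '';
        }
    }
    
    gzclose($gz_file);
    
    Code (markup):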

    If anyone has any suggestions for other code that might work, either on the compressed or uncompressed versions, please let me know.

    Thanks again for the help!
     
    Nathan Malone, Mar 30, 2006 IP
  5. Momo

    Momo Peon

    #5
    Momo, Mar 30, 2006 IP
  6. inerte

    inerte Peon

    #6
    Ack, don't do it inside PHP :) It's too slow and memory-intensive for such a huge file.

    Drop to the command line and run this:

    
    mysql -u username -p database_name < file_name.ext
    
    Code (markup):
     
    inerte, Mar 30, 2006 IP
  7. bpearson

    bpearson Peon

    #7
    shell> mysqladmin create db_name
    shell> gunzip < db_name.gz | mysql db_name
    Code (markup):
    This will load tons faster than any PHP script. The gunzip part will take some processing time, so do it when load is low.
     
    bpearson, Apr 10, 2006 IP
  8. Amilo

    Amilo Peon

    #8
    I have uploaded big databases before.

    I cannot recommend BigDump enough!
     
    Amilo, Apr 11, 2006 IP
  9. Mafishioso

    Mafishioso Well-Known Member

    #9
    I am having the same problem...

    How does this Bigdump.php actually work?...
     
    Mafishioso, Jun 3, 2007 IP
  10. Mafishioso

    Mafishioso Well-Known Member

    #10


    I've just tried this, but I got this message:

    'mysql' is not recognized as an internal or external command, operable program or batch file.

    Please help..

    :confused:
     
    Mafishioso, Jun 3, 2007 IP
  11. Golfboards

    Golfboards Peon

    #11
    It's probably in /usr/bin/mysql or /usr/local/bin/mysql or /usr/sbin/mysql (try all, or use 'locate mysql' or 'which mysql' to see if it can find it for you).
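
    For example (assuming it turns out to live in /usr/bin):

    
    # find the binary, then call it by its full path
    which mysql
    /usr/bin/mysql -u username -p database_name < database_dump.sql
    
    Code (markup):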

    You may also need to raise MySQL's packet size a bit (I set mine to 1,024 MB so I can work with some 900 MB dump files) so it can accept the larger statements in the dump - you don't necessarily need to stagger the input.
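
    For reference, the setting is called max_allowed_packet. In my.cnf it looks roughly like this (the 1024M figure is just what I happen to use, and you need to restart MySQL after changing it):

    
    # /etc/my.cnf
    [mysqld]
    max_allowed_packet = 1024M
    
    [mysql]
    max_allowed_packet = 1024M
    
    Code (markup):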
     
    Golfboards, Jun 3, 2007 IP
  12. Clark Kent

    Clark Kent Guest

    #12
    If you don't have shell access to your MySQL server, use MySQLDumper.

    http://www.mysqldumper.de/en

    Just try it; you will never use another program for backup/restore/query.
    It restores and backs up in small chunks, so it never times out, and you can watch the progress.

    Features:

    * Multilingual
    * Backup and restore of MySQL-Databases
    * Automatic sending of backups via Email or FTP
    * Automatic deleting of backups on the server following your own rules
    * Using a Perl script as a Cronjob
    * Administration and manipulation of databases
    * Backup of database-structures (without data)
    * Backup and restore of tables
    * Protecting your backup via ".htaccess file"
    * Reveals PHP and MySQL version and configuration details, and more!
    * Easy installation
    * Easy operating
    * Multipart-Backup for very large databases
    * Multi-database-Backup!
     
    Clark Kent, Jun 4, 2007 IP
  13. KalvinB

    KalvinB Peon

    #13
    I've imported Wikipedia dumps (several gigs) using the command line without any issue. You either have to include mysql in your PATH, reference mysql with an absolute path, or do everything from the mysql/bin directory. There's no progress bar, but you can use phpMyAdmin to see that tables are being created and filled. You just have to be patient - it'll be done when it's done. There's also a flag for ignoring errors (the mysql client's --force option), so you don't have to restart the process when it fails 90% of the way through your file.
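
    For instance, with the binary referenced by its absolute path and errors skipped (the path and names are just examples):

    
    /usr/bin/mysql --force -u username -p database_name < database_dump.sql
    
    Code (markup):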

    When putting large dumps into a server that doesn't allow large files or shell access, there are a couple of things you can do.

    1) Use the SQL query box in phpMyAdmin and copy and paste chunks of your file in.

    2) Write a custom script that splits the SQL queries you'll need to import into a bunch of smaller files. You then use a second script to read in the files one at a time until the time limit runs out. Typically I just have it move the files as it finishes them, so I can refresh the page to continue the process (roughly like the sketch below).

    Option 2 is how I got my Wikipedia mirror into a GoDaddy account. I found out that the 3.5 GB of data in the database doesn't count towards my allowed hosting space.
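
    Roughly, the second script's logic looks like this (just a sketch - it assumes an open database connection, that the chunk files sit in ./chunks/, that each one is small enough to load whole, and that statements end with ';' at the end of a line):

    
    // read the pre-split chunk files one at a time until we get close to
    // the PHP time limit; finished chunks get moved to ./done/ so a simple
    // page refresh picks up wherever the last run stopped
    $start = time();
    $chunks = glob('./chunks/*.sql');
    sort($chunks);
    
    foreach ($chunks as $chunk) {
        // stop well before the usual 30-second limit so we exit cleanly
        if (time() - $start > 20) {
            break;
        }
    
        foreach (explode(";\n", file_get_contents($chunk)) as $query) {
            if (trim($query) !== '') {
                mysql_query($query) or die(mysql_error() . " in $chunk");
            }
        }
    
        // mark this chunk as done
        rename($chunk, './done/' . basename($chunk));
    }
    
    echo count(glob('./chunks/*.sql')) . ' chunk(s) left - refresh to continue.';
    
    Code (markup):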

    I'll have to check out MySQLDumper. That may come in handy for backing up my GoDaddy account.
     
    KalvinB, Jun 5, 2007 IP