What can i use for slicing up a large database dump file??

Discussion in 'Databases' started by articledirectory, Feb 10, 2008.

  1. #1
    I have a 395 MB .sql file. I have gone down the paths of using "BigDump" and "MySQLDumper": BigDump crashes randomly, usually about halfway through, and MySQLDumper gives an error right at the start. The host hasn't/won't enable SSH, so my last resort is to split up the file and import it in sections. Does anyone know what I can use to split up the large database file?

    Regards.
     
    articledirectory, Feb 10, 2008 IP
  2. RNK Concepts

    RNK Concepts Peon

    Messages:
    74
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #2
    Use a text editor ... maybe TextPad or PHPEditor ... just cut the file up by tables ... I have used this method in the past on several occasions.
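    For what it's worth, the cut-by-tables idea can also be scripted if you have GNU coreutils handy. This is only a sketch: the file name tables.sql and its contents are made up for the demo, and it assumes your dump contains mysqldump's usual "-- Table structure for table" comment before each table, which is what csplit uses to find the boundaries.

```shell
# Split a dump into one file per table, keyed on mysqldump's
# "-- Table structure for table" marker comments.
set -e
cat > tables.sql <<'EOF'
-- MySQL dump (header comment)
-- Table structure for table `a`
CREATE TABLE a (id INT);
INSERT INTO a VALUES (1);
-- Table structure for table `b`
CREATE TABLE b (id INT);
INSERT INTO b VALUES (2);
EOF
# -s: quiet; '{*}': repeat for every match; outputs xx00 (header),
# xx01 (table a), xx02 (table b)
csplit -s tables.sql '/-- Table structure for table/' '{*}'
ls xx*
```

    Each xx file is then small enough to import on its own through phpMyAdmin.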
     
    RNK Concepts, Feb 17, 2008 IP
  3. Chuckun

    Chuckun Well-Known Member

    Messages:
    1,161
    Likes Received:
    60
    Best Answers:
    2
    Trophy Points:
    150
    #3
    A bit out of context, but not really: Crimson Editor is a better text editor for coding etc. It has full advanced features, and it's a free project :) try it :)

    Umm, yeah - just highlight entire tables of SQL and copy & paste them into separate files, naming them 1.sql, 2.sql or whatever you choose...

    Hope you succeed :)

    Chuckun
     
    Chuckun, Feb 17, 2008 IP
  4. ZooBHosT

    ZooBHosT Peon

    Messages:
    44
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    hmmm, in the past I have used Zend Studio to read the file (via Zend Core's PHP compiler) and explode(";", $sql) it into an array of SQL queries, then run:

    foreach ($sqlQueries as $query) {
        $z = mysql_query($query);
        if (!$z) { echo 'error occurred with: ' . $query . "\n"; }
    }

    What you should see then is the queries that failed (so you can manually insert them if needed); if they all fail, it means there is something wrong at the source. (Be aware that a naive explode on ";" will also split on semicolons inside quoted string data, so expect some false errors from those.)

    Warning, this method is extremely CPU intensive!

    The other option would be to use a gzip program to shrink the SQL file down in size and upload the gzipped file via phpMyAdmin as a compressed text file (using the Import tab, I believe). Try this one: http://www.download.com/7-Zip/3000-2250_4-10045185.html?tag=lst-4
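    To make the compression idea concrete, here's a minimal command-line sketch (the file name dump.sql is an assumption, and it uses gzip rather than 7-Zip since phpMyAdmin understands .gz files directly):

```shell
# Write a tiny stand-in dump, then make a compressed copy for upload.
set -e
printf 'CREATE TABLE t (id INT);\nINSERT INTO t VALUES (1);\n' > dump.sql
gzip -c dump.sql > dump.sql.gz   # -c writes to stdout, so the original is kept
ls -l dump.sql dump.sql.gz       # upload dump.sql.gz via phpMyAdmin's Import tab
```

    Plain SQL text usually compresses very well, so a 395 MB dump may shrink enough to fit under the upload limit.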
     
    ZooBHosT, Feb 17, 2008 IP
  5. LinketySplit

    LinketySplit Peon

    Messages:
    97
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #5
    If you have access to command line tools for unix or linux environments, you could use any number of tools to quickly & efficiently split up your file.

    Sed comes to mind as a good tool for this purpose.
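    As a rough sketch of the sed approach (the file name and the line ranges here are invented for the demo; with a real dump you would pick boundaries that fall between complete statements):

```shell
# Carve a dump into two pieces by line range with sed.
set -e
printf 'CREATE TABLE a (id INT);\nINSERT INTO a VALUES (1);\nINSERT INTO a VALUES (2);\nCREATE TABLE b (id INT);\nINSERT INTO b VALUES (3);\n' > whole.sql
sed -n '1,3p' whole.sql > part1.sql   # lines 1-3: everything for table a
sed -n '4,$p' whole.sql > part2.sql   # line 4 to the end: table b
wc -l part1.sql part2.sql
```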
     
    LinketySplit, Feb 17, 2008 IP
  6. kmap

    kmap Well-Known Member

    Messages:
    2,215
    Likes Received:
    29
    Best Answers:
    2
    Trophy Points:
    135
    #6
    It depends on the data.

    If you can edit the data after slicing, then it's simple: use a normal text file splitter, and then edit each slice to complete any records that were broken at the split points.
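    That workflow can be sketched with split(1) from coreutils (chunk size and names are illustrative). Note that split cuts purely by line count, so any statement that straddles a chunk boundary has to be moved by hand afterwards:

```shell
# Cut a dump into fixed-size chunks of 2 lines each.
set -e
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\nINSERT INTO t VALUES (3);\nINSERT INTO t VALUES (4);\nINSERT INTO t VALUES (5);\n' > big.sql
split -l 2 big.sql slice_   # produces slice_aa, slice_ab, slice_ac
ls slice_*
```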

    Regards

    Alex
     
    kmap, Feb 24, 2008 IP
  7. articledirectory

    articledirectory Peon

    Messages:
    1,704
    Likes Received:
    26
    Best Answers:
    0
    Trophy Points:
    0
    #7
    Thanks,

    I managed to get the job done slicing it up using a free text editor called ConTEXT.
     
    articledirectory, Feb 24, 2008 IP
  9. mjesales

    mjesales Peon

    Messages:
    326
    Likes Received:
    16
    Best Answers:
    0
    Trophy Points:
    0
    #9
    I've used sqldumper.de - it's free - and it worked great. I can back up my 2 GB article directory database in about 15 minutes.
     
    mjesales, Feb 26, 2008 IP