Hi, I have been looking for ways to back up and restore a database to and from a file. I found the code below, but it only handles a single table, and I want to do an entire database. How would I adapt this to work with an entire database?

Backup:

PHP:
<?php
$tableName  = 'mypet';
$backupFile = 'backup/mypet.sql';
$query      = "SELECT * INTO OUTFILE '$backupFile' FROM $tableName";
$result     = mysql_query($query);
?>

Restore:

PHP:
<?php
$tableName  = 'mypet';
$backupFile = 'mypet.sql';
$query      = "LOAD DATA INFILE '$backupFile' INTO TABLE $tableName";
$result     = mysql_query($query);
?>

Cheers,
Adam
Well, I suppose you could do a for loop, or better yet a foreach loop over all the tables - just store the table names in an array first, something like the sketch below.
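A minimal sketch only, reusing the mysql_* API and the SELECT ... INTO OUTFILE approach from the first post; the connection details and the backup directory are placeholders:

PHP:
<?php
// Dump every table in one database to its own file.
// SHOW TABLES gives the list; each table then gets its own OUTFILE dump.
mysql_connect('localhost', 'user', 'password');   // placeholder credentials
mysql_select_db('mydatabase');                    // placeholder database

$backupDir = 'backup';   // must be writable by the MySQL server process
$tables    = array();

// Collect all the table names in the current database
$result = mysql_query('SHOW TABLES');
while ($row = mysql_fetch_row($result)) {
    $tables[] = $row[0];
}

// Dump each table to its own file
foreach ($tables as $tableName) {
    $backupFile = $backupDir . '/' . $tableName . '.sql';
    mysql_query("SELECT * INTO OUTFILE '$backupFile' FROM `$tableName`");
}
?>

Two caveats: the OUTFILE is written by the MySQL server itself (so the path has to be writable by MySQL, and it refuses to overwrite an existing file), and it only exports the data, not the table structure.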
Use BigDump to restore the database (the best way, even for large databases), and use the function here to dump the database to a file for download: http://www.awesomephp.com/?Tutorials*18/Downloading-a-MySQL-database-to-a-file.html

Peace,
Yeah, if you're using cPanel it's already built in. But as far as your scripts go, you're definitely going to have to run a loop.
Never had it work for backups - why? MY BACKUPS ARE USUALLY TOO BIG. The problem with using PHP to create backups of any REAL data set is that the result is often too big for PHP to let you download, the backup takes longer than the PHP maximum execution time limit (and often longer than the Apache connection time limit), and there is a whole host of other 'issues' that make doing it from PHP - or any other high-level language, for that matter - a really BAD idea. Which is why I usually end up having to resort to the command line.
Navicat (commercial stand-alone app) - has a free trial, so you can test it and use it until it expires.

Or MySQL Dumper (free PHP script) - uses an interesting concept that phpMyAdmin doesn't: it dumps the data in segments, so it doesn't lag the server much or exceed the max script execution time - roughly the idea sketched below.
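A rough sketch of the general technique only, not MySQL Dumper's actual code; the table name, batch size, file path and script name (dump.php) are made up for illustration:

PHP:
<?php
// Dump a table in segments: fetch a limited batch of rows per request,
// append them to the dump file, then redirect to the next offset so each
// segment runs inside a fresh execution-time limit.
mysql_connect('localhost', 'user', 'password');   // placeholder credentials
mysql_select_db('mydatabase');                    // placeholder database

$tableName = 'mypet';                             // placeholder table
$batchSize = 1000;                                // rows per segment
$offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

$result = mysql_query("SELECT * FROM `$tableName` LIMIT $offset, $batchSize");
$lines  = array();
while ($row = mysql_fetch_assoc($result)) {
    $values  = array_map('mysql_real_escape_string', array_values($row));
    $lines[] = "INSERT INTO `$tableName` VALUES ('" . implode("','", $values) . "');";
}
file_put_contents('backup/mypet.sql', implode("\n", $lines) . "\n", FILE_APPEND);

if (count($lines) == $batchSize) {
    // More rows to go: hand off to the next segment.
    header('Location: dump.php?offset=' . ($offset + $batchSize));
}
?>

(NULLs and binary columns need more care than this, but that's the general shape of it.)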
Breaking the processing into batches is the smart approach - I've always been surprised phpMyAdmin never actually added that capability. The only problem with the approach is that you lose index locking, so you have to 'take offline' anything that would modify the database before you back it up, or you could get corruption - making it unsuitable for things like high-traffic forums.
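For what it's worth, if the whole dump can run inside a single script/connection, a global read lock is one way to keep writers out for the duration - a small sketch with placeholder credentials (it doesn't help when each segment is a separate HTTP request, because the lock dies with the connection):

PHP:
<?php
mysql_connect('localhost', 'user', 'password');   // placeholder credentials
mysql_select_db('mydatabase');

mysql_query('FLUSH TABLES WITH READ LOCK');  // block writes for a consistent snapshot

// ... run the table-by-table or segment-by-segment export here ...

mysql_query('UNLOCK TABLES');                // let writers back in
?>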
I hope they do implement it in a future build. phpMyAdmin can get nasty on high-volume virtual hosts and it would really help a lot.
mysqldump/pg_dump and the like are the right tools to use here. They come with lots of options, and guess what - they are made by the same people who made the databases, so they do the job better than anything else. In your example code you store the data only, not the schema. Say you handle the schema in a separate query - what about sequences, LOs (large objects) and other stuff that you cannot get with a simple 'SELECT * FROM' query? There are lots of situations that a simple script can't handle, or handles but not well enough. The catch is that you need SSH/telnet access to the machine, which isn't always available - but if you do have access, use those tools; a rough example of driving them from PHP is below.
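A sketch only, with placeholder credentials and paths (and in real use you'd keep the password out of the command line):

PHP:
<?php
// Shell out to mysqldump: unlike SELECT ... INTO OUTFILE, this captures the
// schema as well as the data, plus stored routines and triggers.
$host = 'localhost';                      // placeholders
$user = 'backupuser';
$pass = 'secret';
$db   = 'mydatabase';
$file = '/path/to/backup/mydatabase.sql';

// --single-transaction gives a consistent InnoDB snapshot without locking;
// --routines and --triggers pull in stored code along with the tables.
$cmd = sprintf(
    'mysqldump --single-transaction --routines --triggers -h %s -u %s -p%s %s > %s',
    escapeshellarg($host),
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($db),
    escapeshellarg($file)
);
exec($cmd, $output, $status);

// Restoring is the reverse: pipe the dump back through the mysql client, e.g.
// exec(sprintf('mysql -h %s -u %s -p%s %s < %s', ...));
?>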
phpMyAdmin is great for making quick changes, testing queries, viewing the structure of a table, and a few other quickies that don't come to mind at the moment. It's hardly an option for creating and restoring backups though, the only exception being restoration of table structures before a data import. The MySQL command-line utilities like mysqlhotcopy, and the INTO OUTFILE syntax, are several times faster and after a while even seem easier to use.