Hi. Taking backups of large forums isn't easy, so I'm wondering: what is the best way to keep a full backup of such a site on an external server? I found dedicated servers meant for backups, but I wonder whether that is really the best option. Thanks all.
I assume you are talking about how best to back up a large MySQL database? A hardware RAID 1+0 solution (mirrored stripe sets) would work. Or MySQL replication onto another server sitting in the same rack? How big is the database?
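If you go the replication route, a minimal sketch for MySQL 5.x looks something like this; the IPs, the 'repl' user and the password are just examples, and the log file/position should come from SHOW MASTER STATUS on the master:

# master my.cnf:  [mysqld]  server-id=1  log-bin=mysql-bin
# slave my.cnf:   [mysqld]  server-id=2

# on the master: create a user the slave can replicate as
mysql -uroot -p -e "GRANT REPLICATION SLAVE ON *.* TO 'repl'@'10.0.0.2' IDENTIFIED BY 'SECRET';"

# on the slave: point it at the master and start replicating
mysql -uroot -p -e "CHANGE MASTER TO MASTER_HOST='10.0.0.1', MASTER_USER='repl', MASTER_PASSWORD='SECRET', MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=98; START SLAVE;"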
I'm talking about a complete backup of the site (MySQL + files), and especially about a synchronized backup to another server.
Try an offsite backup package; that should cover your needs. For a 15-30 GB package the normal charge is around $5-$10 per month at various companies (rsyncpalace, webbycart, BQBackup, etc.). You can get one of those packages and take the backup there, and you can also define what you want to back up and what to leave out. Using a dedicated server for backups only is a costly option that also needs a lot of maintenance; you could use a VPS instead.
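With an rsync-based account, choosing what to back up and what to skip is just a matter of excludes. A rough sketch (user, host and paths are placeholders):

#!/bin/sh
# sync the site files offsite, skipping caches and temp data
rsync -az --delete \
  --exclude 'cache/' --exclude 'tmp/' \
  /home/forum/public_html/ \
  user@backup-host.example:forum-files/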
Get a 110mb free account. Create a cron job to gzip and FTP a backup over to 110mb and you're done. Oh, and protect the 110mb directory.
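The cron job could be as simple as this; the DB credentials, FTP login and paths are placeholders:

#!/bin/sh
# cron: 0 3 * * * /home/me/backup_to_110mb.sh
DB='forum'
NAME=$DB-`date +%Y%m%d`.sql.gz

# dump and compress the database
mysqldump -uDBUSER -pDBPASS $DB | gzip -9 > /tmp/$NAME

# upload it over plain FTP
ftp -n ftp.110mb.com <<EOF
user FTPUSER FTPPASS
binary
lcd /tmp
put $NAME
bye
EOF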
If you have a dedicated server for backups, I would suggest going for CDP backup (R1Soft). It is a bit costly, but it is worth it to have your important data backed up. Kailash
Use phpMyBackup Pro. It has 3 options, and you can pick one or several:
- Backup to a directory on the local server.
- Backup to another server via FTP.
- Send the backup to your email.
Download: http://www.phpmybackuppro.net/download.php
I use it with no problems. My database is 20 MB in ZIP mode.
20 MB zipped is roughly 150 MB as plain text. Would you still use phpMyBackup Pro to back up a database that is over 700 MB as text? For a huge database, use SSH to take the backup.
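Something like this works, streaming the dump straight over SSH so nothing large ever lands on the local disk (user, host and paths here are placeholders):

# dump, compress and ship in one pipeline
mysqldump -uDBUSER -pDBPASS forum_db | gzip -9 | \
  ssh backup@backup-host "cat > /backups/forum_db-`date +%Y%m%d`.sql.gz"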
My 450MB database gzips to about 100MB. Again: set up a local bash script, run it from cron, and have it FTP the result to a different host. phpMyBackup Pro is not acceptable IMHO. It's PHP, for one; it's a whole extra script, for two. You don't need all that. Here is my sample bash script:

#!/usr/local/bin/bash
# backup a single database, gzipped to a date-named folder
# designed to be run twice per day (folders named am/pm)
# cron something like this..
# 0 0,12 * * * /home/backups/mysql_backup.sh

# database details
username='USERNAME'
password='PASSWORD'
database='DATABASE'

# main backup folder on web server
home='/usr/backups/'

# sub-folder for mysql backups
baxdir='mysql'

# if you alter this, test it.
d=`date +%y.%m.%d_%p`

cd $home
mkdir -p $baxdir/$d
echo $database

# dump, then replace any old gzip with a freshly compressed one
/usr/sbin/mysqldump --add-drop-table --allow-keywords --databases -q -a -c -u$username -p$password $database > $baxdir/$d/$database.sql
rm -f $baxdir/$d/$database.sql.gz
gzip -9 $baxdir/$d/$database.sql

Adjust to suit your needs. That script doesn't do the FTP, but you can get another small script for that and cron it about 15-30 minutes after this one ends, or just incorporate the transfer into the backup script.
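For the FTP leg, a small companion script along these lines would do; it assumes ncftpput (from the NcFTP package) is installed, and the FTP host, login and remote folder are placeholders:

#!/usr/local/bin/bash
# companion uploader -- cron it ~30 minutes after the dump script
# e.g. 30 0,12 * * * /home/backups/mysql_ftp.sh

# must match the settings in the backup script
home='/usr/backups/'
baxdir='mysql'
d=`date +%y.%m.%d_%p`

# push the whole date-named folder to the offsite FTP host (-R = recursive)
ncftpput -R -u FTPUSER -p FTPPASS ftp.backup-host.example /forum-backups $home$baxdir/$d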
I have a ~100MB gzipped DB, and for me the best way to back up is to do it manually, not automatically. That way you control the whole process and the results. Zip your scripts and dump your DB; those are all the basic steps you need. For a large DB, use MySQLDumper or BigDump.
If you have a large amount of stuff to back up, I can highly recommend http://www.bqbackup.com; used in combination with rsync it's pretty good. Handles my 20GB tar.gz with no issues.
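A rough sketch of that kind of job, with paths, host and account as placeholders (rsync's --partial helps if a 20GB transfer gets interrupted):

#!/bin/sh
# build the archive, then let rsync move it offsite
ARCHIVE=/usr/backups/site-`date +%Y%m%d`.tar.gz
tar -czf $ARCHIVE /home/forum/public_html /usr/backups/mysql
rsync -avz --partial $ARCHIVE user@backup-host.example:backups/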
Hi. I have a usenet forum (based on vBulletin) with a DB of more than 14GB. Every night a backup is generated and sent to my home server through SFTP. I can post the script if it will help you; I spent a lot of time on it.
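Roughly, it boils down to something like this (heavily simplified; names and paths are placeholders, and it assumes key-based SSH auth so cron never asks for a password):

#!/bin/sh
# nightly: dump, compress, push to the home server over SFTP
DB='vb_forum'
FILE=/tmp/$DB-`date +%Y%m%d`.sql.gz
mysqldump -uDBUSER -pDBPASS $DB | gzip -9 > $FILE

# -b - makes sftp read batch commands from stdin
echo "put $FILE backups/" | sftp -b - backup@home-server
rm -f $FILE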