Hi. I had a total horror story two weeks ago. The web company I use to manage a very big web site deleted my whole database (1 GB+). As angry as I was, I thought, OK, just get the backup installed and hopefully the loss of data will be minimal. I called the hosting company: PLEASE PLEASE reinstall my database, it's MEGA urgent. Reply: no problem, Mr Smith, we'll call you when it's reinstalled. Two hours later I get a call: "Err, we can't find your backup from yesterday?" I asked them what the date of the latest backup was, and they said, "All backups have been overwritten." I LOST EVERYTHING. We're finally back up and running, but with less cash in my pocket. What should I do to prevent this from ever happening again? The hosting company is now backing up every 24 hours, but I still want more security. Help! Does anyone else run databases of this size?
Well, I don't see how in the world someone deletes your database by accident; they must have been snooping around in your account or something. Second, I would advise you to find a new host in case that ever happens again. To answer your question: I would get another hosting account and use it to keep your own backups on — storing them via FTP is a good idea. Then again, most web hosting companies are not directly responsible for keeping a backup of your files, so you need to back your things up yourself, because you never know what can happen one second later.
Write a script, or find one on the internet, that can pull your data off the server and dump it somewhere else. If you have a shell account, that's even better: you can run a few commands and have the system back up and dump the data onto another server. An offsite backup copy is very important to prevent a case like yours. If you need any solutions installed, you can approach me — I plan data recovery solutions for my company.
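To make the idea concrete, here is a minimal sketch of such a dump-and-ship script. Everything in it is an example: the database name, user, and offsite hostname are hypothetical, it assumes MySQL with the password stored in `~/.my.cnf`, and it assumes key-based SSH access to the second server is already set up.

```shell
#!/bin/sh
# Hypothetical dump-and-ship backup script -- adjust all names for your setup.
# Assumes: MySQL credentials in ~/.my.cnf, and passwordless (key-based) SSH
# access to the offsite machine.

DB_NAME="mysite"                       # example database name
DB_USER="backup_user"                  # example user with read access
STAMP=$(date +%Y%m%d_%H%M)             # timestamp so dumps never overwrite
DUMP_FILE="/tmp/${DB_NAME}_${STAMP}.sql.gz"

if command -v mysqldump >/dev/null 2>&1; then
    # --single-transaction gives a consistent snapshot of InnoDB tables
    # without locking them; pipe through gzip to keep disk usage down.
    mysqldump --single-transaction -u "$DB_USER" "$DB_NAME" \
        | gzip > "$DUMP_FILE"

    # Ship the dump to a second, offsite server (hostname is an example).
    scp "$DUMP_FILE" backup@offsite.example.com:/backups/

    # Remove the local copy once it has been shipped.
    rm -f "$DUMP_FILE"
else
    echo "mysqldump not available on this machine; nothing to do"
fi
```

The timestamp in the filename matters: it means a new backup can never overwrite an older one, which is exactly the failure mode described above.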
Thanks for the advice ... I have tried writing a PHP script to dump the database onto another server, but because the database is so large (almost 1 GB) it just fails. As for an offsite backup, I totally agree, but my database changes almost minute by minute.
Make a cron job to do the backup via a shell script, not PHP — a shell script won't hit PHP's execution time and memory limits on a 1 GB dump. Then make a second shell script, also run as a cron job an hour or two later, to push the finished dump to the other server.
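The schedule might look something like this in your crontab (edit it with `crontab -e`). The paths, times, and script names are only examples — the point is that the push job runs well after the dump job so it never copies a half-written file.

```shell
# Hypothetical crontab entries -- adapt paths and times to your own setup.

# 1:00 am -- dump the database to a local file.
# 0 1 * * * /home/mysite/bin/db_dump.sh >> /home/mysite/logs/backup.log 2>&1

# 3:00 am -- push the finished dump to the offsite server.
# 0 3 * * * /home/mysite/bin/db_push.sh >> /home/mysite/logs/backup.log 2>&1
```

Redirecting output to a log file means you can actually check later whether the backups ran, instead of finding out the hard way that they didn't.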
Yes, my DB changes every minute or less as well. I have a script that backs mine up to another server that I have. If you have root access you can always do an SQL dump, or set up a cron job to run a backup at specific intervals.
I handle my own backups and archives. The first thing you've got to plan for is the difference between backups and archives: archives are a history, backups less so. You should also consider offsite vs. onsite solutions. With a 1 GB file, your first line of defence will be onsite.

First possibility: RAID 1. This is two identical hard drives in your server. You may not have access to this if you're on shared hosting, and there's also the issue that if the data on one drive is corrupted, you may end up with two bad copies.

Second possibility: set up a cron job that makes an automated copy of the database to another local file every hour or couple of hours. Then if something crashes you lose a couple of hours at most, and restoring should be just a matter of copying one file over the top of the bad data.

Third option (in conjunction with #2): offsite backups. Here's what I do: set up an old computer with 2 or 3 large hard drives at home or at the office. In the early hours of each morning, a cron job copies all my important files from my web server to one drive on the home backup computer. When that finishes, the home computer runs another cron job that compresses the most recent backup and copies it to a second hard drive in the same computer. Then once a week I burn a copy of the compressed file to DVD and delete that week's files. You can read up on the 'rsync' Linux command so that only changed files are copied, and once in a while check your home computer to make sure everything's backing up correctly.

As long as you've got a high-speed connection at home and can handle a gig of transfer at night, this works well for me. I end up with local backups so that I've got quick, easy access to files. If that fails, I've got last night's data offsite, away from any web hosting company. Using all of that, you end up covered from several directions.
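The nightly home-side step above can be sketched roughly like this. All the hostnames, mount points, and paths are hypothetical; it assumes key-based SSH access from the home machine to the web server, and that the two backup drives are mounted locally.

```shell
#!/bin/sh
# Hypothetical nightly cron script, run on the home backup machine.
# Assumes: key-based SSH to the web server; two local backup drives
# mounted at /mnt/backup_a and /mnt/backup_b (all names are examples).

SRC="backup@www.example.com:/backups/"   # example remote path
DEST="/mnt/backup_a/webserver/"          # first drive: live mirror

if [ -d "$DEST" ]; then
    # rsync only transfers files that changed since last night, which is
    # what keeps a ~1 GB nightly job manageable over a home connection.
    # --archive preserves permissions and timestamps; --delete keeps the
    # mirror exact; --compress shrinks traffic over the wire.
    rsync --archive --compress --delete "$SRC" "$DEST"

    # Second drive: compress tonight's copy for the weekly DVD burn.
    STAMP=$(date +%Y%m%d)
    tar -czf "/mnt/backup_b/webserver_${STAMP}.tar.gz" -C "$DEST" .
else
    echo "backup drive not mounted at $DEST; skipping"
fi
```

The check that the destination directory exists is worth keeping: if a drive fails to mount, you want the script to complain rather than silently write into the root filesystem.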