Do remember to make backups of your databases on a regular basis, and make sure your backups actually work. Yesterday I managed to drop the wrong table while doing maintenance, and my recent backups turned out to be corrupted zip files (they were 190 KB in size instead of the 19 MB they should have been). The most recent working backup of the database was from October 30, and its copy of the dropped table was about 9,000 rows smaller than the table was before the drop command. I emailed my web host and asked if they could roll back to an earlier backup of my db and, well, they said no. If they had the time, they could perhaps find my lost db for $130 per hour. So I guess it's up to me now to work out how much three months of non-profit work is worth to me in $$. Life just sucks sometimes.

The reason I was doing maintenance in the first place was that my hosting company upgraded the MySQL server and everything was going dead slow after that. I thought I'd do some optimization before complaining to the host. And well... nothing optimizes a database more than a couple of dropped tables, does it?
Unlucky, Bezzen :-( I did that once on a customer's site; lucky for me, he had taken a backup and emailed it to me just before I started work on the project. I feel for you.
A software upgrade badly corrupted my website application and data. It was irreversible. I panicked. Then I remembered that I had a current backup of the entire website (application and data) on my flash drive. That saved two years of work.
I back up my database every 6 hours on the server (via crontab); each dump gets a new filename with the date and time and gets gzipped. On top of that I do incremental backups of the full site every now and then. Here is the db backup script I use:

    #!/bin/sh
    TheDate=`date +%F-%H%M`
    mysqldump --opt --no-create-db --user=myusername --password=mypassword databasename > ./database/databasename-$TheDate.sql
    gzip -9 ./database/databasename-$TheDate.sql

I guess you could write a curl script that fetches the file from your server at regular intervals, but I am too lazy for that. I had my share of mishaps too, so that's why I implemented this.
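If you did want to go the curl route, here is a minimal sketch of a fetcher that could run from cron on a second machine. The URL is made up, and it assumes the dump script above also copies the newest dump to a fixed name like latest.sql.gz (and that the backup directory is password-protected):

    #!/bin/sh
    # hypothetical offsite fetcher, run from cron on another machine;
    # saves each fetched dump under a timestamped name
    TheDate=`date +%F-%H%M`
    curl -o ./offsite/databasename-$TheDate.sql.gz http://www.example.com/database/latest.sql.gz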
The host I had the problem on doesn't allow cron jobs, unfortunately. I used to have a cron job backing up on my old host. But come to think of it, couldn't I crontab a php script at another host that regularly calls a backup script on my cron-disabled host?
I back up regularly using the software for the particular database that I am using, so I back up phpBB from within the phpBB control panel, and the same with my directory. I have tried doing it directly in MySQL, but it never seems to work, so I use the software instead.
Yes, if you write the backup script in php (the example above is a shell script). ultimatehandyman: it depends on the software you use and the size of the database. My database is too large to be read in through phpMyAdmin, while a simple mysql command line script does the job in a couple of seconds. For backups, a mysqldump is also way quicker and easier to automate.
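To give an idea of the command line route, restoring a dump is a one-liner; a sketch with an invented username and filename:

    # feed a mysqldump file back into the database from the shell
    mysql --user=myusername --password=mypassword databasename < databasename-backup.sql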
Does anyone have a good mysql dump script that will then export via FTP to as many different servers as possible? This would save problems in the future.
Something to get you started:

    <?php
    $User = "";
    $Password = "";
    $Database = "";
    $File = "";
    // run mysqldump through the shell and write the dump to $File
    $Results = shell_exec("mysqldump --allow-keywords --opt -u$User -p$Password $Database > $File");
    ?>

You can probably find an automatic ftp script or ftp tool somewhere else.
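For the FTP leg, curl can upload as well as download, so a rough sketch for pushing a dump to several backup servers might look like this (the hosts, credentials and filename are all placeholders):

    #!/bin/sh
    File=./database/databasename-backup.sql.gz
    # upload the dump to each FTP server in the list; curl -T uploads the
    # local file, appending its name when the URL ends in a /
    for Target in ftp://user1:pass1@backup1.example.com/db/ ftp://user2:pass2@backup2.example.com/db/
    do
        curl -T $File "$Target"
    done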
I hate to bring up an old topic, but I just figured out how to do this right, so here it goes: I back up the databases with a unix shell script every 6 hours. I bought a cheap computer on Ebay for $55 (500MHz, 128MB, 13GB, CDROM), you can see it here: The Sarge

Then I followed this really excellent tutorial: http://www.howtoforge.com/mirroring_with_rsync to rsync my database backup folder and some of the files under public_html with the linux machine at home. The advantage is that rsync will ONLY copy files that are new, so even though I have quite a bit of stuff on my accounts, the amount of traffic due to incremental backups is extremely low. The two machines authenticate over a 2048-bit RSA key, so I am less worried about security.

Total cost for peace of mind: $55 + $25 shipping + $40 (Debian book). I thought you might enjoy the simplicity. No need for fancy ftp scripts.
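For anyone curious, the heart of that setup is just a couple of rsync calls run from cron on the home machine; a minimal sketch, assuming made-up paths, hostname and key name (the tutorial above covers the real details):

    #!/bin/sh
    # pull only new/changed files over ssh;
    # -a preserves permissions and timestamps, -z compresses in transit
    rsync -az -e "ssh -i /root/.ssh/backup_key" myuser@myserver.example.com:database/ /backup/database/
    rsync -az -e "ssh -i /root/.ssh/backup_key" myuser@myserver.example.com:public_html/ /backup/public_html/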
andre75, thanks for coming back and showing us your new setup; pretty cool way of doing things. I might have a rethink about how I am backing up, as your method would save me a lot of time by the looks of it.
I remember this infomercial about one of those grilling ovens: "you set it and forget it". That's pretty much the strength of my backup solution. Whether I am at home or not, whether I am on vacation or not, the backups run. I can even log in through a remote shell on my linux machine (putty can be downloaded even in an internet café) and restore the database or any file that has been compromised. Didn't even cost much.
Also, check these guys out for remote storage. No point in backing up and leaving your info right next door to the original copy: http://mozy.com/mozy/overview
What are you talking about? My backup goes straight to my $55 Linux machine that runs 24/7 at home (3,000 miles away from my web server). I believe this is by far the cheapest solution, and it's just as safe (the chance of the web server AND my linux machine both being compromised is close to zero).
Good for you! What I am talking about is that *there are some people out there* who make backups and then place them on a shelf beside the computer. Mozy is a free solution (up to 2 GB, I believe) which, although *you* obviously do not need it, some people might find helpful. Not everyone has a $55 Linux box running at home 3,000 miles from their webserver which they can copy files onto... or perhaps I am wrong.
Um, my apologies. I didn't check it properly; I thought it was another plug for another service. Didn't know they were free. On the other hand, I wouldn't give my data (which includes the user base) away. A big issue is that users trust you. However, I agree with you that you should keep backups physically separated. I actually keep my pictures on an external HDD that I take with me to my workplace, in case I have a fire at home or something else that destroys all data. It's often overlooked but very important.
No worries. I agree wholeheartedly re sending off private data etc., but for bits of code and other non-private data it's better than nothing.