Ok, I just finished creating a large new admin system for my company. The information needs to be backed up on a continual basis, since it deals with sales inquiries. I'd like to set it up to download all new inquiries and all updates from the last 24 hours, once every 24 hours, to a local server and two online servers. I assume I'd do this through a curl script; I was just wondering if anybody knows of an existing system that can quickly accomplish this. I'd guess I'll get around 500-600 changes and new inquiries per day. Depending on how fast this runs, I guess I could go down from every 24 hours to every hour?
Set up a cron job that executes mysqldump (assuming you're using MySQL). Run 'man mysqldump' or look for the documentation on MySQL's website for more info.
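A minimal sketch of what that could look like, assuming MySQL, a hypothetical database name sales_admin, and login credentials stored in ~/.my.cnf so no password sits in the crontab:

    # crontab entry (add with `crontab -e`): run the backup every night at 02:00
    0 2 * * * /usr/local/bin/backup_inquiries.sh

    # /usr/local/bin/backup_inquiries.sh
    #!/bin/sh
    # Full dump of the database to a dated file; --single-transaction gives a
    # consistent snapshot without locking InnoDB tables while it runs.
    mysqldump --single-transaction sales_admin \
        > /var/backups/sales_admin_$(date +%Y%m%d).sql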
You'll need to write custom code to do that, I think. Are you wanting to replicate the database on another server, or only keep the data?
I'd have the admin system on multiple servers, so it would need to update the databases on the non-main servers once per day, or more often if it doesn't cause server lag. The reasoning behind this is that if one of the servers goes down, my sales people can just switch to a different domain, so it needs to keep the data on the other databases current. It doesn't seem like the code would be too difficult to write from scratch; all the inserts already have a field for the datetime modified. It also would need to download an update sheet to my local machine once per day, and maybe make a full download once per month, though that last one could be manual.
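Since every insert already carries a datetime-modified field, a daily delta export could be as simple as the sketch below (the table name inquiries and column name modified are hypothetical; adjust to the real schema):

    #!/bin/sh
    # Export only rows touched in the last 24 hours, keyed off the existing
    # datetime-modified column. --replace writes REPLACE statements instead of
    # INSERTs, so re-importing an updated row overwrites the old copy on the
    # receiving server rather than failing on a duplicate key.
    mysqldump --no-create-info --replace \
        --where="modified >= NOW() - INTERVAL 1 DAY" \
        sales_admin inquiries \
        > /var/backups/inquiries_delta_$(date +%Y%m%d).sql

One caveat: a dump like this only captures inserts and updates; rows deleted since the last run won't appear in it at all.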
Sorry, I misunderstood what you want to achieve here. What you're after is not that simple to code yourself: you'd need to keep track of what was last pushed to the "slave" server(s), and since the last update there may already be rows inserted, deleted, or updated, or even a changed database schema. On top of that, there are numerous potential network and synchronisation errors. Luckily, there is already an existing system that does all the hard work for you. Do some research on "MySQL replication". It propagates changes using the database's binary log files.
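A rough outline of a classic master/slave setup, in case it helps (the host name, the repl account, and the log coordinates are placeholders; the real file and position come from running SHOW MASTER STATUS on the master):

    # On the master, add to /etc/my.cnf and restart mysqld:
    #   [mysqld]
    #   log-bin=mysql-bin
    #   server-id=1
    # Each slave needs its own unique server-id (2, 3, ...) in its my.cnf.

    # Create an account the slaves can replicate through:
    mysql -u root -p -e "GRANT REPLICATION SLAVE ON *.* \
        TO 'repl'@'%' IDENTIFIED BY 'secret';"

    # On each slave, point it at the master and start replicating:
    mysql -u root -p -e "CHANGE MASTER TO MASTER_HOST='master.example.com', \
        MASTER_USER='repl', MASTER_PASSWORD='secret', \
        MASTER_LOG_FILE='mysql-bin.000001', MASTER_LOG_POS=98; \
        START SLAVE;"

Once running, the slaves replay everything from the master's binary log automatically, including deletes and schema changes, which is exactly the bookkeeping you'd otherwise have to write yourself.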
That sounds interesting. I prefer to use an existing system if one exists. The database schema won't be changed, though, and I already have datetime fields on all the inserts that need to be updated, so maybe I should just make a solution myself.
I agree with phper: it's the little edge cases, like people ending up writing to separate databases at the same time, where you're going to get into a world of pain. I'd suggest either using the enterprise tools that already exist, or maybe going the other way and virtualising your server across a number of machines...
We will only use one of the servers most of the time, and switch to another server if we lose the main one. It looks like I'll have to hand-code everything, but it shouldn't be too bad.
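If it does come down to hand-coding it, pushing the nightly delta dump (from the sketch earlier in the thread) out to the standby servers could look something like this; the host names are hypothetical, and it assumes the standbys accept remote MySQL connections with credentials stored in ~/.my.cnf:

    #!/bin/sh
    # Replay today's delta on each standby. Because the dump was made with
    # --replace, updated rows overwrite the stale copies; deletes are still
    # not propagated, which is the edge case phper warned about.
    DUMP=/var/backups/inquiries_delta_$(date +%Y%m%d).sql
    for HOST in backup1.example.com backup2.example.com; do
        mysql -h "$HOST" sales_admin < "$DUMP"
    done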