I have a MySQL database on our local LAN and I'm wondering what the best way is to run a query against it and get the results up to our web server. I want this to happen automatically. The best approach I could come up with is this: the LAN box would export a CSV file automatically, then at a set time a cron job would upload that CSV to the web server through a PHP upload script. Any ideas or suggestions are greatly appreciated.
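Roughly what I had in mind for the LAN side is something like this, run from cron. It's only a sketch: the hostnames, credentials, table, query, and the upload.php endpoint name are placeholders, not our real setup.

```php
<?php
// sync_push.php -- run from cron on the LAN box.
// Connect to the local MySQL database (placeholder credentials).
$db = new mysqli('localhost', 'report_user', 'secret', 'inventory');
if ($db->connect_error) {
    exit("DB connect failed: {$db->connect_error}\n");
}

// Run the query and write the result set out as a CSV file.
$csvPath = '/tmp/export_' . date('Ymd_His') . '.csv';
$out = fopen($csvPath, 'w');
$result = $db->query('SELECT sku, qty, price FROM stock');
while ($row = $result->fetch_assoc()) {
    fputcsv($out, $row);
}
fclose($out);

// POST the CSV to the upload script on the web server.
$ch = curl_init('https://www.example.com/upload.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_RETURNTRANSFER => true,
    CURLOPT_POSTFIELDS     => ['csv' => new CURLFile($csvPath, 'text/csv', 'export.csv')],
]);
echo curl_exec($ch), "\n";
curl_close($ch);
```

Then a crontab entry like `0 2 * * * php /path/to/sync_push.php` would fire it every night, and upload.php on the web server only has to move the uploaded file and load it into the web database.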
Wait, what? You have a local database, and you want to query that database from the server? Why...? Never mind why: why not just set up a cron job on the server and have the server query your database directly?
It's not my choice; it's what we have to work with. Everything is set up locally and is staying local, so that's what I have to work with.
Tell whoever is in charge that they're an idiot. As I understand it, you don't have a database on the external server (or aren't using it), and the remote/external site just reads a CSV file with whatever values you want to update from the local database? If so, sure, you can do it your way.
Calling them an idiot doesn't really solve anything. Part of it is my fault because I didn't explain the whole situation either. It comes down to cost and to rebuilding the entire system, which has to be around 100 gigabytes. Anyway, the best way I know is scheduled tasks, cron jobs, and CSVs. I was hoping for some sort of MySQL-to-MySQL transfer without all the extra junk; that's where I was trying to focus.
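To show what I mean by MySQL to MySQL, something like this is the rough idea, assuming the web server's MySQL port is even reachable from the LAN (I haven't confirmed that). The hosts, credentials, and the stock table/columns are just placeholders.

```php
<?php
// mysql_to_mysql.php -- rough sketch of a direct row copy, no CSV step.
// Connect to the local LAN database and the web server's database.
$local  = new mysqli('localhost',       'report_user', 'secret', 'inventory');
$remote = new mysqli('web.example.com', 'sync_user',   'secret', 'inventory');

// Upsert each local row into the matching remote table.
$insert = $remote->prepare(
    'INSERT INTO stock (sku, qty, price) VALUES (?, ?, ?)
     ON DUPLICATE KEY UPDATE qty = VALUES(qty), price = VALUES(price)'
);

$rows = $local->query('SELECT sku, qty, price FROM stock');
while ($row = $rows->fetch_assoc()) {
    $insert->bind_param('sid', $row['sku'], $row['qty'], $row['price']);
    $insert->execute();
}
```

Native MySQL replication would be the cleanest version of this, but the web server would have to be able to connect back into the LAN box, which I doubt we can allow.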
You still need an interface/script between the databases, so the solution you're proposing seems viable. Although... 100 GB? Is this an online system? That sounds insanely heavy. What kind of data do you have that takes up 100 GB?