Hi, I have a script that runs once a month on my site and updates our database with geo coordinates. The dump contains 100K rows of data. This all worked fine on my development server, but I've just found out that my hosting company doesn't support LOAD DATA INFILE on their cloud databases for security reasons. Does anyone know an alternative way to do this update? Inserting the data normally, without LOAD DATA INFILE, takes about 5 minutes. Thanks in advance for your help...
Can't you just write an import script yourself that breaks the data into 1K blocks and runs them 100 times?
Hi, thanks for your reply... I don't fully get the idea of 1K blocks. I guess it means we would loop through 1,000 rows, wait a bit, then loop through the next 1,000, and so on until all 100 batches are done. That would spread the load but wouldn't speed things up much on its own. Do you know how I can insert a 1K block in one go?
Try the import feature of phpMyAdmin (phpmyadmin.net). As far as I remember, it allows importing from files and can split a large import into a series of queries to work around server limitations.
Oops. In that case, do it as EricBruggema has suggested: write a script that performs some number of queries at a time, sleeps, and then performs the next batch. OR use ordinary INSERT (or UPDATE) statements and loop through all the rows in one go. Run it once every 10 minutes for a week, and when your hosts ask why you are loading their server that much, ask them to enable LOAD DATA INFILE.
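Something like this might do it. Just a rough sketch: I'm assuming PHP with mysqli, a CSV dump called geo.csv with city/lat/lng columns, and a table called geo_coords, so adjust all the names to your own schema:

<?php
// Rough sketch: batch the dump into multi-row INSERTs of 1,000 rows each.
// Assumptions (adjust to your setup): dump is geo.csv with columns
// city,lat,lng and the target table is geo_coords.
$db = new mysqli('localhost', 'user', 'pass', 'mydb');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}

$batchSize = 1000; // keep batches small enough to stay under max_allowed_packet
$values = array();

$fh = fopen('geo.csv', 'r');
while (($row = fgetcsv($fh)) !== false) {
    // Escape the string column; cast the coordinates to float.
    $city = $db->real_escape_string($row[0]);
    $lat  = (float) $row[1];
    $lng  = (float) $row[2];
    $values[] = "('$city', $lat, $lng)";

    if (count($values) >= $batchSize) {
        flush_batch($db, $values);
        $values = array();
        sleep(1); // breathe between blocks so the shared server isn't hammered
    }
}
fclose($fh);

// Don't forget the final partial batch.
if ($values) {
    flush_batch($db, $values);
}

function flush_batch(mysqli $db, array $values)
{
    // One multi-row INSERT per block instead of 1,000 single INSERTs.
    $sql = 'INSERT INTO geo_coords (city, lat, lng) VALUES ' . implode(',', $values);
    if (!$db->query($sql)) {
        die('Insert failed: ' . $db->error);
    }
}

The multi-row INSERT is the important bit: sending 1,000 rows per query cuts the round trips from 100,000 to 100, which is likely where most of your 5 minutes is going.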
Download http://www.ozerov.de/bigdump/ and upload it to your server. Test it out by running it in intervals of, say, 10,000 rows and see how you get on. After that, just set up a cron job to run it on the first of each month or something.
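If that works for you, a crontab entry along these lines would kick it off at 3am on the first of each month (the URL is just a placeholder, point it at wherever you upload the import script):

0 3 1 * * wget -q -O /dev/null "http://www.example.com/bigdump.php"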