I have a master PHP file that executes another PHP file on a remote server via file_get_contents($url), where $url is the path to the remote PHP file. I have a database with details of all the remote URLs, and I want to have them executed at a random or specific interval that I can set (say every 12 hours, or every 15 minutes, etc.). So when I run the master PHP file, I can specify the interval and it will execute the URLs accordingly. Is there a way I can do this within the master PHP file? I know I could set up a cron job to run the master file every xxx minutes or whatever, but can I do that from within the PHP file itself? Ideas?
Well... you could run the PHP file and add a header() pointing to the file itself, so it re-runs every... say, 15 minutes, 5 minutes, or whatever you want. You would have to add a time check for each of the files you want to run, though. For example, if you want one to run every 15 minutes and the header refresh fires every 15 minutes, it wouldn't need any configuration; but if you want one to run only at longer intervals (say every 12 hours), you would have to add a separate check for that. Alternatively, you could use AJAX together with PHP to run the script continuously and check the time that way.
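To illustrate, here's a rough sketch of the header() refresh idea with a per-URL time check. The URLs, intervals, and last-run timestamps are made up for the example; in practice the last-run times would be read from and written back to your database.

```php
<?php
// Sketch: the page refreshes itself every 15 minutes via header(),
// and a per-URL timestamp decides which remote scripts are actually due.

$now = time();

// Hypothetical job list: url => interval in seconds (15 min, 12 h).
$jobs = [
    'http://example.com/a.php' => 900,
    'http://example.com/b.php' => 43200,
];

// In practice you'd load these timestamps from your database and
// persist them again after each run; an array stands in here.
$lastRun = [];

$ran = 0;
foreach ($jobs as $url => $interval) {
    if (!isset($lastRun[$url]) || $now - $lastRun[$url] >= $interval) {
        // file_get_contents($url);   // execute the remote script
        $lastRun[$url] = $now;        // write this back to the db
        $ran++;
    }
}

// Ask the browser to reload this script in 15 minutes (900 seconds).
// Must be sent before any output when running under a web server.
header('Refresh: 900');

echo $ran;  // on a first run with no stored timestamps, both jobs are due
```

Note this only keeps firing while a browser has the page open, which is the main weakness of the refresh approach compared to cron.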
Thanks for the reply. I was also thinking of using sleep(). The interval I specify might vary. Say it is 48 hours (2880 minutes). I would divide that interval by the number of URLs in the database, since that number will probably grow over time. So if I have 100 URLs in the db, I would want to execute each URL every 28.8 minutes (2880/100). The interval has to be specifiable by the user, and maybe then stored in the database. When the script runs, it checks the db for the interval, counts the URLs, works out the per-URL delay, and then uses 'some method' to pause the script for that many seconds before it runs the next URL. What's the best way in your opinion?