I have a script like this:

    foreach ($test as $img => $name) {
        // do stuff to 10000s of images
    }

Obviously, the script cannot finish like this. The problem is that my hosting company will not let me echo output in real time; the result is output all at once at the end. So, basically, my question is: how can I run this script under these conditions? Maybe split the foreach into pieces? But I need to be able to copy the output text after each run and only then move on to the next set of images. Any idea would be appreciated. Thanks!
You can try doing it the way vBulletin forums do it: process 100 at a time, echo the output, then redirect (via JS) to the next batch. I always found that approach annoying and it broke on me many times, but it would be a solution. You can always use the CLI for this instead of the browser.
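For illustration, here is a minimal sketch of that batch-and-redirect pattern; process.php, images.json, the batch size, and the 5-second delay are all assumptions, not part of your original setup:

    <?php
    // process.php -- sketch of the batch-then-redirect approach (names are illustrative)
    $images    = json_decode(file_get_contents('images.json'), true); // your full list
    $batchSize = 100;
    $offset    = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;

    $batch = array_slice($images, $offset, $batchSize, true);

    foreach ($batch as $img => $name) {
        // do stuff to this image and echo the result for this run
        echo htmlspecialchars("$img => $name processed") . "<br>\n";
    }

    $next = $offset + $batchSize;
    if ($next < count($images)) {
        // give yourself a few seconds to copy the output, then jump to the next batch
        echo "<script>setTimeout(function () {
            window.location = 'process.php?offset=$next';
        }, 5000);</script>";
    } else {
        echo 'Done.';
    }

Each request only does one batch, so it stays under the execution limit, and you get a pause to copy the output before the next batch starts.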
I would suggest doing it offline; however, if you must do it in real time, you can change the timeout or script execution limit, for example with set_time_limit(): http://us1.php.net/manual/en/function.set-time-limit.php. All settings related to script execution can be changed at run time.
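For example (a minimal sketch; whether these overrides take effect depends on the host's configuration, and the memory value is illustrative):

    <?php
    // Sketch: raise the runtime limits before the long loop.
    set_time_limit(0);                // 0 = no execution time limit
    ini_set('memory_limit', '512M');  // illustrative value
    ignore_user_abort(true);          // keep running if the browser disconnects

    foreach ($test as $img => $name) {
        // do stuff to 10000s of images
    }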
You could try outputting the results to a file rather than the screen; that way you could run it in blocks.
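A minimal sketch of that, assuming a results.log file and a hypothetical process_image() helper standing in for the per-image work:

    <?php
    // Sketch: append each result to a log file instead of echoing it.
    $log = fopen('results.log', 'a');

    foreach ($test as $img => $name) {
        $result = process_image($img, $name);   // your per-image work
        fwrite($log, "$img => $result\n");
    }

    fclose($log);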
Outputting to a file will still time out, as PHP exits once it reaches the max execution time. If you already have the list of files in a file or a database, then you could do it a piece at a time.
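A minimal sketch of the piece-at-a-time idea, assuming the list lives in a files.txt (one path per line) and progress is remembered in a progress.txt between runs; both names are illustrative:

    <?php
    // Sketch: process the list a piece at a time, remembering progress between runs.
    $files     = file('files.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $batchSize = 500;
    $offset    = is_file('progress.txt') ? (int) file_get_contents('progress.txt') : 0;

    foreach (array_slice($files, $offset, $batchSize) as $name) {
        // do stuff to one image
        echo "$name processed\n";
    }

    // record where this run stopped so the next run continues from here
    file_put_contents('progress.txt', min($offset + $batchSize, count($files)));

Each run stays well under the limit, and re-running the script picks up where the last run stopped.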