Hi everyone, this is just a noob question: can PHP cURL run asynchronously, or in the background? I'm developing a script for a WordPress plugin that fires when a post is published. On publish, it calls another script with cURL, but that other script takes a very long time to finish, and WordPress waits for it. So I need some advice from you guys on how to run it in the background, so the WordPress process won't wait for the cURL process to finish. Thanks, everyone.
The only reasonable way I found to solve that problem (when I had it, a while back) was the following:

1. Create a .php file which does all your cURL work.

2. At the top of that file, add ignore_user_abort(true) so the script won't be terminated once you close the connection to it (see the next step).

3. Sorry, I don't have the exact code at hand, but you'll need to add some seven lines of code at the top of the file (after the ignore-abort call) which immediately terminate the connection to that script. For this step there was a big mess involving ob_start(), ob_flush() and some other ob_* (output buffer) functions.

4. In your main script (WordPress), simply call the cURL PHP script over HTTP, e.g. file_get_contents("http://www.mysite.com/curlscript.php"), or call it with cURL. It doesn't matter whether you call it with cURL or file_get_contents, but you have to call it through HTTP. Know that neither include nor file_get_contents("localfile.php") will work in this case; it just HAS to be called through HTTP.

So basically you create a script which does all the cURL work you want, but which also has the ability to terminate any connection to itself, and to keep doing whatever it was doing after the connection is terminated (because by default PHP scripts terminate as soon as the connection to them is closed). Again, sorry I don't have the exact code for step 3, but try searching for "php terminate http connection" or "php force output buffer" or things like that; it took me a while to find that bit of code.
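For reference, the steps above can be sketched roughly like this. The filename, endpoint URL, and the exact output-buffer sequence are illustrative; the details vary between PHP versions and SAPIs (fastcgi_finish_request() is only available under FPM, for instance):

```php
<?php
// curlscript.php -- sketch of the "remote" script described in steps 1-3.

// Step 2: keep running even after the caller disconnects.
ignore_user_abort(true);
set_time_limit(0);

// Step 3: send a minimal response and close the connection immediately.
ob_start();
echo 'OK';
header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();   // Strange behaviour: will not work
flush();          // unless both are called!

if (function_exists('fastcgi_finish_request')) {
    fastcgi_finish_request(); // PHP-FPM shortcut, if available
}

// From here on, the caller has already received its response
// and disconnected; the slow cURL work happens in the background.
$ch = curl_init('http://api.example.com/slow-endpoint'); // illustrative URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$result = curl_exec($ch);
curl_close($ch);
// ... process $result, write a log entry, etc.
```

Run under a web server, the caller sees the connection close right after "OK" is sent, while the script keeps working.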
Hi seofrall. Thank you very much for pointing me in the right direction. I finally managed to sort it out, and now my plugin is working great. Thanks. For anyone who wants to see how it works, I wrote an article on my blog: http://www.ivankristianto.com/web-d...p-process-upon-closed-client-connection/1438/
Judging by these two lines:

ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called!

you found exactly the few lines of code I was talking about. I remember those exact comments; they're in my code too.

However, I want to point out a potential bug in the code on your blog (and, accordingly, in your description). After testing my own code heavily, I found that this line:

curl_setopt($ch, CURLOPT_TIMEOUT, 1);

is a potential bug. The whole point of the output-buffer mess is that the REMOTE script should terminate the connection, not the caller (otherwise you could just use the timeout and skip the whole OB part in the remote script).

The problem with the caller timing out the connection is that sometimes (due to server overload, Apache configuration, you name it) 1 second, or theoretically any fixed amount of time, is not enough for the caller to connect to the remote script; in that case the remote script won't get executed at all. If you run a 100-request test, you'll find that sometimes it takes 0.01 s to connect and sometimes 2-3 s. The whole point is not to make the user wait 3 s just in case the connection is slow; instead, you want the remote script to close the connection as soon as it is accessed. If it took 0.01 s to connect, why make the user wait 1 or 2 s? And on the other hand, if it takes 2 s to connect, then you'll just have to wait.

What I did was set the timeout to a reasonable maximum (which, believe me, is greater than 1 second), say 3-5 s. I should also note that my remote script was not mandatory, so if it failed to run in a few cases, that was OK with me (it simply checked another website for updates and downloaded them if they existed, and those updates were meant for the NEXT visitor, not the current one, so you can see my remote script was not mandatory).

If you test it some 100 times, you'll be amazed how often it goes over 1 second; at least that was the case on my server (a shared hosting server). Removing the timeout entirely isn't a good idea either, because sometimes the connection can run past any reasonable limit and even end in a network error after some 30 s (don't ask me why, ask the server admins).

Bottom line: raise the timeout to some 3-5 s if your remote script is not absolutely mandatory, or remove the timeout altogether if your remote script is mandatory.

P.S. Thanks for mentioning me on your blog, it kinda feels cool, hehe.
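A sketch of the caller side with the relaxed timeout discussed above (the URL is the one from the earlier example, and 5 s is just one choice within the suggested 3-5 s range, not a hard rule):

```php
<?php
// Caller side -- e.g. inside the WordPress publish hook.

$ch = curl_init('http://www.mysite.com/curlscript.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);

// Give the remote script time to connect and close the connection
// itself; 1 s is often too tight on a loaded shared host.
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 5);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);

// Returns as soon as the remote script terminates the connection,
// or after 5 s at worst.
curl_exec($ch);
curl_close($ch);
```

If the remote work is mandatory rather than optional, drop the CURLOPT_TIMEOUT line entirely, as suggested above.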
@seoforall: Yes, you're logically right about the cURL timeout, especially on shared hosting: not every request is processed in under 1 second, even less so at peak traffic times. I haven't tested it 100 times, but when I looked into the Apache processes, you're right: some of the PHP requests took around 0.01-0.8 s, but at crazy-traffic times they took about 1-1.5 s. I think increasing it to 2 seconds is still good to go (remember this script fires when a user publishes a post in WordPress, and making them wait more than 2 seconds just to publish a post isn't good either). And luckily my script is optional. I may write another script to synchronize it regularly. PS: you deserve the credit, mate.