Hi, I want to generate a complete cache of web pages. I can use curl to grab pages, but that only reads the HTML part. I also need a copy of the images, CSS and other assets, with the relative links in the HTML rewritten to point at the local copies. Any suggestions, or any free scripts available for this? Thanks.
It's simple to create, and you can do it without curl: use file_get_contents to fetch the page, preg_match_all to pull out all the image, CSS and JS URLs, then fwrite to save the files.
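A minimal sketch of that approach, assuming double-quoted attributes and a flat cache directory ($url and $cacheDir are just example values):

```php
<?php
// Fetch one page, regex out asset URLs, and save each asset to disk.
$url      = 'http://example.com/page.html';
$cacheDir = __DIR__ . '/cache';

if (!is_dir($cacheDir)) {
    mkdir($cacheDir, 0755, true);
}

$html = file_get_contents($url);

// Grab src/href values from img, script and link tags.
// This simple pattern only handles double-quoted attributes.
preg_match_all('/<(?:img|script)[^>]+src="([^"]+)"|<link[^>]+href="([^"]+)"/i',
               $html, $m);
$assets = array_filter(array_merge($m[1], $m[2]));

foreach ($assets as $asset) {
    // Resolve relative URLs against the page's directory (naive).
    $absolute = preg_match('#^https?://#i', $asset)
        ? $asset
        : rtrim(dirname($url), '/') . '/' . ltrim($asset, '/');

    $data = file_get_contents($absolute);
    if ($data !== false) {
        $fh = fopen($cacheDir . '/' . basename($asset), 'wb');
        fwrite($fh, $data);
        fclose($fh);
    }
}

// Save the HTML itself too.
file_put_contents($cacheDir . '/index.html', $html);
```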
If this is for your own site, you can use something like APC to cache your PHP pages, which will greatly reduce the load on your server: http://www.php.net/apc
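APC is primarily an opcode cache (which speeds things up with no code changes), but its user cache can also hold rendered output. A rough sketch, assuming the APC extension is installed; the cache key and the 300-second TTL are just illustrative choices:

```php
<?php
// Cache the rendered page output in APC's user cache.
$key  = 'page:' . $_SERVER['REQUEST_URI'];
$html = apc_fetch($key, $hit);

if (!$hit) {
    // Cache miss: render the page into an output buffer.
    ob_start();
    echo "<html><body>Expensive page rendered at " . date('H:i:s') . "</body></html>";
    $html = ob_get_clean();
    apc_store($key, $html, 300); // keep for 5 minutes
}

echo $html;
```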
But there are many cases that arise, like:

```
href=""
href=''
href=
href=xxx.php
href=/xxx.php
```

It's the same with the img tags and the rest, so won't it become too complicated to find all the possibilities and replace them properly as well?
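One way to sidestep enumerating every quoting variant is to let an HTML parser do it: DOMDocument normalizes href="", href='' and unquoted href=x to the same attribute value, so you rewrite values instead of matching syntax. A rough sketch (the rewriteUrl() helper and the cache/ layout are illustrative assumptions, not a finished crawler):

```php
<?php
// Parse the HTML with DOMDocument and rewrite link/asset attributes,
// regardless of how they were quoted in the source markup.
$base = 'http://example.com/';
$html = file_get_contents($base);

// Turn a possibly-relative URL into a local cached filename (naive).
function rewriteUrl(string $url, string $base): string {
    if ($url === '' || preg_match('/^(#|mailto:|javascript:)/i', $url)) {
        return $url; // leave empty values, anchors and non-HTTP schemes alone
    }
    $absolute = preg_match('#^https?://#i', $url)
        ? $url
        : rtrim($base, '/') . '/' . ltrim($url, '/');
    $name = basename((string) parse_url($absolute, PHP_URL_PATH));
    return 'cache/' . ($name !== '' ? $name : 'index.html');
}

$doc = new DOMDocument();
libxml_use_internal_errors(true); // tolerate real-world broken HTML
$doc->loadHTML($html);

foreach (['a' => 'href', 'link' => 'href', 'img' => 'src', 'script' => 'src'] as $tag => $attr) {
    foreach ($doc->getElementsByTagName($tag) as $el) {
        if ($el->hasAttribute($attr)) {
            $el->setAttribute($attr, rewriteUrl($el->getAttribute($attr), $base));
        }
    }
}

if (!is_dir('cache')) {
    mkdir('cache');
}
file_put_contents('cache/index.html', $doc->saveHTML());
```

The trade-off is that DOMDocument re-serializes the whole document, so the output markup may differ slightly from the original, but you avoid writing one regex per attribute style.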