Please don't refer me to PEAR or cURL (or any other large libraries ;]). I'd like to know if anyone knows of, or has, a class that will cache a webpage and place it in a folder as needed. Thanks in advance.
I can code you a class that uses a TCP socket to connect, query, and cache at a specified location. It would be in the range of 30 to 45 lines of code. Would that work for you?
Lol, sure. I'm going to zip it and throw the link up. It's a total of 35 lines, not including comments on usage. I've tested it and it works on my Linux box with PHP 5.2.1. I'm not sure, but I think class syntax changes from 4 to 5. . . That, and sockets need to be turned on (I think on is the default. . .). Usage is simple:

PHP:
include('webCache.php');
$WC =& new webCacher();
$WC->cacheURL("http://www.google.com/", "test.html");

The page is then saved in the same directory as "test.html" (headers and all, though a simple function can easily be written to remove them if not wanted). If you change "test.html" to "../cachedPages/newCache.php", it will be saved in that directory under that file name. Simple enough, I think =] Let me know if there are problems: http://bubblecoder.com/webCache.zip
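For anyone who can't grab the zip, the socket approach described above can be sketched roughly like this. This is NOT the actual class from webCache.zip, just an illustration that follows the `webCacher`/`cacheURL()` usage shown; the `stripHeaders()` helper is the kind of "simple function to remove them" mentioned above, named hypothetically here.

```php
<?php
// Rough sketch of a socket-based page cacher, modeled on the usage above.
// Not the real webCache.zip code -- just an illustration of the idea.
class webCacher
{
    // Fetch $url over a raw TCP socket and save the full response to $file.
    public function cacheURL($url, $file)
    {
        $parts = parse_url($url);
        $host  = $parts['host'];
        $path  = isset($parts['path']) ? $parts['path'] : '/';

        $fp = fsockopen($host, 80, $errno, $errstr, 10);
        if (!$fp) {
            return false;
        }

        // Minimal HTTP/1.0 request so the server won't use chunked encoding.
        fwrite($fp, "GET $path HTTP/1.0\r\nHost: $host\r\nConnection: close\r\n\r\n");

        $response = '';
        while (!feof($fp)) {
            $response .= fread($fp, 8192);
        }
        fclose($fp);

        // Saved "headers and all", as noted above.
        return file_put_contents($file, $response) !== false;
    }
}

// The "simple function to remove headers" could look like this:
// everything after the first blank line is the body.
function stripHeaders($response)
{
    $pos = strpos($response, "\r\n\r\n");
    return $pos === false ? $response : substr($response, $pos + 4);
}
```

Note the `$WC =& new webCacher();` in the usage is PHP 4 style; on PHP 5 a plain `$WC = new webCacher();` does the same thing.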
Very simple way...

PHP:
function cache($site, $relative_location) {
    return file_put_contents($relative_location, file_get_contents($site), LOCK_EX);
}

You can use a simple if to check whether it already exists.
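Wrapping that one-liner in the existence check mentioned above might look like this (`cacheIfMissing` is just an illustrative name, not from the thread; since `file_get_contents()` also accepts local paths, this is easy to try offline):

```php
<?php
// Sketch: the file_get_contents() one-liner above, plus the "simple if"
// existence check. cacheIfMissing() is a hypothetical name.
function cacheIfMissing($site, $relative_location)
{
    // Skip the fetch entirely if a cached copy is already on disk.
    if (file_exists($relative_location)) {
        return true;
    }
    $data = file_get_contents($site);
    if ($data === false) {
        return false;
    }
    return file_put_contents($relative_location, $data, LOCK_EX) !== false;
}
```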
Heh, forgot all about file_get_contents(); my mind jumped straight to sockets xD Guess that's a more reasonable method than mine if you don't need to mod the headers =]
Thanks guys, but I probably should have been more specific: I'm looking for something that'll save the images as well, and if possible, follow any links on the page that go to images and save those too. I code myself, so file_put_contents has been tried. But I appreciate it anyhow :]. I could probably write something, but tbh, I really don't want to lol.
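For what it's worth, pulling the image URLs out of the fetched page isn't much extra code on top of either approach above. A rough sketch (`extractImageUrls` is a hypothetical helper; a real crawler would also need to resolve relative URLs against the page's base URL before fetching them):

```php
<?php
// Sketch: collect the src attributes of <img> tags from fetched HTML so
// each image can be saved alongside the page. Hypothetical helper, not
// from any of the classes above.
function extractImageUrls($html)
{
    $urls = array();
    if (preg_match_all('/<img[^>]+src=["\']([^"\']+)["\']/i', $html, $m)) {
        $urls = $m[1];
    }
    // Drop duplicates so each image is only fetched once.
    return array_unique($urls);
}

// Each returned URL could then be fed through the same
// file_get_contents()/file_put_contents() pair used for the page itself.
```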