Hello guys, if I wanted to copy a site's content on a daily basis and put it into a database, how could I do that? I mean copying other sites' content and storing it in a database. toby.
Make a script that does file_get_contents($url) and sticks the result in the database. Run the script on a cron every 24 hours. Or use wget with a shell script.
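A rough sketch of that idea, just to show the shape of it. The URL, database connection details, and the pages table with its url/content/fetched_at columns are all placeholders you would swap for your own:

<?php
// Fetch one page and stash the raw HTML in a table.
$url = 'http://www.example.com/';

$html = file_get_contents($url);
if ($html === false) {
    die("Could not fetch $url\n");
}

// PDO with MySQL is assumed here; use whatever DB layer you already have.
$db = new PDO('mysql:host=localhost;dbname=scraper', 'dbuser', 'dbpass');
$stmt = $db->prepare('INSERT INTO pages (url, content, fetched_at) VALUES (?, ?, NOW())');
$stmt->execute(array($url, $html));

Then a crontab entry along the lines of 0 3 * * * php /path/to/fetch.php would run it once a night.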
Thanks tops. The URL is dynamic, so how could I make it go and fetch the content of every URL presented on that site? Are we saying I should run a loop that goes into each URL from the main URL? Thanks for the heads up.
Yeah, just grab the page that contains the dynamic URLs. As long as they're in the same spot or have other unique characteristics, you should be able to extract them from the page and then grab each one. There are loads of pre-made scripts to do that for you: http://www.google.co.uk/search?hl=e...&ct=result&cd=1&q=php+extract+links+from+page
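In case it helps, here's one way that loop could look, using PHP's built-in DOMDocument to pull the links out. The starting URL is a placeholder, and a real script would also need to resolve relative links and go easy on the target site:

<?php
// Fetch the main page that lists the dynamic URLs.
$baseUrl = 'http://www.example.com/';
$html = file_get_contents($baseUrl);
if ($html === false) {
    die("Could not fetch $baseUrl\n");
}

// Parse the HTML and collect the href of every <a> tag.
$dom = new DOMDocument();
@$dom->loadHTML($html); // suppress warnings from sloppy markup
$links = array();
foreach ($dom->getElementsByTagName('a') as $a) {
    $href = $a->getAttribute('href');
    if ($href !== '') {
        $links[] = $href;
    }
}

// Visit each linked URL and grab its content too.
foreach (array_unique($links) as $link) {
    $content = file_get_contents($link);
    // ...insert $content into the database the same way as the earlier script
}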