I have a MySQL table with thousands of URLs. I want to loop through all the URLs (pages) and, from each page, extract a string that starts with XXX and ends with YYY, on a line that starts with ZZZ. I then want to store the page URL and the extracted string in a different MySQL table (same DB). The string itself is an image URL, so I want to fetch that image and store it on my FTP server. Any help will be greatly appreciated.
A few helpful pages:

- mysql_fetch_row (creating the MySQL loop)
- cURL manual (fetching the HTML pages)
- preg_match_all (matching the URLs)
- MySQL functions (the needed MySQL functions)

If you need more help, just ask.
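Putting those pieces together, here is a rough sketch of the whole loop. It uses the old `mysql_*` API mentioned above (in modern PHP you'd use mysqli or PDO), and the table/column names (`urls`, `url`, `images`, `page_url`, `img_url`) are assumptions, not something from your schema:

```php
<?php
// Extract the first "XXX...YYY" substring found on a line starting with "ZZZ".
// The /m modifier makes ^ match at the start of every line, not just the string.
function extract_image_url($html) {
    if (preg_match('/^ZZZ.*?(XXX.*?YYY)/m', $html, $m)) {
        return $m[1];
    }
    return null;
}

// Loop over every stored URL, fetch the page, and save any match.
$result = mysql_query("SELECT url FROM urls");
while ($row = mysql_fetch_assoc($result)) {
    // Fetch the page body with cURL instead of printing it.
    $ch = curl_init($row['url']);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    $html = curl_exec($ch);
    curl_close($ch);

    $img = extract_image_url($html);
    if ($img !== null) {
        mysql_query(sprintf(
            "INSERT INTO images (page_url, img_url) VALUES ('%s', '%s')",
            mysql_real_escape_string($row['url']),
            mysql_real_escape_string($img)
        ));
    }
}
```

If your real markers are longer strings, remember to run them through `preg_quote()` before building the pattern.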
Last step (I've got all the img links into MySQL). I also have a function, fetch_image("http://whatever.com/test.jpg") // it stores the img on my FTP. How do I pass the array fetched from MySQL into the function?
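You don't pass the whole result set at once: each call to `mysql_fetch_assoc()` returns one row as an associative array, so you call your function once per row with the column you need. A minimal sketch, assuming the `images` table and `img_url` column from earlier (both hypothetical names):

```php
<?php
// One fetch_image() call per row; $row is an associative array keyed by column name.
$result = mysql_query("SELECT img_url FROM images");
while ($row = mysql_fetch_assoc($result)) {
    // Pass the single column, not the whole $row array.
    fetch_image($row['img_url']);
}
```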
Never mind, got that to work as well. If anyone can point me in the right direction on how to check whether an image exists in a folder and its size is > 0 bytes, and then update the DB with image_fetched=1, I'll be a happy camper.
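For that last check, `file_exists()` plus `filesize()` covers both conditions. A sketch, again assuming the hypothetical `images` table (with an `id` column and an `image_fetched` flag) and a made-up local folder path:

```php
<?php
// True when the file exists locally and is larger than 0 bytes.
function image_is_valid($path) {
    clearstatcache(); // filesize() results are cached; clear so we see the current size
    return file_exists($path) && filesize($path) > 0;
}

$result = mysql_query("SELECT id, img_url FROM images WHERE image_fetched = 0");
while ($row = mysql_fetch_assoc($result)) {
    // Assumed layout: images saved under one folder, named after the URL's basename.
    $local = '/path/to/images/' . basename($row['img_url']);
    if (image_is_valid($local)) {
        mysql_query("UPDATE images SET image_fetched = 1 WHERE id = " . (int) $row['id']);
    }
}
```

The `clearstatcache()` call matters if you check a file you've just written in the same script run, since PHP caches stat results per path.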