I want to know how to crawl URLs from one of my websites using PHP. Are there any pre-defined functions/methods to collect the links for my link-collection database? If you know of any freeware packages built with PHP that can do this, please let me know.
Look into cURL. It can fetch web pages for you, and then you can use PHP to parse the HTML for any links in it. For the parsing you can use regular expressions.
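A minimal sketch of that approach, assuming the PHP cURL extension is enabled; the URL below is only a placeholder:

    <?php
    // Fetch a page with cURL; replace the placeholder URL with your own.
    $url = 'http://www.example.com/';

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
    $html = curl_exec($ch);
    curl_close($ch);

    if ($html !== false) {
        // Crude link extraction: grab the value of every double-quoted href attribute.
        preg_match_all('/href\s*=\s*"([^"]+)"/i', $html, $matches);
        print_r($matches[1]);
    }

Keep in mind the regex only catches double-quoted href attributes; it is enough to get a link list started, not a full HTML parser.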
Sphider offers a search engine that you can easily set up to crawl a site and at least extract the keywords and URLs. You can download it at http://www.sphider.eu. There is also a way to use Sphider to crawl a password-protected site (http://www.phpcodester.com/2011/04/using-sphider-to-crawl-a-password-protected-site) if you are interested in doing that.
Use cURL to download the page and return it as a string. Then use something like preg_match_all('@href\s*=\s*"([^"]+)"@i', $html, $matches) to parse the page and pull out the links. (In a single-quoted PHP string the double quotes don't need escaping, and \s* handles any whitespace around the =.) You get the idea.
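A rough sketch of just the parsing step, assuming $html already holds the page source (fetched here with file_get_contents() for brevity; the cURL result from the other answer works the same way, and the URL is only a placeholder):

    <?php
    // Assume $html holds the downloaded page; the URL is only an example.
    $html = file_get_contents('http://www.example.com/');

    // Match every double-quoted href attribute; $matches[1] holds the URL values.
    preg_match_all('@href\s*=\s*"([^"]+)"@i', $html, $matches);

    foreach ($matches[1] as $link) {
        echo $link, PHP_EOL;
    }

This still misses single-quoted or unquoted href attributes, so treat it as a starting point rather than a complete crawler.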