Is it possible to create a PHP script that will pull selected portions out of other websites or RSS feeds and display them on another website, keeping them updated? For example, could it grab the up-to-date info in the middle of this page: http://www.bchighway.com/highway-closures-news-feed and just display that info in a different form on another site?
That's possible.

If it's an RSS/XML feed you're trying to scrape, you can use the simplexml_*() functions or the DOMDocument class (see the first sketch below).

If it's a website/page in general you're trying to scrape (see the second sketch below):
1. First grab the contents of the remote page using cURL or file_get_contents() (there are a few other options, such as fsockopen(), but those two are the most popular).
2. Then extract the specific content with a regex using the appropriate preg_*() function (again there are other options, such as DOM parsing or explode(), but regex is the most common).
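Here's a minimal sketch of the RSS/XML route, assuming the feed at that URL is a standard RSS 2.0 document with `channel->item` elements; adjust the element names to whatever the real feed uses:

```php
<?php
// Load a remote RSS feed and print each item's title and description.
$feedUrl = 'http://www.bchighway.com/highway-closures-news-feed';

$xml = simplexml_load_file($feedUrl);
if ($xml === false) {
    die('Could not load or parse the feed.');
}

foreach ($xml->channel->item as $item) {
    // Re-display the feed content in whatever markup your own site needs.
    echo '<h3>' . htmlspecialchars((string) $item->title) . '</h3>';
    echo '<p>'  . htmlspecialchars((string) $item->description) . '</p>';
}
```

Because this fetches the feed every time the script runs, the output stays as current as the source; if the remote site is slow, consider caching the result for a few minutes.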
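And here's a sketch of the two-step page-scraping route (fetch with cURL, extract with preg_match()). The `<div id="closures">` wrapper in the regex is purely an assumption for illustration; inspect the real page's markup and adjust the pattern (or switch to DOMDocument) accordingly:

```php
<?php
$url = 'http://www.bchighway.com/highway-closures-news-feed';

// Step 1: grab the remote page with cURL.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);

if ($html === false) {
    die('Could not fetch the remote page.');
}

// Step 2: pull out the chunk you care about with a regex.
// The id "closures" is a placeholder -- match it to the actual page markup.
if (preg_match('#<div id="closures">(.*?)</div>#si', $html, $matches)) {
    echo $matches[1]; // re-wrap or re-style this however your site needs
} else {
    echo 'Target section not found.';
}
```

If the target markup is messy or changes often, parsing with DOMDocument::loadHTML() tends to be more robust than a regex, at the cost of a little more code.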