Hello, I want a program where I enter a website address and the program trawls all the pages on the site and creates a data feed. Would this be possible? I would use it to trawl a shopping site and would want details back like name, page URL, text, price, and image URL. Thanks, Mally
This would be different coding for each site, I guess, as not all sites use the same structure, so you will need to say which site it is that needs to be crawled and saved.
Agh, I was hoping to get a program that would do them all. How about if the program looked for certain fields? I.e., I would check the website myself, find out what reference the image link uses, and add that to the program; the same for the description, price, etc.
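The idea above (one generic crawler that you feed per-site field references) could be sketched roughly like this in Python, using only the standard library. All the tag and class names here (`product-title`, `price`, `product-image`) are made up for illustration; you would inspect each shop's actual HTML and fill the config in accordingly.

```python
from html.parser import HTMLParser

# Hypothetical per-site config: field name -> (tag, class attribute).
# These selectors are invented for this sketch; a real site's markup
# would have to be inspected first, as described in the post above.
SITE_CONFIG = {
    "name":  ("h1", "product-title"),
    "price": ("span", "price"),
}

class ProductParser(HTMLParser):
    """Collects text from tags whose tag/class match the site config."""
    def __init__(self, config):
        super().__init__()
        self.config = config
        self.fields = {}
        self._current = None  # name of the field being captured, if any

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Image URL comes from an attribute, not tag text.
        if tag == "img" and attrs.get("class") == "product-image":
            self.fields["image_url"] = attrs.get("src", "")
        for field, (cfg_tag, cfg_class) in self.config.items():
            if tag == cfg_tag and attrs.get("class") == cfg_class:
                self._current = field

    def handle_data(self, data):
        if self._current and data.strip():
            self.fields[self._current] = data.strip()

    def handle_endtag(self, tag):
        self._current = None

# Sample snippet standing in for a fetched product page (also made up).
sample = """
<h1 class="product-title">Blue Widget</h1>
<span class="price">9.99</span>
<img class="product-image" src="/img/widget.jpg">
"""

parser = ProductParser(SITE_CONFIG)
parser.feed(sample)
print(parser.fields)
# -> {'name': 'Blue Widget', 'price': '9.99', 'image_url': '/img/widget.jpg'}
```

So the crawler itself stays generic, and only the small `SITE_CONFIG` table changes per shop. In practice you would also need to fetch pages and follow links, and many shops load prices via JavaScript, where this simple approach would not see the data at all.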
I have a tool that will pull products from eBay. Here's the tool: http://taskbarscripts.com/clients/glenn/ It can only be used to grab eBay products. Let me know if you're interested.
Hi, I've done this type of application before. You won't be able to use a generic tool; you will need a specialized tool written to handle each site, so all the sites would need to be included upfront, or additional work would be required later. I know how to write these types of tools and have done so since the mid-'90s. PM me if you are interested, with the sites you need and how much you are paying.