Hi, I need a professional, reliable custom scraper built for a particular real estate website. Ideally it would be something I can run from my desktop, but alternatives will be considered. The scraper needs to import thousands of URLs for that website from a text/CSV/TSV file, extract the same set of fields and images for each URL, and export the results to an Excel file or database. All URLs will be in the same format. The scraper should work something like "Web Content Extractor", with the flexibility to change fields in case the website's layout changes.

These are the fields to be extracted:

1. Price
2. Address
3. Number of bedrooms
4. Property description
5. Agent name
6. Agent telephone number
7. Agent address

The scraper should also download any images for each property to a folder created on the fly, named after that URL's "code", so that there is no confusion over images. For instance, if URL no. 550 has code "12345", then any and all images for that URL should be downloaded to a folder named "12345". In other words, the folder name matches the code listed next to the URL in the Excel (or CSV/TSV) file that the scraper imports, and all thumbnails for that specific property go into this code-named folder.

It would really help if you have done this sort of thing before, as I need this done as soon as possible. It would also be helpful if I could customize this scraper for other websites. Developers who can show a demo of an actual working program will be given preference. Thank you.
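To make the requirements concrete for bidders, here is a minimal sketch of the intended workflow in Python. Everything in it is an assumption for illustration only: the CSV column names (`code`, `url`), the CSS-selector map, and the image-naming scheme are all placeholders that the developer would replace with whatever fits the actual site.

```python
import csv
import pathlib
import urllib.request

# Hypothetical field-to-selector map. Keeping this in one editable place is
# what gives the "change fields if the site's layout changes" flexibility.
FIELD_SELECTORS = {
    "price": ".price",
    "address": ".address",
    "bedrooms": ".bedrooms",
    "description": ".property-description",
    "agent_name": ".agent-name",
    "agent_phone": ".agent-phone",
    "agent_address": ".agent-address",
}

def read_url_list(path):
    """Read (code, url) pairs from a CSV assumed to have 'code' and 'url' columns."""
    with open(path, newline="", encoding="utf-8") as f:
        return [(row["code"], row["url"]) for row in csv.DictReader(f)]

def image_folder(base_dir, code):
    """Create (on the fly) and return the folder named after the listing code."""
    folder = pathlib.Path(base_dir) / code
    folder.mkdir(parents=True, exist_ok=True)
    return folder

def download_images(image_urls, base_dir, code):
    """Save every image for one listing into its code-named folder."""
    folder = image_folder(base_dir, code)
    for i, url in enumerate(image_urls):
        # Sequential filenames; real code would derive the extension from the URL.
        urllib.request.urlretrieve(url, folder / f"image_{i}.jpg")
```

A full implementation would add the actual HTML fetching and parsing (e.g. with an HTML parser applying `FIELD_SELECTORS` per page) and an export step writing one row per URL to CSV or Excel, but the code-named folder behaviour above is the part that matters most to me.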