Hypothetically, if you had a site that scraped around a thousand pages a day, could a basic hosting plan handle it? I have no idea how severe the load is on a shared server.
Probably not. You'd need to tell the script not to time out, and since it would naturally use as much of the system's resources as it could (to get the job done fast), you'd most likely get kicked off. I'd suggest getting your own VPS, or just running the scraper script on a localhost server and then uploading the database/XML data (whatever storage method you use) to your engine.
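For what it's worth, here's a rough sketch of the "run it locally, throttled" approach in Python (using requests and a local SQLite file; the URL list, delay, and DB name are just placeholders, not anything from the original setup):

```python
import time
import sqlite3
import requests

DELAY_SECONDS = 5  # pause between requests to keep the load gentle on both ends
URLS = ["https://example.com/page/%d" % i for i in range(1, 1001)]  # placeholder list

def scrape_locally(db_path="scraped.db"):
    """Fetch each URL with a delay and store the raw HTML in a local SQLite DB."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)")
    for url in URLS:
        try:
            resp = requests.get(url, timeout=30)
            resp.raise_for_status()
            conn.execute(
                "INSERT OR REPLACE INTO pages (url, html) VALUES (?, ?)",
                (url, resp.text),
            )
            conn.commit()
        except requests.RequestException as exc:
            print("skipping %s: %s" % (url, exc))
        time.sleep(DELAY_SECONDS)  # throttle so the scrape spreads out over time
    conn.close()

if __name__ == "__main__":
    scrape_locally()
```

Once the SQLite file (or an XML dump) is built locally, you just upload it to the host, so the shared server never has to run the scraper at all.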
I think you should be OK, although it really depends on what you do with the data once you've scraped it. If you're putting it into a DB, there will be lots of DB writes, which will probably throw up a red flag. Scraping itself isn't very intensive.
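If the DB writes are the worry, one way to soften that is batching the inserts so you commit once per chunk instead of once per row. A minimal sketch, again with SQLite and a made-up pages table:

```python
import sqlite3

BATCH_SIZE = 200  # commit once per 200 rows instead of once per row

def save_in_batches(rows, db_path="scraped.db"):
    """Insert (url, html) tuples in chunks so the DB sees far fewer commits."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)")
    for start in range(0, len(rows), BATCH_SIZE):
        chunk = rows[start:start + BATCH_SIZE]
        conn.executemany(
            "INSERT OR REPLACE INTO pages (url, html) VALUES (?, ?)", chunk
        )
        conn.commit()  # one commit per chunk keeps the write overhead down
    conn.close()
```

The same idea applies to MySQL on a shared host: fewer, larger transactions look a lot less alarming than a thousand individual writes.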