If the spider only loads text, it will use bandwidth roughly equal to the size of the text plus the page's HTML markup. 500 characters at 32 bits per character (e.g. UTF-32) would be 16,000 bits = 2,000 bytes, about 1.95 KB for those 500 characters; most pages are served as UTF-8, where plain ASCII text is closer to one byte per character. If it loads images too, you need to take those into account. In my experience spiders don't use much bandwidth, and if a specific one is using a lot you can block it in your robots.txt file (well-behaved bots will honor it).
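For example, a minimal robots.txt rule to block one crawler might look like the sketch below. "HungryBot" is a hypothetical user-agent name; substitute whatever name shows up in your access logs.

    # Block a single bandwidth-heavy crawler by its user-agent.
    # "HungryBot" is a placeholder; check your logs for the real name.
    User-agent: HungryBot
    Disallow: /

    # Everyone else may crawl normally.
    User-agent: *
    Disallow: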
Depends on the spider. A good rule of thumb is to treat a spider as a visitor that comes to the site as often as every day & asks for an updated copy of anything that has changed since the last visit, checks whether you have anything new, & pokes at anything it might find that it shouldn't look at. Figure out how much bandwidth a visitor doing that would use & you have a good estimate for a single spider.
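As a rough sketch of that estimate (all page counts and sizes below are made-up assumptions, not measurements; plug in your own numbers):

    # Back-of-envelope estimate of daily bandwidth for one well-behaved spider.
    # Every number here is an illustrative assumption.
    avg_page_size_kb = 30       # average HTML page size, assumed
    pages_changed_per_day = 20  # pages the spider re-fetches, assumed
    new_pages_per_day = 5       # newly discovered pages, assumed
    overhead_kb = 2             # robots.txt, sitemap, HTTP headers, assumed

    daily_kb = (pages_changed_per_day + new_pages_per_day) * avg_page_size_kb + overhead_kb
    print(f"~{daily_kb} KB/day per spider, ~{daily_kb * 30 / 1024:.1f} MB/month")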
Sometimes spiders take a large share of your bandwidth. A few of my websites see around 200+ MB of spider traffic out of roughly 2 GB total consumed, while a few others see less than 10 MB even though their total usage is far higher than that.
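If you want to measure this on your own site, a minimal sketch like the one below totals bytes sent per user-agent from an Apache/Nginx combined-format access log; the log path and the format assumption are mine, so adjust to your setup. Spiders usually stand out near the top of the output.

    import re
    from collections import Counter

    # Matches the bytes-sent and user-agent fields of a combined-format log line.
    # Assumes the standard combined log format; adjust the pattern if yours differs.
    LINE = re.compile(r'\S+ \S+ \S+ \[[^\]]+\] "[^"]*" \d+ (\d+|-) "[^"]*" "([^"]*)"')

    totals = Counter()
    with open("access.log") as f:          # log path is an assumption
        for line in f:
            m = LINE.search(line)
            if not m:
                continue
            size, agent = m.groups()
            if size != "-":                # "-" means no body was sent
                totals[agent] += int(size)

    # Print the ten heaviest user-agents by total bytes transferred.
    for agent, total in totals.most_common(10):
        print(f"{total / 1_048_576:8.1f} MB  {agent[:60]}")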