It uses some... How much really depends on how many sites you're hosting and how well 'linked' they are (that is, how easy you make it for Googlebot to find its way through your site or sites).
There is no doubt that Google (and every other search engine) does use some bandwidth, but the usage is negligible. Furthermore, what you get from Google (and maybe the other SEs) is worth it. But if you think they visit a lot more than they should, you can edit your robots.txt file. Look here: http://www.google.com/webmasters/faq.html
FFMG
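For example, if a crawler is spending bandwidth on sections you don't need indexed, a minimal robots.txt in the web root could look something like this (the /images/ and /cgi-bin/ paths are just placeholders, not anything from this thread, and only well-behaved crawlers will obey it):

    # robots.txt - lives at the root, e.g. http://www.yoursite.com/robots.txt
    # Rules for Google's crawler only
    User-agent: Googlebot
    Disallow: /images/
    Disallow: /cgi-bin/

    # Rules for every other crawler
    User-agent: *
    Disallow: /private/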
Yes - if he doesn't, we won't see the question otherwise. To answer the question above: yes, it uses bandwidth just the same as any other file request does, but the amount it uses is VERY, VERY, VERY small.
Not sure if that is entirely true. Google will use the same bandwidth as any other visitor, so why would it be any smaller? They even request images.
FFMG
As FFMG said, they use bandwidth the same as any other visitor; the only difference is that they go through (hopefully) all your pages to find new stuff.
Rereading the thread to be clear on the subject: GoogleBot doesn't use less bandwidth per file, but it does use a caching technique so that an unchanged file isn't downloaded again on every crawl. I average about 50 MB of bandwidth usage for my robots overall (including MSN, Yahoo!, GoogleBot, etc.). I was thinking earlier that you were talking about robots.txt, not robots in general. The usage from robots is small in my opinion, compared to the XX GB being used by visitors through my sites.
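That "caching technique" is basically HTTP's conditional GET: the crawler remembers the Last-Modified date it saw and sends it back as If-Modified-Since, and if the page hasn't changed the server answers 304 Not Modified and transfers no body at all. A rough sketch of the idea (the URL is just a placeholder, and it assumes the server actually sends a Last-Modified header):

    import urllib.request
    from urllib.error import HTTPError

    url = "http://www.example.com/index.html"  # placeholder URL

    # First visit: download the page and remember its Last-Modified header.
    first = urllib.request.urlopen(url)
    last_modified = first.headers.get("Last-Modified")
    print(len(first.read()), "bytes downloaded, Last-Modified:", last_modified)

    # Next visit: ask "has this changed since then?" instead of re-downloading.
    req = urllib.request.Request(url, headers={"If-Modified-Since": last_modified})
    try:
        second = urllib.request.urlopen(req)
        print("Page changed,", len(second.read()), "bytes downloaded again")
    except HTTPError as err:
        if err.code == 304:
            print("304 Not Modified - headers only, no body transferred")
        else:
            raise

So a crawler that rechecks a page it already has mostly costs you a few hundred bytes of headers, not the whole file.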
Watch out for Jeeves/Teoma. This crawler started hitting my site last month and took 4 1/2 GB - ouch! I'm going to have to give it some rules to follow. In comparison, Google visits the site daily and only took a few hundred MB in the same period.
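One way to give it those rules (assuming the crawler honors robots.txt, and the Crawl-delay line in particular, which the Ask Jeeves/Teoma bot is supposed to respect but Google ignores) would be something like this; the user-agent token and the directory are guesses, so check your logs for the exact name the crawler reports:

    # Slow down the Ask Jeeves / Teoma crawler
    User-agent: Teoma
    Crawl-delay: 10        # ask it to wait ~10 seconds between requests
    Disallow: /downloads/  # hypothetical heavy directory to keep it out of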
Google would use the same as a visitor, but Google would visit more pages than a visitor because they crawl the whole site - well, most of it.
Also, Ask Jeeves isn't indexing new sites at the moment, so I don't really see the point in them spidering sites, but yes, they do use a lot for some reason.