Googlebot - [833,915 hits] [3.77 GB bandwidth] - 24 Sep 2007 - 20:19 This is what I mean. Googlebot has eaten up 3 GB of my bandwidth just this month. Ouch! I do have 4000 GB worth of traffic, but I think 3 GB is a bit excessive for it to use. I could see it using 1 GB during the month, but 3 GB!! How do I stop Google from using so much bandwidth without losing listings in Google search?
Umm, maybe by using Webmaster Tools? www.google.com/webmasters is very useful, and there's an option to choose how fast it crawls your site and stuff.
This is called stepping over dollars to pick up dimes. You're worried about 3 GB of traffic when you have 4 terabytes available? If you're having trouble justifying that amount of traffic and your income from the remaining traffic isn't making you a ton more, you need to rethink what you're doing.
Why not find a better server then? You obviously don't want Google to list your site if you're worried about bandwidth.
As Gamerz7 already pointed out, you can use Google Webmaster Tools (from your Google account) to decrease the crawling rate of your website. I personally think that this would NOT be a wise idea. Let Google index as much of your website as it can.
Thanks everyone, but I do not really want to lose listings. I think I will try to slow down the crawl. My big issue is that I think it is hitting the same web pages over and over again. The information on the pages is static and I don't know why it must do that. With 4000 GB of bandwidth I know I should not be worried about it, but I have nearly 200 sites that I host, and in case this starts to happen to each site, I would end up with 3 * 200 = 600 GB being used just for Google and not for my users. I am not stepping over dollars to get to dimes. I am just trying to watch my pennies. Every little bit helps. The site in question makes enough to support the entire hosting account anyway, plus there's the money from the other domains too. So, the best solution was just to slow down the crawl of each site. Thanks!!
You can add a crawl delay in your robots.txt file. This is the best option, as you have a lot more control over it than the Webmaster Tools "slow" option.
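A minimal robots.txt sketch of what that might look like (the 10-second value and the user-agent choice are just illustrations, tune them for your site, and note that not all crawlers honor Crawl-delay):

```
# Ask the named crawler to wait 10 seconds between requests
User-agent: Googlebot
Crawl-delay: 10

# Or apply a delay to all crawlers that respect the directive
User-agent: *
Crawl-delay: 5
```

The file goes in the root of each site (e.g. example.com/robots.txt), so with 200 hosted sites you would drop one of these into each docroot you want to throttle.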