As always I got up this morning, had a cup of tea, sat out for a bit and chilled. I came online to check my stats (Urchin) for my site and noticed my bandwidth usage for today was already up to 130 MB. Odd, I thought, as I normally only use 20-35 MB a day. I sent off a support ticket to my host (stats are updated at 4 in the morning, so I had no stats for this usage) and it turns out Googlebot has been stuck in a loop since 7 this morning and has used up a stupid amount of bandwidth! Anyone else ever had this problem before? JP edit: stats are: Mozilla/5.0+(compatible;+Googlebot/2.1;++http://www.google.com/bot.html) HITS: 6,707. Just glad I have 25 GB of bandwidth per month.
Maybe they are just doing a deep crawl. 100 MB isn't that much for the return in traffic if you have good content indexed.
Is it actually in a loop (i.e. requesting the same page again and again), or is it requesting different pages? Is it a dodgy script on your site whose variables Google is trying to index every combination of?
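One way to tell a genuine loop from a deep crawl is to count Googlebot's per-URL hits in the raw access log. A minimal sketch in Python, assuming an Apache/Nginx combined-format log (the sample lines and URLs here are made up for illustration):

```python
# Count Googlebot requests per URL; a loop shows one URL dominating,
# while a deep crawl shows many different URLs with low counts.
from collections import Counter
import re

# In practice you would read these lines from your access log file.
sample_log = [
    '66.249.66.1 - - [12/Jun/2005:07:01:02 +0000] "GET /page.php?id=1 HTTP/1.1" 200 6100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [12/Jun/2005:07:01:05 +0000] "GET /page.php?id=1 HTTP/1.1" 200 6100 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '10.0.0.5 - - [12/Jun/2005:07:01:07 +0000] "GET /index.html HTTP/1.1" 200 4000 "-" "Mozilla/4.0"',
]

hits = Counter()
for line in sample_log:
    if "Googlebot" not in line:   # filter on the user-agent string
        continue
    m = re.search(r'"(?:GET|POST|HEAD) (\S+)', line)  # pull the request path
    if m:
        hits[m.group(1)] += 1

top = hits.most_common(1)[0]
print(top)  # the most-hit URL and its count
```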
According to my host it's just stuck in a loop - after I sent them the initial e-mail they said "we'll sort it out". It shouldn't be a dodgy script on my site, as I don't really have any... plus the site's been up and running for the past 9 months and this is the first time it's happened (though Googlebot has been on plenty of times).
My host doesn't count non-viewed traffic as bandwidth, and this includes spiders... so I love it when Google takes 300 MB in a couple of days. It's interesting, because I don't have any looping scripts.
I also had loads of Googlebot visits today. Actually, today alone Google visited more pages than it did during the whole of last month.
I created that Google Sitemap thing, and a few days later Googlebot came by, 2 days in a row: 25k pages per day (avg size = 60 KB). You do the math... okay, I will: 25,000 * 60,000 ~= 1.5 GB in a day! Google must love me (excluding the /robots.txt fetches, haha).
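The back-of-envelope figure above is easy to check (using 1 KB = 1024 bytes, which shaves the total down slightly from the 1.5 GB quoted):

```python
# Rough bandwidth estimate: 25,000 pages/day at ~60 KB per page.
pages_per_day = 25_000
avg_page_bytes = 60 * 1024   # ~60 KB

total_gb = pages_per_day * avg_page_bytes / 1024**3
print(round(total_gb, 2))    # roughly 1.4 GB per day
```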
I think one of the goals of the sitemap is to prevent exactly this kind of complaint - the bots know which pages to focus on and don't re-index old pages as often. JP, have you got one? Sarah
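For anyone who hasn't made one yet: a sitemap is just a small XML file listing your URLs, with optional hints about how often each page changes. A minimal sketch (the example.com URL, date, and hint values are placeholders - swap in your own):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-06-12</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You put the file on your server and submit its URL to Google; the `changefreq` hint is what's supposed to stop the bot re-fetching pages that never change.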
Yes, I have a site that previously served up a total of less than 1 GB and has now given Googlebot almost 11 GB in the first 7 days of the month. The biggest issue I have is that if they are going to take this much bandwidth, they sure as heck had better index the pages of my site.
Who's "them"? Google? If not, it sounds like a bug in the bot. I would shoot Google an email as well. J.D.
They are taking about 1.5 GB per day, it seems. I've got a dedicated server, so I'm fine for now, but I've never seen anything quite like this. For example, in the last 5 minutes they have hit approximately 300 pages, and it is like this almost all the time. I only have about 10,000 pages on my site.
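Working those figures backwards (a rough sketch, assuming the rate stays sustained around the clock):

```python
# ~300 requests in 5 minutes, ~1.5 GB per day (figures from the post above).
reqs = 300
secs = 5 * 60
rate = reqs / secs            # sustained requests per second
print(rate)                   # 1.0 req/s

daily_bytes = 1.5 * 1024**3   # ~1.5 GB/day
per_req_kb = daily_bytes / (rate * 86_400) / 1024
print(round(per_req_kb, 1))   # implied average response size in KB
```

At 1 request per second that works out to roughly 18 KB per response, and it also means the bot could cover a 10,000-page site more than eight times over in a single day - which would fit the "re-crawling the same pages" picture.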