On one of my sites, I just noticed that between midnight on the 22nd and 7pm on the 24th, Gbot hit 32,003 times. That's an average of 477 hits/hour for 67 hours straight, which is much higher than normal for my site. Oftentimes such activity is the herald of some sort of update or rollout. Anyone else with a site large enough to notice see anything...? -Michael
Google is always crawling. It crawls over 30,000 pages per day on our site. I actually noticed that it is when it takes a *break* and stops for 20-30 minutes that the updates actually happen and new pages are added to the index.
My main site, which is PR 5, is getting more Googlebot hits than last week. I have noticed the same thing.
I've got a site with about 5,000 pages (not a forum), but nothing on par with what you must have for that many hits. How many times did G hit the same page? And are you including subdomains? No matter how many pages are actually on your site, 477 hits per hour seems like a lot of spidering. But maybe it's normal for large, high-PR sites?
You're very lucky indeed. Google seems to have been ignoring my site for 2 weeks now, since I last updated it. There's a robots.txt and a sitemap.xml, but I haven't seen any changes yet.
There are 100,004 pages on the www version of the domain, plus 5 subdomains; I was just talking about the main domain. It's not that high a PR, which is why, yes, this is an unusual amount of activity. It does happen several times a year, usually before some sort of update, but you can never tell exactly what it is that is going to get updated. However, along with this spidering I have also dipped significantly in non-supplemental pages compared to what I had recently. I am down to about 10% of my pages being non-supp on the main domain, whereas yesterday it was 42%. I am seriously hoping this means the update in the works is a data refresh, and the missing pages will be replaced with fresh copies. -Michael
I have also seen a rise in Googlebot's activity since last week. My hits have gone up almost 4-5 times (from 2,000 a day to almost 10,000 a day).
I have one site that has 120k dynamic pages and another that has about 10k; both are driven from datafeeds, and yes, these last 3-4 days I have seen substantially increased activity. Two other sites that have just static pages, under 300 pages each, have not seen much of an increase. Those sites are just affiliate links, not built from datafeeds. All the sites have about the same PR, 3-4.
I've noticed the increased crawling, but I'm not sure whether it's because of increased links or what.
I've seen an increase. For my sites which change less frequently I noticed only a small difference. But for my more active sites I see a much bigger increase.
I've noticed it as well; one site in particular had 12 MB consumed by Google overnight. This is a PR0 site that normally gets about 600 KB every 30 hours, so yes, definitely a deep crawl. I'm not seeing any change in SERPs or indexed pages from it yet.
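If anyone wants to check their own bandwidth numbers, here's a minimal sketch of totaling the bytes Googlebot pulled, assuming an Apache combined-format access log (the file name `access.log` and the sample lines are made up for illustration; field 10 holds the response size on most setups, but check your own log layout):

```shell
# Fake sample log for demonstration -- normally you'd point at your real log.
cat > access.log <<'EOF'
66.249.66.1 - - [24/Oct/2006:03:01:00 +0000] "GET /a.html HTTP/1.1" 200 4096 "-" "Googlebot/2.1"
66.249.66.1 - - [24/Oct/2006:03:05:00 +0000] "GET /b.html HTTP/1.1" 200 2048 "-" "Googlebot/2.1"
10.0.0.9 - - [24/Oct/2006:03:06:00 +0000] "GET /a.html HTTP/1.1" 200 4096 "-" "Mozilla/5.0"
EOF

# Sum field 10 (bytes sent) across lines mentioning Googlebot.
awk '/Googlebot/ {bytes += $10} END {printf "Googlebot consumed %.1f KB\n", bytes/1024}' access.log
```

Run it against a whole night's log and you can compare the total directly against your normal baseline.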
I noticed that my Googlebot activity was way down, then it went nuts for about two days, and now it seems to be back to normal. They must have finally cleaned out their index!
I'm still hoping they will push out and connect the dots on all of those new caches. My non-supps have dropped like a rock, from 42k pages down to 6k on the main domain. Rankings are all the same, but there are some auxiliary, lesser-searched terms that I get traffic for when more pages are non-supp. Not that it's a huge difference, but the upward trend it was on before this was nice, in theory anyway. -Michael
Some advice, please? I use StatCounter for my statistics. Can anyone advise me where in StatCounter I can track Googlebot activity? Thanks for any help.
I think StatCounter uses image or script tracking, if I'm not mistaken, so it won't record bot traffic. You need to access your raw log files, or whatever log/traffic analysis your host provides, in order to see what Gbot (or any bot) traffic looks like. -Michael
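To make that concrete, here's a minimal sketch of pulling Googlebot hits out of a raw log, assuming the common Apache combined format (the file name `access.log` and the sample entries are invented for illustration; adapt to whatever your host gives you):

```shell
# Fake sample log for demonstration -- substitute your real log file.
cat > access.log <<'EOF'
66.249.66.1 - - [22/Oct/2006:00:15:02 +0000] "GET /page1.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [22/Oct/2006:00:47:10 +0000] "GET /page2.html HTTP/1.1" 200 3072 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
10.0.0.5 - - [22/Oct/2006:01:02:33 +0000] "GET /page1.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows)"
66.249.66.1 - - [22/Oct/2006:01:30:00 +0000] "GET /page3.html HTTP/1.1" 200 2048 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
EOF

# Total Googlebot requests in the log.
grep -c 'Googlebot' access.log

# Hits per hour: the timestamp sits between [ and ], and its first 14
# characters are day/month/year:hour.
grep 'Googlebot' access.log | awk -F'[][]' '{print substr($2, 1, 14)}' | sort | uniq -c
```

The per-hour breakdown is what makes a deep crawl obvious at a glance, the way Michael spotted his 477 hits/hour spike.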