Hi. Google visits my site many times a day, but only caches it every two days or so. It used to cache me every hour a few months back. I remember that I was PR5 and that in Webmaster Tools I had it set to "High Crawl Rate" or whatever; then I dropped to PR4 and that option disappeared (they just show the normal crawl rate). Now I'm back to PR5 and I wonder why the "High Crawl Rate" option is still unavailable for me. Any ideas, mates? Thanks
It is set by Google. My PR was 3 and I could use this feature, so don't go by PR. Try to post daily and create a few more backlinks.
Well, I'm not sure about this... however, in the world of SEO, sometimes you are up and sometimes you are down.
The option to set a fast crawl rate in Webmaster Tools is only sometimes available. I have had the option available twice in the last one and a half years. When it is set to the fast crawl rate, it stays at that setting for 90 days (I think) and then automatically goes back to normal speed. Then the fast crawl rate option is disabled again until Google decides to re-enable it (why they re-enable it, I do not know).

From what I understand, the fast crawl rate option does not increase the number of Googlebot visits to your site; it only increases the speed at which it crawls your site (how much load it puts on the server). So it will not make Googlebot come more often, or stay any longer, but it will increase the amount of crawling or work that is done while it is there, at the cost of more CPU and bandwidth.

That is how I understand it to work, but I may be wrong. If anyone has any other info or thoughts on this, please jump in.

EDIT: I have had it enabled for sites that have PR 0, so I don't think it has anything to do with PR.

Cheers
James
It only becomes available if Google thinks it cannot crawl your site fast enough for its current growth. If your site isn't growing that fast, or doesn't have much content added frequently, then there is no need for a higher crawl rate, and Google knows that as well.
Hi,

Besides the growth of your site, your server's performance is very important as well. If your server has speed problems and it takes Googlebot too much time to download a page, Google will set the crawl speed slower and disable the faster option for you. Besides that, you may see a message along the lines of: "Googlebot has detected that your server has speed problems...bla...bla." This does not affect your PageRank, but it does affect your visitor count: because of the speed problem, Google sends fewer visitors as well, so as not to overload your server.

This is my experience, not fiction. So keep your server fast, and if you are on a shared host, make sure the server is not overloaded, because this can do big damage to you. Consider optimizing your pages and code for faster loading as well, or moving to a faster host.

In Google Webmaster Tools, go to Statistics -> Crawl stats, where you can see the average page load time. The statistic is updated every 10 days.

Cheers
"We've detected that Googlebot is limiting the rate at which it crawls pages on your site to ensure it doesn't use too much of your server's resources. If your server can handle additional Googlebot traffic, we recommend that you choose Faster below." If you have quite a bit of information on your site, Google will offer the faster crawl rate, as long as your resources are capable of handling it.
I think this is the philosophy of the Google crawler: Googlebot crawls sites according to PR, visitor traffic, site quality, and domain age. Thanks, Eric
I have a PR1 site and I have that feature in my Webmaster Tools control panel... I don't think it has anything to do with PR, really... You might have to create a robots.txt file.
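For anyone who hasn't made one yet, a minimal robots.txt is just a plain text file at the root of your site. Something like the sketch below (the sitemap URL is a placeholder for your own; note that Crawl-delay is honored by some bots but ignored by Googlebot, whose crawl rate is controlled from Webmaster Tools instead):

```
# robots.txt - place at http://www.example.com/robots.txt

User-agent: *
Disallow:
# Empty Disallow means all crawlers may fetch everything.

# Pointing bots at your sitemap can help them find new pages faster.
Sitemap: http://www.example.com/sitemap.xml
```

An empty robots.txt (or none at all) is treated the same as "allow everything," so this file mainly serves to advertise the sitemap and give you a place to add Disallow rules later.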