Hi, today when I used Google Webmaster Tools, I was surprised by its new version. In Settings, we can now easily control the Googlebot crawl rate. Last month, one of my sites was crawled by Googlebot 50K+ times per day, which was terrible. But now I can set the exact speed. And in the past I could not set a fast crawl rate for a new site; now I can... Very nice...
Yes, it's updated, and you can set the crawling speed according to how often you update your site, as well as when you make drastic changes to your website. It's really useful information for making your site rank higher.
There was already a feature to set the crawl speed before, wasn't there? I always set it to fast so that my updated pages get indexed quickly.
Yes, the feature is not new, but it has been updated. Before, you could only set fast, normal, or slow, and many sites, especially new sites, could not select the fast speed. Now you can set "requests per second" and "seconds between requests" for every site, which is more exact and useful. Q: "How do I set it?" A: Log in to your Google Webmaster Tools account, go to "Settings", find "Crawl rate", and choose "Set custom crawl rate". If you don't see these options, Google may not have updated your account yet. Google's tool updates don't reach all accounts at the same time; sometimes one of my accounts is updated while the others are not.
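For anyone wondering how the two custom fields relate: they describe the same rate from opposite directions, so they are simply reciprocals of each other. A quick sketch (function names are just mine, the field names are the ones shown in the Webmaster Tools UI):

```python
# "requests per second" and "seconds between requests" are reciprocals:
# setting one determines the other.

def seconds_between_requests(reqs_per_second: float) -> float:
    # e.g. 2 requests per second -> 0.5 seconds between requests
    return 1.0 / reqs_per_second

def requests_per_second(seconds_between: float) -> float:
    # e.g. 0.5 seconds between requests -> 2 requests per second
    return 1.0 / seconds_between

print(seconds_between_requests(2))   # 0.5
print(requests_per_second(0.5))      # 2.0
```

So whichever field you fill in, you're setting the same thing; pick whichever way of thinking about the rate feels more natural.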
Yeah, it's an interesting update... Too bad I sold my PR4 site; I wonder if I'd be getting "sitelinks" any time soon...?
I have not turned it on or enabled it yet. I have the standard let-Google-decide option, mostly because it said something about the setting being valid for 90 days. I'm wondering what kind of options and speeds, if any, I will have once I enable it, and whether I must keep that setting for 90 days before it defaults back. I will probably play around with it one of these days. I just get paranoid. Usually I have better luck if I just don't mess with things; if they work, I tell myself not to touch them. But my curiosity tends to get the best of me eventually, and I'm sure it will with this one sooner or later. Thanks for getting me thinking.
It seems every site differs in how fast you can set it to crawl; it says the limit is based on how many pages your site has. One of my sites has a LOT of pages, and I set it to 10 requests per second. At that rate Google could crawl 864,000 pages per day. I doubt it will, but if it does, I'd like to see my server handle it. I'm not sure it can, though, as all the pages are dynamic...
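The 864,000 figure above is just the crawl rate multiplied out over a full day; it's an upper bound, since Googlebot won't necessarily sustain the maximum rate. The arithmetic, for anyone who wants to check it:

```python
# Rough upper bound on pages crawled per day at a fixed crawl rate.
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

def max_pages_per_day(reqs_per_second: float) -> float:
    # Assumes Googlebot sustains the configured rate all day,
    # which in practice it usually won't.
    return reqs_per_second * SECONDS_PER_DAY

print(max_pages_per_day(10))  # 864000.0, matching the figure above
```

Worth keeping in mind if your pages are dynamic: that's also the worst-case request load your server would have to absorb from the crawler alone.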