It seems there's very little information on this topic, so I thought I'd start this thread. I went in and set the maximum crawl rate for my site [ XtremeMusic ], a 7-year-old domain, and will be monitoring the Google crawl rate. Here are the settings: 0.5 requests per second, 2 seconds between requests. Please note: this setting expires November 21st, 2009, so I'm not sure whether I'm allowed to change it before then. Anyone else interested in Webmaster Tools and Google crawl rate? Please post your experience.
Increasing the crawl rate for a website will certainly increase visits from Google's crawlers, but in most cases it won't help that much. You can achieve the same thing by getting backlinks pointing to your website from high-PR sites, and the most important part is to have fresh content on your site for the crawler to index. In my own experience, I've seen a great improvement in how often the crawlers visit my sites after setting the crawl rate to the highest value. Hope this was clear and helpful.
I don't agree with your points. Setting the crawl rate higher consumes bandwidth, so I'd suggest keeping the crawl rate at normal and trying to get good backlinks instead.
It's better to choose the option "Let Google determine my crawl rate (recommended)", since increasing the speed can create bandwidth problems. Backlink tip: go for good-quality, relevant backlinks only, which will help a lot in the SERPs.
Indeed, Google Webmaster Tools helps a little with getting my web pages indexed, but it's better to get as many backlinks as you can if you want your pages crawled quickly. A backlink from any indexed page is absolutely fine (no need for high-PR pages).
I think the setting you're referring to is a technical thing: it controls how fast Googlebot loads pages from your site, not how often it visits the site!
I agree with most of the responses here. Let Google handle the crawling process and worry about something else. I find that Google crawls your site a lot faster if you add new content.
I have to give you some bad news: if Google suggests you change the rate, it means your server isn't performing well.
It seems to me that many here think "crawl rate" refers to how frequently Google accesses and crawls a site. If so, that isn't the case. Here's what Google says about it, and it's pretty much self-explanatory: "Crawl rate refers to the speed of Googlebot's requests during the crawl process. It doesn't have any effect on how often we crawl or how deeply we crawl your URL structure."
Google didn't suggest that I change the rate; I did it on my own as an experiment. I did notice my content is getting indexed faster, though I'm not sure whether that's because of this experiment or not. On a side note, my server can handle any number of spiders Google wants to send my way. Bring 'em on, Google!
I have had my site live for almost a week now, but Google still hasn't indexed it: http://www.htcapps.com. Will submitting a sitemap help the crawl rate?
It takes about 1-4 weeks for a new site to get listed, depending on how much content it has. Submitting a sitemap.xml is a great help in getting Googlebot to crawl your site faster. You should maintain an automatic script that updates the sitemap and pings Google every time your site's structure changes (see the sketch below). I have a new blog, http://www.ehowspace.com, and it took Google 2 weeks to index it. Now the crawl interval is about 1 minute, so my new posts appear in no time. Good luck!
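For anyone wondering what that script could look like, here's a minimal sketch in Python. The domain, page list, and output path are placeholders, not anyone's real setup, and the ping URL (google.com/ping?sitemap=...) is the endpoint Google documented for notifying it of sitemap changes:

```python
# Minimal sketch: regenerate a sitemap.xml and ping Google so the bot
# knows it changed. All URLs and paths below are placeholders.
import urllib.parse
import urllib.request
from datetime import date

SITE_URL = "http://www.example.com"                 # placeholder domain
SITEMAP_PATH = "sitemap.xml"                        # write to your web root in practice
PAGES = ["/", "/about.html", "/blog/latest.html"]   # placeholder page list


def write_sitemap():
    """Write a minimal urlset sitemap per the sitemaps.org 0.9 schema."""
    today = date.today().isoformat()
    entries = "\n".join(
        f"  <url>\n"
        f"    <loc>{SITE_URL}{page}</loc>\n"
        f"    <lastmod>{today}</lastmod>\n"
        f"  </url>"
        for page in PAGES
    )
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )
    with open(SITEMAP_PATH, "w", encoding="utf-8") as f:
        f.write(xml)


def ping_google():
    """Notify Google of the updated sitemap via its documented ping URL."""
    sitemap_url = f"{SITE_URL}/{SITEMAP_PATH}"
    ping = "http://www.google.com/ping?sitemap=" + urllib.parse.quote(sitemap_url, safe="")
    with urllib.request.urlopen(ping) as resp:
        print("Ping status:", resp.status)


if __name__ == "__main__":
    write_sitemap()
    ping_google()
```

Run it from cron, or hook it into your CMS's publish step, so the sitemap gets rewritten and the ping goes out whenever content changes.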
As Patricianian said: keep building backlinks from trusted sites; that, along with fresh, unique content, will help the bot crawl your website more often.