I am a bit unsure if this thread belongs here, please move it if it does not. I have a problem: Google is crawling my site like a madman. My site caches pages on the fly, and Google's crawling brings down the VPS. I know it is a long shot, but is there a way to limit Google's crawling to, let's say, only 30 pages per minute? I can't block Google with robots.txt because I need the pages indexed in Google, but I do need to limit its crawling.
In Google Webmaster Tools you can set the crawl rate to slow: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=48620
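If you also want a safety valve you control yourself, you can throttle Googlebot server-side. Here is a minimal sketch, assuming a Python/WSGI stack; the CrawlerThrottle name and the 30-requests-per-minute budget are just illustrative, and the same idea can be done in PHP or at the web-server level. Once the per-minute budget is used up it answers Googlebot with a 503 and a Retry-After header, which Google treats as a temporary signal to slow down rather than a reason to drop pages from the index.

import time
import threading

# Hypothetical sketch: WSGI middleware that caps Googlebot at a fixed
# number of requests per minute. Tune limit_per_minute for your server.
class CrawlerThrottle:
    def __init__(self, app, limit_per_minute=30):
        self.app = app
        self.limit = limit_per_minute
        self.lock = threading.Lock()
        self.window_start = time.time()
        self.count = 0

    def __call__(self, environ, start_response):
        user_agent = environ.get('HTTP_USER_AGENT', '')
        if 'Googlebot' in user_agent:
            with self.lock:
                now = time.time()
                # Reset the counter at the start of each one-minute window.
                if now - self.window_start >= 60:
                    self.window_start = now
                    self.count = 0
                self.count += 1
                over_limit = self.count > self.limit
            if over_limit:
                # 503 plus Retry-After asks the crawler to come back later,
                # without the deindexing risk of a 404 or a robots.txt block.
                start_response('503 Service Unavailable',
                               [('Content-Type', 'text/plain'),
                                ('Retry-After', '60')])
                return [b'Crawl rate limit reached, please retry later.']
        return self.app(environ, start_response)

One caveat: matching on the User-Agent header alone can be spoofed, so if you need anything stricter you would want to verify the client really is Googlebot (e.g. via a reverse DNS lookup) before counting it against the budget.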
Thanks, I will check this out. I don't use Webmaster Tools for this site and I'm not sure if I want to, but this was a good tip. Green to you.
It's really your best bet ... the only official way you'll have any control over how many requests Google sends you and how quickly. Also, keep in mind that a VPS is virtualized, shared hardware; if your site isn't running on at least semi-dedicated hardware, it's going to struggle under heavy crawling no matter what.