My site was down for a few hours. Now it's up. Google bots send 2-3 requests per second according to my logs. Anyone have experience with this? Thanks!
Yes, I had the same problem. In fact, my problem showed up when I saw that my bandwidth usage for December was extraordinarily high, but my visitor count was not THAT high, so I started investigating and found that the culprit was my friend Google. I went to Google Webmaster Tools, added my site to my account, and from there I was able to change that by putting a limit on how fast my pages could be crawled over a period of time. Problem solved.
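For what it's worth, Webmaster Tools is the right knob for Googlebot specifically, since Google is known to ignore the Crawl-delay directive. Other well-behaved bots (Bing, Yandex, etc.) do honor it in robots.txt, though. A minimal sketch, with the 10-second delay being just an example value:

```text
# robots.txt at the site root
User-agent: *
Crawl-delay: 10
```

This asks compliant crawlers to wait roughly 10 seconds between requests; for Googlebot itself, use the crawl rate setting in Webmaster Tools instead.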
Thanks for your replies! I'm worried about the server. It's OK now (load average: 1.21, 0.96, 0.72), but earlier someone was copying my site with a script, sending a couple of requests per second, and that's what took my site down. I banned his IP, but he may change IPs and try again.
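Since a scraper can switch IPs, banning one address at a time only goes so far. A per-IP rate limit catches each new address as soon as it starts hammering the server. A minimal sliding-window sketch in Python (the threshold and window are hypothetical example values, not tied to any particular server setup):

```python
import time
from collections import defaultdict, deque

# Hypothetical limits: at most MAX_REQUESTS per IP per WINDOW seconds.
MAX_REQUESTS = 10
WINDOW = 1.0

_hits = defaultdict(deque)  # ip -> timestamps of recent requests


def allow(ip, now=None):
    """Return True if this request from `ip` is within the rate limit."""
    now = time.monotonic() if now is None else now
    q = _hits[ip]
    # Drop timestamps that have fallen out of the sliding window.
    while q and now - q[0] > WINDOW:
        q.popleft()
    if len(q) >= MAX_REQUESTS:
        return False  # over the limit: reject (e.g. respond with HTTP 429)
    q.append(now)
    return True
```

In practice you would hook something like this into your web server or use an existing tool (fail2ban, mod_evasive, nginx `limit_req`) rather than rolling your own, but the idea is the same.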
I have had this before, where I received hundreds and sometimes thousands of Googlebot hits a day, but as has already been said, you can alter the bot crawl rate from Google Webmaster Tools.
I would rather let Google decide when it thinks it's a relevant time to crawl. Very frequent crawls won't really help if nothing has changed on your site.
I also get 3-5 requests per second (sometimes even more). Good thing I'm still hosted on Blogger, so I'm never down.
Things to do: 1. Generate a sitemap.xml using a free tool like http://www.xml-sitemaps.com/ 2. Don't forget to set the priority values 3. Submit it to Google Webmaster Tools
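If you'd rather not rely on an online generator, a sitemap with per-URL priorities is simple to build yourself. A minimal sketch in Python using the standard library (the URLs and priority values are placeholder examples):

```python
import xml.etree.ElementTree as ET

# The standard sitemap protocol namespace.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(entries):
    """entries: list of (url, priority) pairs; returns sitemap XML as a string."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url, priority in entries:
        u = ET.SubElement(urlset, "url")
        ET.SubElement(u, "loc").text = url
        ET.SubElement(u, "priority").text = f"{priority:.1f}"
    return ET.tostring(urlset, encoding="unicode")


# Placeholder example: homepage gets top priority, a secondary page less.
xml = build_sitemap([
    ("http://example.com/", 1.0),
    ("http://example.com/about", 0.5),
])
```

Write the resulting string to sitemap.xml at your site root, then submit that URL in Google Webmaster Tools.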
Mmmm, I agree .... I just received a random PR6 from nowhere .... 0 to 6 ... I have to say that you don't need a lot of incoming links to keep spiders busy, just a lot of pages, and posts for blogs .... a simple Digg will get spiders to your site.