Is there a way I can limit Googlebot's visits? Right now it hits my site every 5 seconds and causes quite a lot of usage on my account. Thanks
This has been going on for the last month, so it is not just going to pass. My server crashes from time to time because of it.
If you have a Sitemaps account with Google, you can go into the options and request a slow crawl rate.
Googlebot supports the Crawl-delay directive, if I'm not mistaken:

Code (robots.txt):
User-agent: googlebot
Crawl-delay: 180
Could this work?

<meta name="revisit-after" content="7 days">
<meta name="robots" content="index, follow">
It's not the revisit rate that bothers me, just the visits to new pages. The site now has 170,000 pages in Google and it keeps adding more and more.
Go here: http://www.google.com/webmasters/ Once you take ownership of your site, you can tell Google what crawl rate to use: Slow, Normal, or Fast (if they enable it for you).
Personally, I like it when the Googlebot visits. It can visit as many times as it wants, even drink my beer, sleep with my wife, whatever!
Seems like this would be the only way, as the bot does not even read my robots file and does not delay its visits. I hope the verification will be back working soon.

Added: verification is back, so let's see if the setting really works. The interesting thing is the message I got when I changed the rate to slow: "This site is currently set at a Slower crawl rate. This rate will return to Normal on Apr 11, 2007." I do not want it to return to Normal. Why would they revert it automatically?
I stand corrected - I found this page on Matt Cutts's site, http://www.mattcutts.com/blog/googlebot-keep-out/, where he specifically states that Googlebot does not support the crawl-delay directive. I'm not the only person who was fooled, though - http://slashdot.org/robots.txt
The Sitemaps option did not make things better; now Googlebot is on the site every 2-3 seconds. Any more ideas on how to limit the bot without banning it?
I have searched a lot for a solution to this problem. I have the same issue with my sites and still have not found anything that works. If anyone has a solution, that would be greatly appreciated.
Is this possibly linked to Matt Cutts stating that the cache for the supplemental index would be updated more frequently? It feels like a related issue: the main Googlebot observes the slower crawl rate requested through Webmaster Tools/Google Sitemaps, but the supplemental bot isn't paying attention to the request. Without knowing much, it is a possibility.
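One more idea for limiting the bot without banning it, since the thread is still open: answer bursts of Googlebot requests with HTTP 503 plus a Retry-After header. Googlebot normally treats a 503 as a temporary condition and slows down, so this acts as a throttle rather than a ban (though serving 503s for very long stretches carries its own risk). Below is a minimal sketch, assuming a Python/WSGI setup; the names (MIN_GAP, throttle_googlebot) and the 10-second threshold are made up for illustration, not taken from any post above.

Code (Python):
import time

MIN_GAP = 10  # minimum seconds we are willing to accept between Googlebot hits
_last_hit = {"googlebot": 0.0}  # per-process timestamp of the last allowed hit

def throttle_googlebot(app):
    """Wrap a WSGI app so overly frequent Googlebot requests get a 503."""
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "").lower()
        if "googlebot" in ua:
            now = time.time()
            if now - _last_hit["googlebot"] < MIN_GAP:
                # Too soon since the last crawl: ask the bot to retry later.
                start_response("503 Service Unavailable",
                               [("Retry-After", "120"),
                                ("Content-Type", "text/plain")])
                return [b"Crawl rate limited, please retry later.\n"]
            _last_hit["googlebot"] = now
        return app(environ, start_response)
    return middleware

Note that the timestamp lives in process memory, so on a multi-process server each worker throttles independently; a shared store would be needed for a strict site-wide limit.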