Lately I've been getting fewer and fewer hits from Googlebot. I had a script that would cache database queries and refresh that cache if it was older than 15 minutes when a page was hit; the idea was to take a little load off the DB and make the majority of pages load a lot faster. What I noticed (a little late, since I've been dropping pages like mad) was that whenever it rebuilt the cache, there was some lag in the time it took to serve the page.

I started watching it, and sure enough, Googlebot would be happily crawling 2-3 pages per second, and as soon as the cache was rebuilt and there was lag, it would just stop and go away for hours. Now, this was maybe a second of lag, but the times Google stopped crawling corresponded exactly to when the cache file was written. So, just FYI, from what I see Googlebot does not like hitting pages at a certain speed and then getting hit with some lag. I'd be interested to know if anyone else has seen this, because it could mean a lot for people on slower hosts, or hosts that have a lot of lag hiccups.

I assume this behavior is meant to keep Googlebot from bogging down your server, but I think most of us would like Google to go through pages as fast as possible. I've often wondered why Google does not support crawl-delay in robots.txt, or why crawl-delay isn't specified in microseconds. It would be nice to be able to tune it that finely and tell the bots to hit you as hard as they want, maybe even during specific times of the day.

At any rate, I've got the caching script turned off now, so I'll post back if Googlebot returns in force in the next few days. But I find this very disturbing, since for most of us, the latency between us and Google is largely out of our control.
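In case anyone wants to compare notes, here's roughly the pattern my script followed, sketched in Python. The file name, the 15-minute constant, and the run_query callable are just placeholders for illustration, not my actual code:

import json, os, time

CACHE_FILE = "query_cache.json"   # placeholder path for the cached query results
MAX_AGE = 15 * 60                 # 15 minutes, like my script used

def get_cached_results(run_query):
    """Return cached query results, rebuilding the cache if it's stale.

    The catch: whichever request happens to arrive first after the
    15-minute mark pays the full cost of re-running the query and
    rewriting the file -- that's the roughly one-second lag spike
    Googlebot kept tripping over.
    """
    try:
        age = time.time() - os.path.getmtime(CACHE_FILE)
        if age < MAX_AGE:
            with open(CACHE_FILE) as f:
                return json.load(f)
    except (OSError, ValueError):
        pass  # no cache yet (or it's unreadable), so fall through and rebuild

    results = run_query()            # the slow part: actually hits the database
    tmp = CACHE_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump(results, f)
    os.replace(tmp, CACHE_FILE)      # swap the new file in so readers never see a half-written cache
    return results

The design flaw is that the rebuild cost lands on a live page request. Rebuilding from a cron job instead, or serving the stale copy while a background process refreshes it, would keep visitors (and Googlebot) from ever seeing that hiccup.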
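And for anyone curious about the robots.txt side of it, the directive looks like the two lines below and takes a value in whole seconds (the 10 is just an example number). As far as I know it's honored by bots like Bing's but ignored by Googlebot, which is exactly my gripe:

User-agent: *
Crawl-delay: 10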