I have a PHP-based site that emails me when a certain page gets hit by Googlebot. Is there a way I can tell how deep the bot crawls? Is there an application or log reader for this? I have been reviewing the pages indexed by Google, and we have seen quite a few changes since BD started. Now we have a bunch of supplemental results, and when I look at them, a lot are cached from July 2005. Thanks for any advice.
Your server should have a record of all the bots that have spidered your site. You can trace Googlebot's progress that way.
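If your server writes Apache combined-format access logs (an assumption here; the log path and format vary by host), a quick way to see how deep Googlebot got is to pull out every distinct URL it requested. A sketch, with a few made-up sample log lines inlined in place of a real log file:

```shell
# Count how many distinct URLs Googlebot requested.
# The here-doc below is sample data; in practice you'd read your real
# access log, e.g. /var/log/apache2/access.log (path is an assumption).
awk '/Googlebot/ {print $7}' <<'EOF' | sort -u | wc -l
66.249.66.1 - - [01/Mar/2006:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Mar/2006:10:00:05 +0000] "GET /folder/page.php HTTP/1.1" 200 1024 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Mar/2006:10:00:09 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"
127.0.0.1 - - [01/Mar/2006:10:01:00 +0000] "GET /about.php HTTP/1.1" 200 256 "-" "Mozilla/5.0"
EOF
```

Drop the `| wc -l` to see the actual list of paths the bot touched; how far those paths go below the root tells you the crawl depth.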
Log analyzers will help you with that. It's useful to know whether those ten hits were a bot crawling ten different pages once each, rather than hitting the homepage ten times...
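You can make that exact distinction without a full log analyzer by counting hits per URL from the shell (again assuming Apache combined-format logs; the here-doc lines are sample data standing in for a real file):

```shell
# Tally Googlebot requests per URL, busiest first.
# Swap the sample here-doc for your real access log.
awk '/Googlebot/ {print $7}' <<'EOF' | sort | uniq -c | sort -rn
66.249.66.1 - - [01/Mar/2006:10:00:00 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Mar/2006:10:00:05 +0000] "GET /folder/page.php HTTP/1.1" 200 1024 "-" "Googlebot/2.1"
66.249.66.1 - - [01/Mar/2006:10:00:09 +0000] "GET / HTTP/1.1" 200 512 "-" "Googlebot/2.1"
127.0.0.1 - - [01/Mar/2006:10:01:00 +0000] "GET /about.php HTTP/1.1" 200 256 "-" "Mozilla/5.0"
EOF
```

If the output is one line with a big count on `/`, the bot is only hitting your homepage; many lines with small counts means it's actually walking the site.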
Do site:yoursite.com/folder, then try site:yoursite.com/folder/folder, and see what results come back.