I have a small site (~200 pages) and I add a new page about every two weeks. The major pages have rotating snippets showing "hot" info. GoogleBot visits fairly often to refresh the known URLs, but it is reluctant to pick up the new ones (unlike the MSN bot). Is it smart enough to notice that the existing pages change often while new pages are added less frequently, and does it adjust its spidering habits accordingly? Or is this just the general pattern? Would it pick up new URLs faster if I were adding pages more often? ---- Are you also tired of all those Jagger threads, like I am?
I'll answer your question with some more questions. What PR is your site? Are your new pages linked from the main page (or whatever the most frequently spidered page is)? Do you use a sitemap?
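In case you don't use one: a sitemap file is just an XML list of URLs, so you can generate it from a page list with a few lines of script. This is only a rough sketch with made-up URLs and the generic sitemap schema URL, not anything specific to your site:

```python
# Rough sketch only: build a bare-bones XML sitemap from a list of page URLs.
# The URLs below are placeholders, not a real site.
from datetime import date

PAGES = [
    "http://www.example.com/",
    "http://www.example.com/news/",
    "http://www.example.com/news/new-page.html",
]

def build_sitemap(urls):
    """Return a minimal sitemap document listing the given URLs."""
    today = date.today().isoformat()
    lines = [
        '<?xml version="1.0" encoding="UTF-8"?>',
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    ]
    for url in urls:
        lines.append("  <url>")
        lines.append(f"    <loc>{url}</loc>")
        lines.append(f"    <lastmod>{today}</lastmod>")
        lines.append("  </url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    # Write the file to the site root, where the bots expect to find it.
    with open("sitemap.xml", "w") as f:
        f.write(build_sitemap(PAGES))
```

Regenerate it whenever you add a page and the new URL gets announced without waiting for the bot to stumble across a link to it.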
I guess the PR of my site is fine; most pages are PR4, some higher, some lower. The new page is not linked directly from the main page, but the pages that link to it have already been spidered multiple times. The page is not buried deep within the site and there are multiple ways to reach it (root->news->THE PAGE, root->sitemap->THE PAGE, rss feed->THE PAGE, ...). I do not use the XML sitemap, because I do not like the idea. Anyway, I am more concerned about GoogleBot's "intelligence", that is, whether it uses different spidering patterns for sites with certain update patterns. It might be interesting to compare our observations.