So I have a site that's about 2 years old; it had 100 pages indexed in Google with about 1,800 backlinks. I recently revamped the site, locally optimizing it for every city and state in the US. As a result, the site now has over 370k pages. Google started a deep crawl on the site today; it has been crawling for several hours and has hit about 1,700 pages so far. When does Googlebot decide it's had enough and leave? Or should I expect it to deep-crawl a huge portion of the site? My pages are HIGHLY optimized, so I should see some very good results if it starts putting these pages in the index.
No one knows when Googlebot will crawl, just as no one knows when it will stop crawling. Maybe it only stops when you tell it to (by declaring noindex,nofollow). Just keep your site ready so the bots always fetch good content.
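For reference, the noindex,nofollow directive mentioned above is just a robots meta tag placed in a page's head. A minimal sketch:

```html
<!-- Goes inside <head>. Tells compliant crawlers not to index this page
     and not to follow any of its links. -->
<meta name="robots" content="noindex,nofollow">
```

Note this keeps a page out of the index but doesn't stop the bot from fetching it; blocking the fetch itself would be a robots.txt job.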
millworx, I'm having the same issue, I think. Is Google indexing the pages it scans? Also, are you using mod_rewrite? I just uploaded a website where I used mod_rewrite to make the URLs SEO-friendly. Since Google started crawling my site it has just kept on going; every five minutes I get an email telling me which pages are being crawled (I use a PHP script that emails me). But the only page that gets indexed is the home page. Now I'm wondering if Google will index all those SEO URLs once they've been scanned, or if I made a mistake and Google is just going to keep crawling pages without indexing them. I'm still waiting. millworx, can I have the URL of your website? I'd like to check it out. And by the way, do you validate your pages with the W3C?
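For anyone wondering what a mod_rewrite setup like that looks like, here's a minimal .htaccess sketch. The URL pattern and script name are just illustrative assumptions, not the poster's actual rules:

```apache
# Requires mod_rewrite enabled and AllowOverride FileInfo for this directory
RewriteEngine On

# Map a clean URL like /city/new-york to the real query-string URL.
# ([a-z-]+) captures the slug; [L] stops processing further rules.
RewriteRule ^city/([a-z-]+)/?$ index.php?city=$1 [L]
```

Clean URLs like this don't by themselves get pages indexed, but they do keep crawlers from seeing the query-string versions as separate duplicate URLs.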
Just because Google crawls the pages doesn't mean it's going to index them, but know that it's a good thing that Google is crawling your site. If you have unique content, those pages will most likely be indexed; if it's duplicate content, there's no telling. It's up to Google.