A little drawn out, but... I know my website has 1000+ pages, because of the vast number of products the company carries. To verify an exact number without actually counting, I tested some software that scans your site for the Analytics code (our developer assures me the code is on every page), and it only counted ~300 pages. I realize that count was only a preliminary estimate used to quote the cost of a full scan, but it looks like a rather large deviation IMO. The question is: do the Google spiders know that there are more than 300 (400/500/etc.) pages to crawl? And what about pages in a multi-step process, like a checkout, that share the same URL throughout?
Googlebot doesn't "know" anything. It simply follows links on the pages it encounters. If there is no link to a page, or if it is reachable only through on-site search, Googlebot will not find it.
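One common way to surface pages that aren't linked from anywhere is to submit an XML sitemap, so Googlebot at least knows those URLs exist. Here is a minimal Python sketch that writes one; the domain and paths are placeholders, and in practice you would feed it the real product URLs exported from your database or CMS:

from xml.etree import ElementTree as ET

# Placeholder URLs -- replace with the site's actual page list.
urls = [
    "https://www.example.com/",
    "https://www.example.com/products/widget-a",
    "https://www.example.com/products/widget-b",
]

# Build a <urlset> element in the standard sitemaps.org namespace.
urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for u in urls:
    entry = ET.SubElement(urlset, "url")
    ET.SubElement(entry, "loc").text = u

# Write sitemap.xml, which can then be referenced in robots.txt
# or submitted to Google directly.
ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)

Note that a sitemap won't help with checkout steps that all share one URL; those generally shouldn't be indexed anyway.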
I was informed that a site built entirely in Flash has trouble getting indexed in organic search because the spiders can't find the links. The suggested way around it was to give each page a unique title... or to put bits of HTML on the pages...
You need to have links to each page. You can make directory pages with 50 to 75 pages listed on each one, and I would make each directory page topic-related to the pages it links to. Flash and JavaScript hurt some sites.
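To give a concrete picture of that, here is a rough Python sketch (URLs and file names are made up) that chunks a product list into plain HTML directory pages of at most 75 links each; in a real build you would group them by topic and link the directory pages from your main navigation:

import html

# Placeholder URLs -- substitute the real product list.
product_urls = [f"https://www.example.com/products/item-{i}" for i in range(1, 301)]
PAGE_SIZE = 75  # 50-75 links per directory page, as suggested above

for page_num, start in enumerate(range(0, len(product_urls), PAGE_SIZE), start=1):
    chunk = product_urls[start:start + PAGE_SIZE]
    items = "\n".join(
        f'  <li><a href="{html.escape(u)}">{html.escape(u)}</a></li>' for u in chunk
    )
    with open(f"directory-{page_num}.html", "w", encoding="utf-8") as f:
        f.write(
            f"<html><body>\n<h1>Product directory, page {page_num}</h1>\n"
            f"<ul>\n{items}\n</ul>\n</body></html>\n"
        )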