Google Spiders

Discussion in 'Google' started by atrain2442, Sep 11, 2007.

  1. #1
    A little drawn out, but...

    I know my website has 1000+ different pages, thanks to the vast number of products the company carries. To verify an exact number without actually counting, I tested some software that scans your site for Analytics code (our developer assures me the code is on every page), and it only counted ~300 pages. I realize that count was just the estimate used to quote the cost of a full scan, but it looks like a rather large deviation IMO. The question is: do the Google spiders know there are more than 300 (400/500/etc.) pages to crawl through? And what about pages in a multi-"step" process, like a checkout, that keep the same URL throughout?
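
    For reference, one way to get a real count is just to crawl the site yourself and tally the unique URLs. A rough Python sketch of that idea (www.example.com is a placeholder for the actual domain):

        # Rough page counter: follow internal links breadth-first and
        # count the unique URLs reached. Replace START with the real site.
        import urllib.request
        import urllib.parse
        from collections import deque
        from html.parser import HTMLParser

        START = "http://www.example.com/"
        DOMAIN = urllib.parse.urlparse(START).netloc

        class LinkParser(HTMLParser):
            def __init__(self):
                super().__init__()
                self.links = []
            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    href = dict(attrs).get("href")
                    if href:
                        self.links.append(href)

        seen = {START}
        queue = deque([START])
        while queue:
            url = queue.popleft()
            try:
                with urllib.request.urlopen(url, timeout=10) as resp:
                    if "text/html" not in resp.headers.get("Content-Type", ""):
                        continue  # only parse HTML pages
                    html = resp.read().decode("utf-8", errors="replace")
            except Exception:
                continue  # skip pages that fail to load
            parser = LinkParser()
            parser.feed(html)
            for href in parser.links:
                absolute, _ = urllib.parse.urldefrag(urllib.parse.urljoin(url, href))
                if urllib.parse.urlparse(absolute).netloc == DOMAIN and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)

        print(len(seen), "unique pages found")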
     
    atrain2442, Sep 11, 2007 IP
  2. speda1 (Well-Known Member)

    #2
    Googlebot doesn't "know" anything. It simply follows links on the pages it encounters. If nothing links to a page, or if the page is only reachable through a search, Googlebot will not find it.
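
    To picture what that means: a spider fetches the HTML and pulls out the href attributes, nothing more. A toy Python illustration (the sample markup is invented); anything hidden behind the search form never shows up:

        from html.parser import HTMLParser

        # One real link plus a search form, like a product-search page.
        SAMPLE = '<a href="/about.html">About</a> <form action="/search"><input name="q"></form>'

        class Links(HTMLParser):
            def handle_starttag(self, tag, attrs):
                if tag == "a":
                    print(dict(attrs).get("href"))

        Links().feed(SAMPLE)  # prints /about.html; the form yields nothing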
     
    speda1, Sep 11, 2007 IP
  3. atrain2442 (Peon)

    #3
    I was informed that a site built entirely in Flash has trouble getting indexed in organic search because the spiders can't follow the links buried inside the Flash. The suggested ways around it were to give each page a unique title... or to put bits of plain HTML on the pages...
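
    If I understood the HTML suggestion right, the idea is something like this Python sketch, which writes a thin HTML shell per Flash page so each one gets its own title plus plain links the spiders can follow (page names and titles are invented examples):

        # Generate one crawlable HTML shell per Flash "page".
        PAGES = {"products": "Products | Acme Co", "contact": "Contact | Acme Co"}

        SHELL = """<html><head><title>{title}</title></head><body>
        <object data="{name}.swf" type="application/x-shockwave-flash"></object>
        <p>{nav}</p>
        </body></html>"""

        for name, title in PAGES.items():
            # Plain HTML nav links so a spider can reach every page.
            nav = " | ".join(f'<a href="{n}.html">{t}</a>' for n, t in PAGES.items())
            with open(name + ".html", "w") as out:
                out.write(SHELL.format(title=title, name=name, nav=nav))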
     
    atrain2442, Sep 11, 2007 IP
  4. rcj662 (Guest)

    #4
    You need to have links to each page. You could make directory pages with 50 to 75 links listed on each one; I would make each directory page focus on a specific topic related to the pages it links to.
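
    A quick Python sketch of that directory idea (urls.txt is a stand-in for your list of product URLs, one per line):

        # Split a big URL list into directory pages of at most 75 links each,
        # so every product page has at least one plain HTML link pointing at it.
        CHUNK = 75

        with open("urls.txt") as f:
            urls = [line.strip() for line in f if line.strip()]

        for i in range(0, len(urls), CHUNK):
            page = i // CHUNK + 1
            links = "\n".join(f'<a href="{u}">{u}</a><br>' for u in urls[i:i + CHUNK])
            with open(f"directory{page}.html", "w") as out:
                out.write("<html><body>\n" + links + "\n</body></html>")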

    Flash and JavaScript hurt some sites.
     
    rcj662, Sep 11, 2007 IP