Can someone tell me why spiders won't crawl this site deeper?

Discussion in 'Search Engine Optimization' started by M.K.Jackson, Jul 22, 2007.

  1. #1
    I'm looking at a site for a friend. At the bottom of the site are some links going to a bunch of pages with reader submissions and articles, as well as a link to a small dating directory.

    http://www.realworldseduction.com

    However, Google isn't crawling the site any deeper, and when I tried the crawl test at seomoz.org at

    http://www.seomoz.org/crawl-test/

    I got the same result: it only crawls one page beyond the index page.

    I've replaced the robots.txt with a blank one, as the original was full of restrictions (I still have the original; I just wanted to rule it out as a possible cause). Can someone tell me what I'm missing? I've crawled the site with a program called GSiteCrawler to create a sitemap - which worked fine, i.e. it crawled all the pages linked from the index page as deep as they go - and the sitemap has been uploaded...but no results.
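    For reference, the "blank" robots.txt I put in place is really just a minimal allow-all; the Sitemap line assumes the sitemap was uploaded under a default filename, so adjust it if yours differs:

```
# Allow all crawlers to fetch everything
User-agent: *
Disallow:

# Point crawlers at the uploaded sitemap (assumed filename)
Sitemap: http://www.realworldseduction.com/sitemap.xml
```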

    Am I overlooking something? :confused:
     
    M.K.Jackson, Jul 22, 2007 IP
  2. trichnosis

    trichnosis Prominent Member

    Messages:
    13,785
    Likes Received:
    333
    Best Answers:
    0
    Trophy Points:
    300
    #2
    Google and the other search engines do crawl deeper, but crawl depth varies with backlink strength. More backlinks mean a deeper crawl.
     
    trichnosis, Jul 22, 2007 IP
  3. M.K.Jackson

    M.K.Jackson Peon

    Messages:
    21
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    I can see why Google's spiders won't crawl the site very deep...the link structure was also changed recently, which turned up a lot of missing pages etc. (now fixed with 301s). But what worried me was the crawl test at seomoz...

    So the site needs more links pointing to the domain? It was also briefly showing up in Google for a bunch of search terms but is now nowhere to be found (which is why I was asked to look at it). There were a lot of duplicate-content issues, which have now been fixed via robots.txt. Do I need to remove pages manually from Google's cache, or can I just wait?
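    In case it helps anyone else reading: the 301s were done with Apache's Redirect directive in .htaccess, roughly like this (the file names here are made up for illustration):

```
# .htaccess - permanently redirect an old URL to its new home
Redirect 301 /old-article.html http://www.realworldseduction.com/articles/new-article.html
```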

    Thanks for the help
     
    M.K.Jackson, Jul 22, 2007 IP
  4. tomcatuk

    tomcatuk Peon

    Messages:
    20
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    Got any off-site links pointing to internal pages? That ought to help. It doesn't look like you have very many backlinks at all - that's the place to start.
     
    tomcatuk, Jul 22, 2007 IP
  5. oseymour

    oseymour Well-Known Member

    Messages:
    3,960
    Likes Received:
    92
    Best Answers:
    0
    Trophy Points:
    135
    #5
    Get deep links pointing to your inner pages; that will help them get indexed.
     
    oseymour, Jul 22, 2007 IP