Blog pagination pages: is noindex in robots.txt worthwhile?

Discussion in 'Search Engine Optimization' started by sirion, Sep 12, 2007.

  1. #1
    Hi there

    I've been reading that paginated entry pages on blogs can cause internal duplicate content problems on a site. Some people suggest using robots.txt to tell the bots not to index the paginated pages, but to still follow the links through to the actual entries.
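
    For example, is it something like this in robots.txt? (The /page/ path is just my guess at WordPress-style pagination URLs, and from what I've read robots.txt can only block crawling, it can't say "follow".)

        User-agent: *
        Disallow: /page/

    Or do people mean a meta tag in the <head> of each paginated page, like this?

        <meta name="robots" content="noindex,follow">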

    From an SEO point of view is this correct? Will it reduce duplicate content? Does it improve your search ranking?

    Thanks
     
    sirion, Sep 12, 2007 IP
  2. #2
    Do you mean a static "page" rather than a "post" on the blog?

    Why would you have duplicate content on both the "page" and the "post"? That's redundant.

    And yes, having duplicate content could harm your rankings.
     
    SolomonZhang, Sep 12, 2007 IP
  3. #3
    Duplicate content will destroy your ranking. On WordPress, search for the 'All in One SEO' plugin; it will help with your meta data. Next, add the robots meta tag and have it noindex, follow your homepage. You want the bots to only see the individual post pages. Lastly, beware of your calendar and archive pages, because they will often make the bots see duplicate content.
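
    For the calendar/archive part, here's a sketch of how you might add that tag by hand in your theme's header.php (is_archive() and is_paged() are standard WordPress conditional tags; the plugin can also manage this for you, so treat this as illustration rather than the plugin's own output):

        <?php // Noindex archive/calendar and paginated listings, but keep
              // following links; single posts stay indexable. ?>
        <?php if ( is_archive() || is_paged() ) : ?>
        <meta name="robots" content="noindex,follow" />
        <?php endif; ?>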

    If in doubt, search Google for site:yoururl.com, and if you see any supplemental results you know you have duplicate content issues.

    Good Luck.
     
    ddover, Sep 12, 2007 IP
  4. #4
    Thanks ddover, that helped explain it.
     
    sirion, Sep 12, 2007 IP