block pages in robots.txt

Discussion in 'robots.txt' started by login, Nov 2, 2007.

  1. #1
    Which pages should I block in robots.txt for WordPress and IPB to avoid duplicate pages in Google?
     
    login, Nov 2, 2007 IP
  2. #2
    Are you sure you have duplicate pages in WordPress?

    Where did you find that?
     
    webgk.com, Nov 2, 2007 IP
  3. #3
    For WordPress, you could use the Robots Meta plugin; it's highly customizable and effective at avoiding duplicate content issues.

    Don't know about IPB.
     
    Monty, Nov 2, 2007 IP
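
    For anyone after concrete rules: below is only a rough sketch of the kind of robots.txt entries people use to keep the duplicate copies WordPress and IPB generate (feeds, trackbacks, the IPB lo-fi archive and print views) out of Google. The exact paths are assumptions that depend on your permalink settings and on whether the forum lives in a subdirectory, so adapt them rather than pasting them in as-is. The lines with * rely on Googlebot's extended pattern matching, not the basic robots.txt standard.

        User-agent: *
        # WordPress: block the admin area and the feed/trackback copies of each post
        Disallow: /wp-admin/
        Disallow: /wp-includes/
        Disallow: /feed/
        Disallow: /*/feed/
        Disallow: /*/trackback/
        # IPB (assumed default paths): the lo-fi archive and print views duplicate every topic
        Disallow: /lofiversion/
        Disallow: /index.php?act=Print

    Keep in mind that robots.txt only stops crawling; for pages Google has already indexed, a noindex meta tag (which is what the Robots Meta plugin adds) is usually the more reliable way to get them dropped.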