Disallow in robots.txt for your landing pages

Discussion in 'Google AdWords' started by corlas, Apr 5, 2010.

  1. #1
    Do you disallow crawlers in your ppc directory to keep robots from crawling duplicate content (i.e., your thousands of PPC landing pages)?

    Thanks
     
    corlas, Apr 5, 2010 IP
  2. globalprompt@gmail.com (Active Member)

    #2
    Yes, I do. I only allow crawlers on my public pages.
     
    globalprompt@gmail.com, Apr 6, 2010 IP
  3. KrisOlin

    KrisOlin Peon

    Messages:
    28
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Be careful with robots.txt, though. Its rules exist to keep pages from being crawled and indexed, so a careless entry can do more harm than good.
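    A single stray slash is the difference between hiding one folder and hiding your whole site. For example, this rule blocks every crawler from every page:

        # blocks ALL crawlers from the ENTIRE site
        User-agent: *
        Disallow: /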
     
    KrisOlin, Apr 6, 2010 IP
  4. meri0098 (Peon)

    #4
    I've never done this myself, but it's a good idea if you have duplicate content.
     
    meri0098, Apr 6, 2010 IP
  5. corlas (Active Member)

    #5
    If I had a folder called ppc, would this be correct?
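
        # keep crawlers out of the PPC landing page folder
        User-agent: *
        Disallow: /ppc/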

    Thanks...
     
    corlas, Apr 6, 2010 IP
  6. RebelBetting (Member)

    #6
    Yep! Spot on.
     
    RebelBetting, Apr 8, 2010 IP
  7. 4dinnovationssa (Greenhorn)

    #7
    Yes, definitely do it, or your Quality Scores will suffer because of the duplicate content.
     
    4dinnovationssa, Apr 9, 2010 IP