Duplicate Pages Issue?

Discussion in 'Search Engine Optimization' started by tommseo, Nov 24, 2009.

  1. #1
    My website is dynamic, and two or more pages exist with the same content but different URLs.
    How do I solve this issue?
    Should I block each and every individual URL with robots.txt?
    Please tell me if there is another way.
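    For example, blocking them one by one would mean listing every variant in robots.txt like this (these URLs are just made-up placeholders, not my real ones):

    ```text
    User-agent: *
    # one Disallow line per duplicate URL -- gets unmanageable fast
    Disallow: /product.php?id=12&ref=home
    Disallow: /product.php?id=12&ref=footer
    Disallow: /product.php?id=12&ref=sidebar
    ```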
     
    tommseo, Nov 24, 2009 IP
  2. sherone

    sherone Well-Known Member

    Messages:
    1,539
    Likes Received:
    16
    Best Answers:
    0
    Trophy Points:
    130
    #2
    Write some fresh content for your site.
     
    sherone, Nov 24, 2009 IP
  3. chaucd20

    chaucd20 Peon

    Messages:
    2
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Use a 301 redirect to point one page to the other.
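    On Apache you can do this in .htaccess; something like the below should work (the two URLs are just placeholders, swap in your own):

    ```apache
    # Permanently (301) redirect the duplicate URL to the preferred one
    Redirect 301 /duplicate-page.html http://www.example.com/preferred-page.html
    ```

    Search engines will then transfer the duplicate URL's credit to the preferred URL over time.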
     
    chaucd20, Nov 24, 2009 IP
  4. HarryJackson

    HarryJackson Peon

    Messages:
    181
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #4
    Write fresh content for at least one page; duplicate content may cause penalization.
     
    HarryJackson, Nov 24, 2009 IP
  5. Nigel Lew

    Nigel Lew Notable Member

    Messages:
    4,642
    Likes Received:
    406
    Best Answers:
    21
    Trophy Points:
    295
    #5
    This is in fact an issue. Is this an e-commerce site? I am happy to help but I need some info.

    Nigel
     
    Nigel Lew, Nov 24, 2009 IP
  6. tommseo

    tommseo Peon

    Messages:
    285
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #6
    It's a software-solutions-based dynamic website.
     
    tommseo, Nov 25, 2009 IP
  7. tommseo

    tommseo Peon

    Messages:
    285
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    All these URLs access the same page, so when I refresh the content, it is refreshed on all the duplicate pages.
     
    tommseo, Nov 25, 2009 IP
  8. ashley0331

    ashley0331 Greenhorn

    Messages:
    1
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    11
    #8
    :) I agree with you.
     
    ashley0331, Nov 25, 2009 IP
  9. pirneanicolaeovidiu

    pirneanicolaeovidiu Peon

    Messages:
    66
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #9
    Hi, I don't know what the problem is. I have it as well on one of my sites. Regarding robots.txt: I had changed my description, and that is why I had the problems, but then I added the links to robots.txt. It took more than 3 weeks until the next update, at least for me. I still have 2 Not Found items.
     
    pirneanicolaeovidiu, Nov 25, 2009 IP
  10. seoaccount

    seoaccount Peon

    Messages:
    499
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #10
    seoaccount, Nov 25, 2009 IP
  11. incomesinternational.com

    incomesinternational.com Peon

    Messages:
    187
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #11
    You have several options available, but before I go there, just ignore most of the garbage that's been written here.

    *Fresh content won't help your current situation.
    *There is no such thing as a penalty for duplicate content.

    Option 1: As already mentioned, use a 301 redirect on the URLs that you don't want, pointing each duplicate URL to the preferred URL.

    Option 2: Use Google Webmaster Tools and select a preferred URL (this is only useful if the problem is between the www and non-www versions of your site).

    Option 3: Use the rel=canonical element. You can read about how to use that on the Google blog and in the webmaster guidelines.
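    In short, for option 3 you put a link element in the head of every duplicate page pointing at the one URL you want indexed (the URL below is just a placeholder):

    ```html
    <head>
      <!-- Tells search engines which URL is the preferred version of this page -->
      <link rel="canonical" href="http://www.example.com/preferred-page.html" />
    </head>
    ```

    The nice thing for a dynamic site is that every URL variant can emit the same canonical tag, so you don't have to maintain a list of redirects.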

    Good luck, I hope you sort it out soon.