Old versions of my site on a subdomain — will it hurt?

Discussion in 'Google' started by abercrombie, Jul 17, 2008.

  1. #1
hi y'all, I'm thinking of putting old versions of my site on a subdomain, such as v1.domain.com or v2.domain.com. My site is 9 years old and I think it's kind of neat looking back instead of visiting the Wayback Machine archive site. I've read that Google sees subdomains as separate sites, so would it hurt me for duplicate content? Or should I just put it in a subdirectory:

    domain.com/v1
     
    abercrombie, Jul 17, 2008 IP
  2. hooperman

    hooperman Well-Known Member

    #2
If you have duplicate content anywhere, it's open to getting filtered out of the SERPs. But which version gets filtered? To be safe, why don't you exclude bots from those subdomains or folders using robots.txt?
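A minimal sketch of what that robots.txt advice might look like, assuming the archive lives at v1.domain.com or domain.com/v1/ (both names taken from the opening post). Note that robots.txt is per-host, so a subdomain needs its own file at its own root:

```
# Served at v1.domain.com/robots.txt -- blocks the whole archived subdomain
User-agent: *
Disallow: /
```

For the subdirectory option, the rule instead goes in the main site's robots.txt:

```
# In domain.com/robots.txt -- blocks only the /v1/ archive folder
User-agent: *
Disallow: /v1/
```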
     
    hooperman, Jul 17, 2008 IP
  3. BILZ

    BILZ Peon

    #3
If the new site will have all new text, then you should be fine; in fact, having both might even provide an advantage. If it's all the same text as the old site, then use hooperman's suggestion and restrict the search engines with robots.txt.
     
    BILZ, Jul 17, 2008 IP
  4. egosoccer

    egosoccer Peon

    #4
The question is: what is the new version? Is the text the same, with just a new design?
     
    egosoccer, Jul 17, 2008 IP
  5. kentuckyslone

    kentuckyslone Notable Member

    #5
Yes, it would hurt. Even if you use robots.txt to exclude them, there are some SEs that will not "obey" it. SEs look at a subdomain as a separate site, and if you have duplicate content there, it is likely to harm your SERPs.
     
    kentuckyslone, Jul 17, 2008 IP
  6. hooperman

    hooperman Well-Known Member

    #6
Well, technically it doesn't matter whether the duplicate content exists on the same domain or a different one. It gets filtered out all the same.

It won't harm the rankings of your other pages, and you will still retain one indexed version of the content; the problem is that you can't control which version that is. :)
     
    hooperman, Jul 17, 2008 IP
  7. shabbirbhimani

    shabbirbhimani Well-Known Member

    #7
It's always better to put them in a password-protected directory, or to have a robots.txt file that blocks them.
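For the password-protection route, a minimal sketch assuming the site runs on Apache with .htaccess overrides enabled (the file path here is hypothetical; the password file would be created with the standard htpasswd tool):

```
# .htaccess placed inside the /v1/ archive directory.
# Visitors get a login prompt, so crawlers can't fetch the pages at all
# -- unlike robots.txt, this doesn't rely on bots choosing to obey.
AuthType Basic
AuthName "Site archive"
AuthUserFile /home/user/.htpasswd
Require valid-user
```

This approach is stricter than robots.txt: a disallowed URL can still end up indexed if other sites link to it, while an auth-protected one cannot be crawled at all.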
     
    shabbirbhimani, Jul 17, 2008 IP