Very strange question regarding duplicate content penalties

Discussion in 'Search Engine Optimization' started by david_sakh, Jan 12, 2005.

  1. #1
    Would it be OK to test an up-and-coming website, which will eventually be hosted somewhere else, in a directory of an already existing and unrelated website?

    I mean, suppose Google crawls the page before I move it to the new host — would it be considered duplicate content even after I move it?

    And if so, how do you make a page un-crawlable?
     
    david_sakh, Jan 12, 2005 IP
  2. Smyrl

    Smyrl Tomato Republic Staff

    Messages:
    13,740
    Likes Received:
    1,702
    Best Answers:
    78
    Trophy Points:
    510
    #2
    Alternative 1: To make a page or directory uncrawlable by Google, use a robots.txt file. Do an online search for a robots.txt tutorial.
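    For example, a minimal robots.txt (placed in the site root) might look like the sketch below — the /test-site/ path is just a placeholder for wherever you put the test pages:

    ```
    # robots.txt — blocks all well-behaved crawlers from the test directory
    User-agent: *
    Disallow: /test-site/
    ```

    Note this only asks crawlers not to fetch those URLs; it doesn't remove pages that have already been indexed.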

    Alternative 2: Do a 301 redirect for the pages once you put them online elsewhere. I think robots.txt would be easier.
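    A sketch of the 301 approach, assuming the old host runs Apache with mod_alias enabled (newsite.example is a placeholder for the real destination):

    ```
    # .htaccess on the old host — permanently redirects the test
    # directory to the new location, so crawlers transfer the old
    # URLs to the new ones instead of seeing duplicates
    Redirect 301 /test-site/ http://newsite.example/
    ```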

    Shannon
     
    Smyrl, Jan 12, 2005 IP
    david_sakh likes this.
  3. RVB

    RVB Member

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    36
    #3
    Hi Folks,

    The only way I got Google to stop indexing two websites with the same content was to use 'noindex' in the meta tags. I had tried blocking their bot with a robots.txt file, but it didn't work for some reason.

    All the best for 2005

    RVB Pix

    www.bytephoto.com
     
    RVB, Jan 12, 2005 IP
  4. david_sakh

    david_sakh Peon

    Messages:
    1,225
    Likes Received:
    29
    Best Answers:
    0
    Trophy Points:
    0
    #4
    Thanks guys, I used:

    <meta name="robots" content="noindex,nofollow">
     
    david_sakh, Jan 12, 2005 IP