
Disallow Sub-domain From Robots.txt ???

Discussion in 'robots.txt' started by mlao, May 7, 2011.

  1. #1
    Hello,
    I am using shared hosting and I have 2 domains.

    My primary domain is travelersmeeting.com.
    What I notice is that Google indexes pages from my sub-domain thenewagetraveler.com
    and attributes them to my site travelersmeeting.com, so it looks like I have duplicated content....
    Example:
    normal page: www.thenewagetraveler.com/Thailand-Chiang-Mai..html
    duplicated page: www.travelersmeeting.com/thenewagetraveler/Thailand-Chiang-Mai..html
    I tried to stop this by adding these lines to the robots.txt file located in my primary domain, under the public_html folder:

    Disallow: /thenewagetraveler/

    User-agent: Googlebot
    Noindex: /thenewagetraveler/

    Is that correct?
    I just want Google to index my sub-domain thenewagetraveler.com, but not to crawl and index the same URLs within travelersmeeting.com.
    Will that work, or do I have to do something more?
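    One note on the lines above: robots.txt rules are read per User-agent group, so a Disallow line with no User-agent line above it is ignored by most parsers, and Noindex is not part of the robots.txt standard, so support for it is unreliable at best. Assuming the file sits at the root of travelersmeeting.com, the grouping that crawlers expect looks like:

    User-agent: *
    Disallow: /thenewagetraveler/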
     
    mlao, May 7, 2011 IP
  2. mlao

    mlao Active Member

    #2
    Anyone????
     
    mlao, May 7, 2011 IP
  3. manish.chauhan

    manish.chauhan Well-Known Member

    #3
    Since you are using 2 separate domains, you need to add a robots.txt to your second domain thenewagetraveler.com as well. Just add a robots.txt file on thenewagetraveler.com and use the following directives there:

    User-agent: *
    Disallow: /
     
    manish.chauhan, May 18, 2011 IP
  4. mlao

    mlao Active Member

    #4
    If I do that, then Google will not index
    www.thenewagetraveler.com
    What I want is to separate the 2 domains:
    the primary domain travelersmeeting.com should not have pages from thenewagetraveler.com indexed,
    but thenewagetraveler.com itself should still be indexed....
     
    mlao, May 18, 2011 IP
  5. SSC

    SSC Active Member

    #5
    You have created a CONFUSING thread. First of all, you need to know the meaning of a sub-domain,
    because as per your question you have two separate domains, not a domain with a sub-domain.
    Someone will be able to point you in the right direction if you bother to make your question a little more descriptive without confusing the reader.
     
    SSC, May 18, 2011 IP
  6. manish.chauhan

    manish.chauhan Well-Known Member

    #6
    Now I got your point. Just add the following directives to the robots.txt of travelersmeeting.com:

    User-agent: *
    Disallow: /thenewagetraveler/

    This would stop crawlers from visiting any of the pages under /thenewagetraveler/ on travelersmeeting.com and would leave your thenewagetraveler.com domain intact. Hope this solves your problem.
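    If you want to double-check the rule before relying on it, here is a minimal sketch using Python's standard urllib.robotparser module (assuming Python 3; the second test URL is just a made-up path for illustration). It feeds the two lines above to a parser and reports which URLs a compliant crawler may fetch:

    from urllib.robotparser import RobotFileParser

    # The exact rules suggested for the robots.txt of travelersmeeting.com.
    rules = [
        "User-agent: *",
        "Disallow: /thenewagetraveler/",
    ]

    parser = RobotFileParser()
    parser.parse(rules)

    # The duplicated copy under the primary domain: expected to be blocked.
    print(parser.can_fetch("*", "http://www.travelersmeeting.com/thenewagetraveler/Thailand-Chiang-Mai..html"))  # False

    # Any other page on the primary domain (hypothetical path): stays crawlable.
    print(parser.can_fetch("*", "http://www.travelersmeeting.com/some-other-page.html"))  # True

    Keep in mind that this file only affects travelersmeeting.com; thenewagetraveler.com serves its own robots.txt, so it stays crawlable. Also, Disallow stops crawling but not indexing as such: a blocked URL can still appear in results as a bare link if other pages point to it, so for a guaranteed removal you would also need a removal request in Webmaster Tools.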
     
    manish.chauhan, May 18, 2011 IP
  7. norprowebstore

    norprowebstore Member

    #7
    Unless you are clear what a sub-domain is ...
     
    norprowebstore, May 19, 2011 IP
  8. mlao

    mlao Active Member

    #8
    I'm on shared hosting and I have 2 different domains under the same hosting account: one is the primary domain and the other is an add-on domain that sits in a sub-folder of it!

    For manish.chauhan:
    I have already added this to the robots.txt of travelersmeeting.com:

    User-agent: *
    Disallow: /thenewagetraveler/

    I am just asking if this is going to solve the problem...
    Thank you.
     
    mlao, May 19, 2011 IP
  9. manish.chauhan

    manish.chauhan Well-Known Member

    #9
    Yup, this would work perfectly fine.
     
    manish.chauhan, May 24, 2011 IP