Excluding subdomains

Discussion in 'robots.txt' started by frio, Oct 30, 2007.

  1. #1
    How would I be able to exclude a subdomain from crawling?
    Can I just put this in the robots.txt file:

    disallow: subdomain.domain.com

    I don't think it will work. Can you suggest anything else for this?

    Thanks
     
    frio, Oct 30, 2007 IP
  2. leo

    #2
    What makes you think it wouldn't work?
     
    leo, Oct 30, 2007 IP
  3. frio

    #3
    Would it work? What do you think?
     
    frio, Oct 30, 2007 IP
  4. Kuldeep1952

    #4
    To disallow a sub-domain from crawling, you have to put a robots.txt file
    in the root of the sub-domain.

    To disallow all files, add the following:

    User-agent: *

    Disallow: /
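
    For example, assuming domain.com and subdomain.domain.com (the names from the
    question above) each serve their own document root, the two files could look
    like this:

    # http://domain.com/robots.txt - keep the main site crawlable
    User-agent: *
    Disallow:

    # http://subdomain.domain.com/robots.txt - block the whole sub-domain
    User-agent: *
    Disallow: /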
     
    Kuldeep1952, Oct 30, 2007 IP