help with robots.txt

Discussion in 'robots.txt' started by samib, Jun 15, 2006.

  1. #1
    hi. I have a web hosting directory that consists of 3 domains: the main domain and 2 subdomains. These 3 domains sit with 3 different hosting providers, in the USA, the UK and Canada. The USA one is the main domain and the UK and Canada ones are the subdomains. Each of them has an articles section with identical content, so I need to create a robots.txt that disallows entry to uk.betahosts.com/articles and ca.betahosts.com/articles. Since the subdomains are just normal folders created on the webserver, should it look something like this:

    User-agent: *
    Disallow: /uk/articles/
    Disallow: /ca/articles/
    Disallow: /cgi-bin/

    or will it disallow the whole uk and ca sections instead of just the articles section?
    please advise.. thanks
     
    samib, Jun 15, 2006 IP
  2. Jean-Luc

    #2
    Hi again,

    You need a separate robots.txt file for each hostname, because a robot only reads the robots.txt of the host it is currently crawling:

    http://www.betahosts.com/robots.txt

    User-agent: *
    Disallow: /uk/
    Disallow: /ca/

    http://uk.betahosts.com/robots.txt

    User-agent: *
    Disallow: /articles/

    http://ca.betahosts.com/robots.txt

    User-agent: *
    Disallow: /articles/
    When a robot crawls the main domain, it gets no access to the /uk/ and /ca/ subdirectories. When it crawls one of the subdomains, it gets no access to that subdomain's /articles/ subdirectory.
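    If you want to double-check the rules before uploading them, here is a minimal sketch using Python's standard urllib.robotparser. The hostnames are just the ones from this thread, and the parsed rules mirror the uk.betahosts.com file above:

    from urllib.robotparser import RobotFileParser

    # Rules as they would be served from http://uk.betahosts.com/robots.txt
    rules = [
        "User-agent: *",
        "Disallow: /articles/",
    ]

    rp = RobotFileParser()
    rp.parse(rules)

    # /articles/ is blocked for all robots; the rest of the subdomain is not
    print(rp.can_fetch("*", "http://uk.betahosts.com/articles/some-article.html"))  # False
    print(rp.can_fetch("*", "http://uk.betahosts.com/index.html"))                  # True

    Swapping in the main domain's rules (Disallow: /uk/ and Disallow: /ca/) would show everything under /uk/ and /ca/ blocked, which also answers the original question: Disallow: /uk/ blocks the whole uk folder, not just its articles section.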

    Jean-Luc
     
    Jean-Luc, Jun 16, 2006 IP