Hi. I have a web hosting setup that consists of three domains: the main domain and two subdomains, each on a different hosting provider (USA, UK, and Canada). The USA site is the main domain, and the UK and Canada sites are the subdomains. Each of them has an articles section with identical content, so I need to create a robots.txt that disallows crawling of uk.betahosts.com/articles and ca.betahosts.com/articles. Since the subdomains are just normal folders created on the web server, should it look something like this?

User-agent: *
Disallow: /uk/articles/
Disallow: /ca/articles/
Disallow: /cgi-bin/

Or will that disallow the whole /uk and /ca sections instead of just the articles? Please advise. Thanks.
Hi again,

You need several robots.txt files, one per hostname:

http://www.betahosts.com/robots.txt

User-agent: *
Disallow: /uk/
Disallow: /ca/

http://uk.betahosts.com/robots.txt

User-agent: *
Disallow: /articles/

http://ca.betahosts.com/robots.txt

User-agent: *
Disallow: /articles/

When a robot enters the main domain, it gets no access to the /uk and /ca subdirectories. When a robot enters one of the subdomains, it gets no access to the /articles subdirectory. Crawlers always request robots.txt from the root of the hostname they are crawling, so a single file on www.betahosts.com is never consulted for uk.betahosts.com or ca.betahosts.com. Since your subdomains map to folders on the same server, you can simply place each subdomain's robots.txt inside its folder (e.g. /uk/robots.txt), and it will be served at uk.betahosts.com/robots.txt.

Jean-Luc
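If you want to double-check what these rules actually block before deploying them, Python's standard urllib.robotparser can evaluate robots.txt rules offline. This is just a minimal sanity-check sketch, assuming the three files exactly as written above; the hostnames are the ones from this thread.

from urllib.robotparser import RobotFileParser

# The rules below mirror the three robots.txt files suggested above,
# keyed by the hostname that would serve each file.
RULES = {
    "http://www.betahosts.com": ["User-agent: *", "Disallow: /uk/", "Disallow: /ca/"],
    "http://uk.betahosts.com": ["User-agent: *", "Disallow: /articles/"],
    "http://ca.betahosts.com": ["User-agent: *", "Disallow: /articles/"],
}

for host, lines in RULES.items():
    parser = RobotFileParser()
    parser.parse(lines)  # parse the rules directly, no network fetch needed
    for path in ("/", "/articles/", "/uk/articles/", "/ca/articles/"):
        url = host + path
        print(f"{url:50} allowed={parser.can_fetch('*', url)}")

Running it shows, for example, that the main-domain file blocks everything under /uk/ and /ca/, while each subdomain's file blocks only its /articles/ section and leaves the rest of that subdomain crawlable.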