I am trying to disallow robots from indexing my support center and forum, but I still want them to index my main site. My problem is that they are on subdomains, although the document roots are still /forums and /support. So I want them to index mysite.com, but not:

    forums.mysite.com (mysite.com/forums)
    support.mysite.com (mysite.com/support)

When you create the subdomains, the files go in the folder shown in brackets. Should I put robots.txt at mysite.com/robots.txt and say:

    Disallow: /forums
    Disallow: /support

or put it in the subdomain folder (/forums) and have it say:

    Disallow: /

I don't want to put the wrong thing and end up with my whole site not being indexed. I just want to make sure that by setting it to disallow everything in the subdomain, I don't disallow the main site as well. Thanks in advance.
A robots.txt at mysite.com/robots.txt with those rules will only disallow mysite.com/forums and mysite.com/support. Crawlers request robots.txt separately for each hostname, so that file won't apply to forums.mysite.com or support.mysite.com. On its own it won't fully solve your problem. Anyone, please correct me if I am wrong....
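Because of that per-hostname behavior, each subdomain needs its own robots.txt at its document root. Assuming your host maps forums.mysite.com to the /forums folder as described in the question, a minimal sketch of the file you would place at /forums/robots.txt (which crawlers would fetch as forums.mysite.com/robots.txt):

    User-agent: *
    Disallow: /

The same file would go in /support for support.mysite.com. A crawler visiting mysite.com never reads /forums/robots.txt, so this cannot accidentally block your main site.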
User-agent: *
Disallow: /forums
Disallow: /support

That's all you need in the robots.txt at your main document root. Don't worry, the search engines will index the rest of your website normally. Just keep in mind that this file only applies to mysite.com itself; as noted above, each subdomain hostname is checked for its own robots.txt.
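Putting the two answers together, and assuming the folder-to-subdomain mapping described in the question, you would end up with three files, something like:

    /robots.txt            -> fetched as mysite.com/robots.txt (Disallow: /forums, /support)
    /forums/robots.txt     -> fetched as forums.mysite.com/robots.txt (Disallow: /)
    /support/robots.txt    -> fetched as support.mysite.com/robots.txt (Disallow: /)

Before relying on it, you can check what each hostname actually serves, e.g. with curl forums.mysite.com/robots.txt, to make sure the folder mapping works the way you expect.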