How would I be able to exclude a subdomain from crawling? Can I just put this in the robots.txt file: Disallow: subdomain.domain.com? I think it won't work. Can you suggest anything else for this? Thanks
To exclude a subdomain from crawling, you have to put a robots.txt file at the root of that subdomain (e.g. http://subdomain.domain.com/robots.txt). Crawlers fetch robots.txt per host, so the file on your main domain does not apply to the subdomain. To disallow all files on the subdomain, add the following to its robots.txt:

User-agent: *
Disallow: /
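If you want to double-check the result, you can test both hosts with a small script. Below is a minimal sketch using Python's standard urllib.robotparser; the hostnames are just the placeholder names from the question, so substitute your real domains.

```python
# Sketch: verify that the subdomain's robots.txt blocks crawlers
# while the main domain remains crawlable.
from urllib import robotparser

def is_allowed(robots_url: str, page_url: str, user_agent: str = "*") -> bool:
    """Fetch robots_url and report whether user_agent may fetch page_url."""
    parser = robotparser.RobotFileParser()
    parser.set_url(robots_url)
    parser.read()  # downloads and parses the robots.txt
    return parser.can_fetch(user_agent, page_url)

# Placeholder hosts from the question, not real sites.
print(is_allowed("http://subdomain.domain.com/robots.txt",
                 "http://subdomain.domain.com/some-page.html"))  # expect False (Disallow: /)
print(is_allowed("http://domain.com/robots.txt",
                 "http://domain.com/some-page.html"))            # expect True if not disallowed
```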