Hi all, I have two subdomains under my main domain. I want to block search engine crawlers/spiders from both subdomains, but still allow them to crawl the main domain. What should I do? Please suggest.
Search engines treat each subdomain as a separate site from the main domain, so the main domain's robots.txt does not apply to them. You have to place a separate robots.txt file at the root of each subdomain.
Each subdomain's root directory robots.txt should contain:

User-agent: *
Disallow: /

The main domain's robots.txt only needs a Disallow rule if the subdomain's files are also reachable through a directory path on the main domain (e.g. example.com/subdomain_directory/), in which case add:

User-agent: *
Disallow: /subdomain_directory/
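To make it concrete, here is a minimal sketch assuming blog.example.com and shop.example.com as placeholder subdomain names (lines starting with # are robots.txt comments):

robots.txt at the root of each subdomain (e.g. https://blog.example.com/robots.txt and https://shop.example.com/robots.txt):

    # Block every crawler from the whole subdomain
    User-agent: *
    Disallow: /

robots.txt at the root of the main domain (https://example.com/robots.txt):

    # An empty Disallow value allows crawling of everything
    User-agent: *
    Disallow:

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism, and blocked pages can still appear in results if other sites link to them.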