User-agent: *
Disallow: /
=================
This is what my current robots.txt file looks like. I want to block a subdomain (i.e. http://subdomain.mywebsite.com) using robots.txt; I don't want even a single page of my subdomain to be indexed by Google. I added the text above to prevent my subdomain from being indexed, but it still got indexed by Google. What should I do to entirely prevent my subdomain from being indexed?
Create a second robots.txt for the subdomain. A robots.txt file can only contain instructions for the folders/subfolders of the host it is served from, not for subdomains.
Subdomains are treated as different sites. You can use the same directive (Disallow: /), but make sure it is served at subdomain.mywebsite.com/robots.txt only and not at www.mywebsite.com/robots.txt, or your main site's pages will also get blocked.
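For example, a minimal setup might look like this (the hostnames are the placeholders from your question; each file sits in the document root of its own host):

# Served at http://subdomain.mywebsite.com/robots.txt - blocks everything on the subdomain
User-agent: *
Disallow: /

# Served at http://www.mywebsite.com/robots.txt - an empty Disallow allows everything on the main site
User-agent: *
Disallow: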
After creating a new robots.txt file for your subdomain, do not forget to use the URL removal option in Google Webmaster Tools, as disallowing pages in robots.txt does not remove pages that Google has already indexed and cached. Good luck.
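If you want to double-check that each host is actually serving the right file, here is a quick sketch using Python's standard urllib.robotparser (the hostnames and the sample page path are the placeholders from this thread, so substitute your real URLs):

# Fetch each host's robots.txt and ask whether a sample URL may be crawled.
from urllib.robotparser import RobotFileParser

for host in ["subdomain.mywebsite.com", "www.mywebsite.com"]:
    rp = RobotFileParser()
    rp.set_url("http://%s/robots.txt" % host)
    rp.read()  # downloads and parses the robots.txt for that host
    # can_fetch() returns False when the URL is disallowed for the given agent
    print(host, "allows crawling:", rp.can_fetch("*", "http://%s/somepage.html" % host))

If the files are set up as described above, the subdomain should print False (crawling disallowed) and the main site should print True.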
Do you mean that instead of:
User-agent: *
Disallow: /
I must have:
User-agent: *
Disallow: /http://subdomain.mywebsite.com
Yes, I have the robots.txt file uploaded to the root of my subdomain and not the main domain. I have had it there for more than a month now and still have 23,600 pages indexed in Google. That figure of 23,600 has been showing since the time I uploaded the robots.txt file; the number never seems to go down, and that's what's making me worry.
Could you let me know what your robots.txt looks like? I mean, what rule have you stated in your robots.txt to block the subdomain? Like the one I described in my last post above.