Friends, I want to block a subdomain with robots.txt. Actually, I do not want those pages to appear in search engines, so kindly advise what I should do. Thanks
Simply upload a robots.txt file to the root directory of your subdomain. Search engines normally treat a subdomain as a separate domain while crawling, so you can put a different robots.txt on each subdomain. Write the following and every page will be forbidden for crawling by all search engines:

User-agent: *
Disallow: /

You can replace * with the name of a specific bot (e.g. User-agent: Googlebot) according to your need; * forbids every bot. One caveat: robots.txt only blocks crawling, so an already-indexed URL that other sites link to may still show up in results without a snippet. If you need the pages fully removed from search results, a noindex meta tag on crawlable pages is the more reliable tool. Hope this will help you.
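If you want to double-check the rules after uploading, here is a minimal sketch using Python's standard-library robots.txt parser. Note that blog.example.com and /some-page are placeholders for your actual subdomain and one of its pages:

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the subdomain's live robots.txt
# (blog.example.com is a hypothetical subdomain).
parser = RobotFileParser("https://blog.example.com/robots.txt")
parser.read()

# can_fetch returns False when the rules disallow the URL
# for the given user agent; "*" stands in for any bot.
print(parser.can_fetch("*", "https://blog.example.com/some-page"))
```

With the Disallow: / rule above in place, this should print False.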