How to block any subdomain by Robots.txt

Discussion in 'Search Engine Optimization' started by anandsapple, Dec 14, 2012.

  1. #1
    Friends

    I want to block a subdomain with robots.txt. I do not want those pages to appear in search engines, so please advise what I should do.


    Thanks
     
    anandsapple, Dec 14, 2012 IP
  2. deepak.pal

    #2
    Simply upload a robots.txt file to the root directory of your subdomain. Search engines normally treat a subdomain as a separate domain while crawling, so you can put a different robots.txt on every subdomain. The following directives forbid all search engines from crawling every page:

    User-agent: *
    Disallow: /

    You can replace * with the name of a specific bot according to your need; * forbids every bot. Hope this will help you.
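
    For example, to forbid only Google's crawler (its user-agent token is Googlebot) while leaving all other bots unrestricted, the subdomain's robots.txt could look like this:

    User-agent: Googlebot
    Disallow: /

    User-agent: *
    Disallow:

    An empty Disallow line means nothing is blocked for that group, so every bot other than Googlebot may crawl freely. Note that robots.txt only discourages crawling; a page blocked this way can still appear in results if other sites link to it, so a noindex meta tag is the more reliable way to keep a page out of search listings.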
     
    deepak.pal, Dec 14, 2012 IP
    monfis likes this.