robots.txt: is this a valid way to declare subdomains?

Discussion in 'Programming' started by zomex, Apr 18, 2011.

    Hello everyone,

    Just a quick question, is this a valid way to use robots.txt for subdomains?

    
    User-Agent: *
    Disallow: /example.domain.com/
    Allow: /
    
    sitemap: http://www.domain.com/sitemap.xml
    
    
    Or do I need to create a new robots.txt file for subdomains?
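    (For context: the Robots Exclusion Protocol is resolved per host, so a crawler visiting example.domain.com fetches http://example.domain.com/robots.txt and never consults the file on www.domain.com. A path like /example.domain.com/ in the main file would only match a literal directory of that name. A minimal sketch of a separate file served from the subdomain's root, assuming the subdomain example.domain.com should be fully blocked, might look like:

    ```
    # Served at http://example.domain.com/robots.txt
    # Blocks all compliant crawlers from this subdomain only
    User-agent: *
    Disallow: /
    ```

    The main domain's robots.txt would then keep its own rules and sitemap reference, untouched by the subdomain's file.)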

    I checked Webmaster Tools and it doesn't mention any errors, but it's always best to be sure :)

    Thanks!
    Jack