Hello everyone,

Just a quick question: is this a valid way to use robots.txt for subdomains?

User-Agent: *
Disallow: /example.domain.com/
Allow: /
Sitemap: http://www.domain.com/sitemap.xml

Or do I need to create a separate robots.txt file for each subdomain? I checked Webmaster Tools and it doesn't report any errors, but it's always best to be sure.
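If it turns out each subdomain does need its own file, I'm guessing it would look something like this, served from http://example.domain.com/robots.txt instead of the main domain's file (assuming example.domain.com is the subdomain I want to block entirely; it's just a placeholder name):

User-Agent: *
Disallow: /

Thanks!
Jack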