Quick question: for sub-domains, for example sub.domain.com, do you need to create an additional robots.txt file for that sub-domain, or can you use the root domain's file to grant/restrict access to spiders and such?
Hi REtec, (well-behaved) robots will look for a robots.txt file at the root of each individual host. This means you need a robots.txt in the root of every sub-domain for your rules to take effect there.
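In other words, a crawler visiting sub.domain.com requests sub.domain.com/robots.txt and ignores the file at domain.com/robots.txt. As a rough sketch, if you wanted to keep all well-behaved crawlers out of the sub-domain entirely, the file at the sub-domain's root might look like this (adjust the user-agent and paths for whatever you actually want to restrict):

    User-agent: *
    Disallow: /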
Awesome, I thought that was the case; in Webmaster Tools I was having a heck of a time trying to figure out how to get one txt file to block a sub-domain in their little testing interface... so this makes a lot more sense. Thanks Jabz