Sub-domains and Robots.txt

Discussion in 'robots.txt' started by REtec, Sep 26, 2011.

  1. #1
    Quick question,

    For sub-domains, for example sub.domain.com, do you need to create an additional robots.txt file for that domain or can you use the root file to grant/restrict access to spiders and such?
     
    REtec, Sep 26, 2011 IP
  2. #2
    Hi REtec,

Well-behaved robots will look for the robots.txt file in the root of each individual (sub)domain.
This means you need a robots.txt file in the root of every subdomain for it to be effective.
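For example, to keep crawlers out of a subdomain entirely, a robots.txt served at sub.domain.com/robots.txt (the subdomain name here is just illustrative) could look like this:

```
# Served at http://sub.domain.com/robots.txt
# Applies only to the subdomain it is served from,
# not to domain.com or any other subdomain.
User-agent: *
Disallow: /
```

The file at domain.com/robots.txt has no effect on sub.domain.com; each host needs its own copy.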
     
    jabz.biz, Sep 26, 2011 IP
  3. #3
    Awesome, I thought that was the case. In Webmaster Tools I was having a heck of a time trying to figure out how to get a robots.txt file to block a sub-domain in their little testing interface, so this makes a lot more sense.

    Thanks Jabz
     
    REtec, Sep 29, 2011 IP