block root directory and all subdirectories?

Discussion in 'robots.txt' started by arcademan, Jun 29, 2010.

  1. #1
    I run a site that I simply do not want google to crawl whatsoever, it is very proprietary information for my client's eyes only.

    If I put a robots.txt file that says:

    User-agent: *
    Disallow: /


    in my root web directory will this block google from the root directory AND all subdirectories? Or do I need to put a robots.txt with the same thing in every single subdirectory?

    Thanks!
     
    arcademan, Jun 29, 2010 IP
  2. manish.chauhan (Well-Known Member)
    #2
    Yes, it will block the whole website. You do not need to add any more robots.txt files in subdirectories; a single robots.txt in the root directory covers the root and everything beneath it.
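
    You can check this yourself with Python's standard-library robots.txt parser; `example.com` below is just a placeholder domain. A quick sketch:

    ```python
    from urllib.robotparser import RobotFileParser

    # Parse the exact rules from the original post (no network fetch needed)
    rp = RobotFileParser()
    rp.parse([
        "User-agent: *",
        "Disallow: /",
    ])

    # Both the root and any subdirectory path are disallowed for every crawler
    print(rp.can_fetch("Googlebot", "https://example.com/"))           # False
    print(rp.can_fetch("Googlebot", "https://example.com/sub/dir/x"))  # False
    ```

    One caveat since the content is for your client's eyes only: robots.txt is purely advisory. Well-behaved crawlers like Googlebot honor it, but it does not actually restrict access, so truly private material should also sit behind authentication.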
     
    manish.chauhan, Jun 30, 2010 IP