block root directory and all subdirectories?

Discussion in 'robots.txt' started by arcademan, Jun 29, 2010.

  1. #1
    I run a site that I simply do not want Google to crawl whatsoever; it is very proprietary information for my client's eyes only.

    If I put a robots.txt file that says:

    User-agent: *
    Disallow: /


    in my root web directory, will this block Google from the root directory AND all subdirectories? Or do I need to put a robots.txt with the same thing in every single subdirectory?

    Thanks!
     
    arcademan, Jun 29, 2010 IP
  2. Imozeb
    #2
    Just put the

    User-agent: *
    Disallow: /

    in your root directory and you should be fine.
     
    Imozeb, Jul 6, 2010 IP
  3. manish.chauhan
    #3
    If you don't want your website indexed by any search engine because the data is meant only for your clients, then you can simply use the code

    User-agent: *
    Disallow: /

    It will block your whole website, but blocking a website through robots.txt doesn't guarantee that it won't be indexed. If crawlers find links to your website anywhere else, they may still index those URLs. To avoid this, you can use the following meta tag on all of your website's pages (note that crawlers can only see this tag on pages they are allowed to fetch, so it helps most when the pages are not also blocked in robots.txt):
    <meta name="ROBOTS" content="NOINDEX,NOFOLLOW">

    OR

    You can password-protect your website using .htaccess and share the password with your clients so that they can access the details.
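
    A minimal sketch of that .htaccess approach, assuming an Apache server; the password-file path here is just an example and should point somewhere outside the web root:

    # .htaccess in the site's root directory
    AuthType Basic
    AuthName "Client Area"
    # path is an assumption -- keep the password file outside the web root
    AuthUserFile /home/example/.htpasswd
    Require valid-user

    The matching password file can be created with Apache's htpasswd utility, e.g. htpasswd -c /home/example/.htpasswd clientname. Unlike robots.txt, which well-behaved crawlers merely choose to obey, this actually denies access to anyone without the password.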
     
    manish.chauhan, Jul 7, 2010 IP