I run a site that I simply do not want Google to crawl whatsoever; it is very proprietary information for my client's eyes only. If I put a robots.txt file in my root web directory that says:

    User-agent: *
    Disallow: /

will this block Google from the root directory AND all subdirectories? Or do I need to put a robots.txt with the same contents in every single subdirectory? Thanks!
If you do not want your website indexed by any search engine because the data is meant only for your clients, you can simply use:

    User-agent: *
    Disallow: /

A robots.txt in the root directory applies to the whole site, including every subdirectory, so you do not need a separate copy in each one. However, blocking a website through robots.txt does not guarantee that it will not be indexed: if links to your site appear anywhere else, search engines can still index those URLs even without crawling the pages. To avoid this, you can add the following meta tag to all of your pages (note that crawlers can only see this tag if robots.txt does not block them from fetching the pages):

    <meta name="ROBOTS" content="NOINDEX,NOFOLLOW">

Or you can password-protect your website using .htaccess and share the password with your clients so that they can access the content.
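A minimal sketch of the .htaccess approach on Apache, assuming your server's AllowOverride setting permits auth directives and that /home/example/.htpasswd is just a placeholder path outside your web root:

    # .htaccess in the site's root directory
    # Require a valid username/password for this directory and everything below it
    AuthType Basic
    AuthName "Clients Only"
    AuthUserFile /home/example/.htpasswd
    Require valid-user

To create the password file and add a user, run this once on the server (the username "client1" is hypothetical; the command prompts for a password):

    htpasswd -c /home/example/.htpasswd client1

Unlike robots.txt or the meta tag, which only ask crawlers to stay away, this actually blocks anyone without credentials, crawlers included, from reading the pages.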