I run a site that I simply do not want Google to crawl whatsoever; it contains very proprietary information that is for my client's eyes only. If I put a robots.txt file in my root web directory that says: User-agent: * Disallow: / will that block Google from the root directory AND all subdirectories? Or do I need to put a robots.txt with the same directives in every single subdirectory? Thanks!
Yes, it will block the whole website. Crawlers only look for robots.txt at the site root, so you do not need to add one to any subdirectory; the Disallow: / rule covers the root directory and everything beneath it. One caveat: robots.txt is a voluntary standard. Googlebot honors it, but it does not actually secure anything, since any person or badly behaved bot can still request the pages directly. If the information really is for your client's eyes only, put it behind authentication as well.
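For reference, this is the complete file. It must live at the web root, e.g. https://example.com/robots.txt (example.com standing in for your domain); crawlers will not read a robots.txt placed anywhere else:

    # Applies to all crawlers
    User-agent: *
    # Disallow everything from the root down, including all subdirectories
    Disallow: /

Well-behaved crawlers fetch /robots.txt before crawling a site and apply these rules to every path under it.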