Hi, is there a single command to prevent a crawler from going into any subfolder? The reason I ask is that Google seems to be picking up subfolders that don't even exist on my site, so excluding each folder separately isn't enough; I need a general rule. Thanks
User-agent: *
Disallow: /

This will block all bots that follow the robots.txt rules from crawling anything on your site, including every folder and file under your web root (public_html).
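If you still want the homepage itself to stay crawlable and only want to keep bots out of the subfolders, Googlebot (and some other major crawlers, but not all bots) also understands Allow and the $ end-of-URL anchor, so a sketch like this should work for Google; treat the exact pattern support as crawler-specific:

User-agent: *
Disallow: /
Allow: /$

Here Disallow: / keeps crawlers out of every path, and Allow: /$ carves out just the bare root URL. Also keep in mind that robots.txt only controls crawling, not indexing; if those nonexistent subfolders are already showing up in Google (for example, from external links pointing at them), you may also need to make sure they return a proper 404, since a blocked URL can still appear in results without being crawled.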