Hi, I have many subdirectories in one directory called websites. Can I use only Disallow: /websites/ or must I list every subdirectory in robots.txt, like Disallow: /websites/subdirectory-08 Disallow: /websites/subdirectory-09 Disallow: /websites/subdirectory-06 ............. Thanks
Hi consultsoft. Disallow: /websites/ will block robots from your websites directory, including all files and subdirectories inside it, so you do not need to list each subdirectory separately. You would only list individual subdirectories if you wanted to block just certain files or directories within the websites directory. Keep in mind that every Disallow rule has to sit under a User-agent line, which says which robots the rules apply to: User-agent: * applies them to all robots (although some robots may still choose to ignore the robots.txt file). You might like to visit http://www.robotstxt.org/ for more details on configuring your robots.txt file. I hope that helps.
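To make that concrete, here is a minimal robots.txt sketch, assuming the websites directory sits directly under your site root:

User-agent: *
Disallow: /websites/

That single rule already covers every subdirectory and file under /websites/. If you instead wanted to keep all well-behaved robots out of the entire site, you would use Disallow: / under the same User-agent: * line.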
If you want to disallow robots from your entire websites directory you can use the first option, but if you only want to disallow some specific subdirectories within the websites directory then you can list them individually, as in the second option (see the sketch below). If you want to learn more about the robots.txt file you can visit robotstxt.org; there you will find answers to all your questions.
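For example, a sketch that blocks only some of the subdirectories (using the names from the question as placeholders) could look like:

User-agent: *
Disallow: /websites/subdirectory-08/
Disallow: /websites/subdirectory-09/

Anything under /websites/ that is not listed would remain crawlable.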
If you want to block all subdirectories that share a common prefix, you can use a wildcard pattern such as Disallow: /websites/subdirectory-*. Note that wildcards are not part of the original robots.txt standard, although major crawlers such as Googlebot support them, and a plain Disallow: /websites/ already covers every subdirectory anyway.
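As with the earlier rules, the wildcard line only takes effect inside a User-agent group, so a complete sketch (again assuming websites sits at the site root) would be:

User-agent: *
Disallow: /websites/subdirectory-*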