I want to make a robots.txt that allows /, /a/, and /b/, so that the things in those folders can be crawled and cached. But there are other folders like /c/, /d/, /e/, and many more that should be blocked. Here is what I have:

User-agent: *
Allow: /
Allow: /a/
Allow: /b/
Disallow: /*

Does that one do the trick?
If you want to allow all of your folders in robots.txt, don't list them individually; just write Allow: / and then add a Disallow: /foldername/ line for each folder you don't want crawled.
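For example, assuming the folders you want blocked are /c/, /d/, and /e/ (swap in your real folder names), something like this would do it. The Allow: / line is optional, since anything not disallowed is crawlable by default:

User-agent: *
Allow: /
# one line per folder you want kept out of crawlers
Disallow: /c/
Disallow: /d/
Disallow: /e/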
There are a lot of folders I do not want crawled, more than the ones I do want (10 or so), so the version I wrote is shorter. Will it work?
If you want only particular folders to be accessible to crawlers and the rest of your website blocked, you can use the following code:

User-agent: *
Allow: /a/
Allow: /b/
Disallow: /

Also, it would be better if you could describe exactly what you want, so that someone can give you an exact solution.
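If you want to sanity-check the rules before going live, here is a rough sketch using Python's standard urllib.robotparser module; example.com and the sample paths are placeholders for your own site. One caveat: this parser applies rules in the order they appear in the file, while Google uses longest-match precedence, but for this particular file both readings agree.

from urllib.robotparser import RobotFileParser

# The rules suggested above: only /a/ and /b/ are open to crawlers.
robots_txt = """\
User-agent: *
Allow: /a/
Allow: /b/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The allowed folders come back True...
print(parser.can_fetch("*", "https://example.com/a/page.html"))  # True
print(parser.can_fetch("*", "https://example.com/b/page.html"))  # True
# ...and everything else comes back False, including the homepage.
print(parser.can_fetch("*", "https://example.com/c/page.html"))  # False
print(parser.can_fetch("*", "https://example.com/"))             # False

Note that Disallow: / blocks the homepage as well; if you want the root page itself crawlable, that needs its own Allow rule.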