I'm almost totally lost when it comes to these things, but what do I put in my robots.txt file when I want everything on www.mysite.com crawled, and also everything in www.mysite.com/MyStore crawled, except for this:

User-agent: *
Disallow: /admin/
Disallow: /cont/
Disallow: /scripts/
Disallow: /themes/
Disallow: /web.config

What does my final robots.txt file look like then? Sorry if it's a dumb question, guys, but I want to be sure. Thinking it over for a couple of seconds, it would seem logical to use this robots.txt:

User-agent: *
Disallow: Mystore/admin/
Disallow: Mystore/cont/
Disallow: Mystore/scripts/
Disallow: Mystore/themes/
Disallow: Mystore/web.config

Is this correct?
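One way to sanity-check a robots.txt draft before publishing it is Python's built-in urllib.robotparser. Here is a minimal sketch that feeds the proposed rules to the parser and asks which URLs they block; the hostname and paths are just the ones from the question:

# Test which URLs a robots.txt draft actually blocks,
# using Python's standard-library parser.
from urllib.robotparser import RobotFileParser

proposed = """\
User-agent: *
Disallow: Mystore/admin/
Disallow: Mystore/cont/
Disallow: Mystore/scripts/
Disallow: Mystore/themes/
Disallow: Mystore/web.config
"""

rp = RobotFileParser()
rp.parse(proposed.splitlines())

for url in ("http://www.mysite.com/",
            "http://www.mysite.com/MyStore/",
            "http://www.mysite.com/MyStore/admin/"):
    # can_fetch() reports whether a crawler matching "*" may fetch the URL.
    print(url, "->", "allowed" if rp.can_fetch("*", url) else "blocked")

With this parser, all three URLs come back as allowed: a Disallow value is compared against the URL path from its first character, and every path begins with /, so a rule written without a leading slash never matches anything.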
I think the second one is correct. I'm not sure, but it seems right to me; the first one doesn't look right for some reason.
If you use the second robots.txt file:

User-agent: *
Disallow: Mystore/admin/
Disallow: Mystore/cont/
Disallow: Mystore/scripts/
Disallow: Mystore/themes/
Disallow: Mystore/web.config

then how would it be possible to restrict www.mysite.com/admin and the other folders from search engine robots?
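For the goal described in the question (everything crawlable except those five paths under the store), the Disallow values need a leading slash, since each value is matched against the URL path, which always begins with /. They also need to match the case of the real URLs, because robots.txt path matching is case-sensitive. So, assuming the folder on the live site is actually spelled MyStore, a sketch of the final file would be:

User-agent: *
Disallow: /MyStore/admin/
Disallow: /MyStore/cont/
Disallow: /MyStore/scripts/
Disallow: /MyStore/themes/
Disallow: /MyStore/web.config

Nothing else is needed for the rest of www.mysite.com: any URL not matched by a Disallow line is crawlable by default. In particular, this file does not restrict www.mysite.com/admin at all, only www.mysite.com/MyStore/admin/ and the other listed paths under the store.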