Hi all, I don't want to restrict bots from crawling any pages on my website, so in my robots.txt file I wrote only the code below. Will this affect anything on my website? Please let me know.

User-agent: *
Disallow:
This code means that crawlers will read everything on your website. A few pages, like terms & conditions, may need to be disallowed because they can contain duplicate content.
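As a sketch, a robots.txt that allows everything except a few duplicate-content pages might look like the following (the /terms-and-conditions/ and /login/ paths are just example URLs; substitute your site's actual paths):

```
User-agent: *
Disallow: /terms-and-conditions/
Disallow: /login/
```

An empty Disallow line (or no Disallow rules at all) means nothing is blocked, so only the paths you explicitly list are kept out of crawling.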
Your robots.txt file can be changed to the equivalent form below. Please make this change; it will help you.

User-agent: *
Allow: /
I completely agree with Pavan. Terms-and-conditions and login pages are among the few pages that should be disallowed from crawling; leaving them crawlable can affect your site to some extent.