How would I know if robots are excluding my folders from crawling? I've already created a robots.txt in which I disallow the folders that I don't want crawled.
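For reference, a minimal robots.txt that blocks a single folder generally looks like the lines below (/folder/ stands in for whichever directory you want excluded):

    User-agent: *
    Disallow: /folder/

The trailing slash matters: Disallow: /folder without it would also match files such as /folder.html.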
The easiest way to tell is to do a search for www.domainname/folder and see if any results come up. If the folder is listed in your robots.txt file, then chances are that part of your site is not indexed.
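If you'd rather test the rule itself than rely on search results, here's a quick sketch using Python's standard-library urllib.robotparser; the domain and folder are placeholders for your own:

    from urllib.robotparser import RobotFileParser

    # Fetch and parse the live robots.txt (placeholder URL).
    rp = RobotFileParser()
    rp.set_url("https://www.domain.com/robots.txt")
    rp.read()

    # False means the rule blocks this path for that user agent.
    print(rp.can_fetch("*", "https://www.domain.com/folder/"))

Keep in mind this only confirms the rule parses the way you intended; it doesn't tell you whether search engines have actually dropped the pages.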
Thank you guys. I typed www.domain.com/folder into a Google search and I can't seem to find the www.domain.com/folder I'm looking for. But when I typed site:www.domain.com/folder there are results... does this mean the robot exclusion I've made is not working?
What that means is that your links are pointing to www. instead of /. Also keep in mind that robots.txt blocks crawling, not indexing: a URL that other pages link to can still show up in site: results even though its content was never fetched. What does your robots.txt file look like?