Hello. I have a problem regarding bots. I have a robots.txt file added to my site, but the bots are only viewing half my site and not the other half. They are viewing the homepage etc. but not the shop. So I added a Google sitemap, and still the same thing is happening. The bots I have visiting are msnbot, Googlebot and Yahoo! Slurp; there are probably some more, but these are the main ones. The maps seem to be OK. Can anyone shine a light on why they're only visiting half of my site, please?
Hi,

Possible reasons why bots do not visit all the pages of your site:

- your robots.txt disallows access to some parts of your site;
- you use HTML tags (such as a robots meta tag) that disallow access to some parts of your site;
- some parts of your site are not crawlable (only reachable through a form or through JavaScript links, for example);
- there are many similar pages on your site;
- your site in general is not considered important enough by search engines to be completely indexed;
- ...

As you see, there are many very different potential reasons.

Jean-Luc
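To illustrate the first reason above: a single stray `Disallow` line in robots.txt is enough to hide a whole section of a site from every major crawler. This is a hypothetical example, assuming the shop lives under a `/shop/` path (your actual paths may differ):

```
# robots.txt at the site root
User-agent: *
Disallow: /shop/     # this one line hides the entire shop from all bots
```

With rules like these, Googlebot, msnbot and Slurp would all crawl the homepage but skip everything under /shop/, which matches the symptom described. Worth checking your own robots.txt for anything similar.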
The list is even longer than that. If, for example, you are using a well-known shopping cart program, the bots may not go very deep into that portion of the website (session IDs and long query strings in cart URLs often stop them). Submitting a sitemap.xml file, however, should take out most of those reasons, for example the JavaScript links, or even low ranking. So something else could be wrong with your website.
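One quick way to test the robots.txt theory yourself is Python's standard-library `urllib.robotparser`, which applies the same rules the bots do. A minimal sketch, assuming a hypothetical `example.com` site whose robots.txt blocks `/shop/` (substitute your own domain and paths):

```python
import urllib.robotparser

# Hypothetical robots.txt content; in practice you would point
# set_url() at https://yoursite.com/robots.txt and call read().
rules = """\
User-agent: *
Disallow: /shop/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot matches the "*" group here, so the shop is off-limits:
print(rp.can_fetch("Googlebot", "https://example.com/"))            # True
print(rp.can_fetch("Googlebot", "https://example.com/shop/item1"))  # False
```

If the second call prints False for your real shop URLs, the robots.txt is the culprit; if both print True, the problem lies elsewhere on the list.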