I've just realised, by logging into Google Sitemaps and looking into "URLs restricted by robots.txt", that some of my website pages are marked by Google Sitemaps as "URL restricted by robots.txt". How can this be, when my robots.txt file looks like this:

# All robots will spider the domain
User-agent: *
Disallow:

# Disallow directory /phpads/
User-agent: *
Disallow: /phpads/

I specifically do not block anything - how come Google Sitemaps is not reading it that way?!
BIG oops - I just realised I posted this in the wrong forum - can some mod please move this to the Google Sitemaps forum? Many thanks - and sorry! I'm lacking caffeine today! :-//
Hi, Your robots.txt file is not correct. You cannot have a set of lines saying "I do not disallow anything" and then a set of lines saying "I disallow /phpads/". I would remove the first set of rules. Jean-Luc
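To illustrate Jean-Luc's suggestion: with the first (empty) rule set removed, the whole file would be reduced to a single group for all robots, something like this (assuming /phpads/ is the only directory you want to keep crawlers out of):

```
# Disallow the /phpads/ directory for all robots;
# everything else is crawlable by default
User-agent: *
Disallow: /phpads/
```

An empty "Disallow:" line already means "allow everything", so the first group added nothing and only created two conflicting "User-agent: *" groups for crawlers to choose between.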
You're right - the robots.txt file is contradicting itself. I'll fix it right away! Thanks Jean-Luc