Hi, I have been adding sitemaps through Google's sitemap section for a while now, but recently they have been returning errors. Here is an example robots.txt: http://fajja.co.uk/robots.txt. It is clearly accessible, so why can't Google see it? I fear I have done something wrong, because the robots.txt files on all of my sites (same hosting) now show ERRORs.
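One quick way to double-check that the file is genuinely reachable the way a crawler would request it is to fetch it yourself and look at the HTTP status. This is only a rough sketch in Python; the Googlebot-style User-Agent string is an assumption, included because some hosting security add-ons treat bot-like agents differently from browsers.

    # Fetch the robots.txt roughly the way a crawler might, then print the
    # HTTP status plus the start of the body. The bot-style User-Agent is an
    # assumption; swap in a plain browser string to compare responses.
    import urllib.request

    URL = "http://fajja.co.uk/robots.txt"
    req = urllib.request.Request(
        URL,
        headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.status, resp.reason)                           # e.g. 200 OK if reachable
        print(resp.read(500).decode("utf-8", errors="replace"))   # first few rules

If the status changes depending on the User-Agent you send, the problem is likely the host blocking crawlers rather than the file itself.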
Hey Fajja,

Try using the following:

    Disallow:

Yes, that's right: "Disallow:" with nothing after it tells the web spiders to look at everything. The line in your current robots.txt,

    Allow: /

tells all the web spiders, certainly Google, to only index the root, in this case http://fajja.co.uk/, and to ignore every other page on your site, dropping them from its index. You have about 335 pages in Google right now, so I think you should change this as soon as possible: if the spiders don't see your other pages, they will assume those pages don't exist and slowly but surely remove them from their respective indexes (there is a small sketch after this post if you want to test how a parser reads each version).

Check out the following for more info:
* Web Robots Pages
* "Allow" Directive in Robots.txt

Also, I noticed your site is a WordPress blog; have you taken care of WordPress security?

So make the change:
1. Allow -> Disallow
2. / -> nothing

and your sites should work fine. Let me know if it worked for you,
PD
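If you want to sanity-check how a standard parser reads the current rules versus the suggested ones before editing the live file, here is a rough sketch using Python's urllib.robotparser. The "User-agent: *" line and the test paths are assumptions made purely for illustration, not taken from the real file.

    # Compare how a robots.txt parser answers can_fetch() for the current
    # rules ("Allow: /") and the suggested rules ("Disallow:" left empty).
    # The paths below are invented examples, not real pages on the site.
    from urllib.robotparser import RobotFileParser

    def check(rules, label):
        rp = RobotFileParser()
        rp.parse(rules)
        for path in ("/", "/2011/03/some-post/", "/wp-admin/"):
            url = "http://fajja.co.uk" + path
            print(label, "- Googlebot may fetch", url, "->", rp.can_fetch("Googlebot", url))

    check(["User-agent: *", "Allow: /"], "current")
    check(["User-agent: *", "Disallow:"], "suggested")

Whatever the parser reports for the current file, the suggested one (a single empty Disallow under User-agent: *) is the conventional way to say "crawl everything", so it is a safe change to make.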
Yeah, it was a problem with a new security add-on at my company's hosting. All sorted now, so hopefully Google will come crawling back!