Hi, I have my robots.txt set up so that it won't spider a folder on my server called dev (Disallow: /dev/). I have a page called dev.html in my root folder which I want to be spidered by Google etc., but Google Webmaster reports that it is restricted by robots.txt. Does anyone understand this, or know how to fix it? Many thanks. Shaun
It's at http://shacow.com/robots.txt. But yes, it looks like I have done it wrong. It currently reads:

    User-Agent: *
    Disallow: /dev
    Disallow: /whois
    Allow: /

What's the right way to write it so the Google bot will not crawl anything contained in the dev folder, but will allow the dev.html file? Thanks, Shaun
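For what it's worth, the likely culprit is the missing trailing slash: robots.txt rules are prefix matches, so Disallow: /dev blocks every URL whose path begins with /dev, which includes /dev.html. A minimal sketch of a corrected file, assuming you only want the /dev/ and /whois/ directories blocked:

    User-Agent: *
    Disallow: /dev/
    Disallow: /whois/

With the trailing slash, only URLs inside those folders match, so /dev.html stays crawlable. The Allow: / line is redundant (anything not disallowed is allowed by default) and can be dropped.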
Why don't you consult Google Webmaster Central? http://www.google.com/webmasters/ You can set up everything you wish there.