Hi, in 'Web crawl errors' under 'Unreachable URLs' I get the following message: 'URL unreachable: /robots.txt unreachable', and as a result Google has postponed its crawl of my site. My robots.txt file is in the root directory and named correctly. Does anyone know why Google reports it as 'unreachable' rather than 'not found', and how I can fix it? Any help appreciated. argent
Yeah, Google can help you validate it; go to the Sitemaps area. Can you see it in your browser when you type www.sitename.com/robots.txt?
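Seeing the file render in a browser doesn't always tell the whole story, since the message is about the HTTP response Google got, not the file's contents. A minimal sketch in Python for checking the actual status code, assuming a placeholder domain (swap in your own):

import urllib.request
import urllib.error

# Placeholder URL; replace with your own site's robots.txt
url = "http://www.sitename.com/robots.txt"

try:
    with urllib.request.urlopen(url, timeout=10) as resp:
        # A healthy robots.txt should come back as 200
        print(resp.status, resp.reason)
except urllib.error.HTTPError as e:
    # 4xx/5xx responses land here; a 5xx matches "unreachable"
    print(e.code, e.reason)

Anything other than a 200 here is worth chasing down with your host.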
I see this error as well. In Google Sitemaps it says "5xx error: robots.txt unreachable", yet I can access robots.txt in my browser, it's in the root directory, and the file has 644 permissions. Anyone know what could be up?
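One possibility when the browser gets the file but Google reports a 5xx: some servers or firewall rules respond differently depending on the User-Agent, so a request identifying itself as Googlebot can fail while yours succeeds. A sketch that compares the two, again with a placeholder domain (this is a diagnostic guess, not a confirmed cause):

import urllib.request
import urllib.error

URL = "http://www.sitename.com/robots.txt"  # placeholder; use your domain

def status_for(user_agent):
    """Fetch robots.txt with the given User-Agent and return the HTTP status."""
    req = urllib.request.Request(URL, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

print("browser-like UA:", status_for("Mozilla/5.0"))
print("Googlebot UA:", status_for(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))

If the Googlebot request comes back 5xx while the browser-like one is 200, the problem is server-side bot filtering, not the file itself.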