I searched the forum but could not find an answer to my question, and in other forums all I get is a bunch of crap that does not answer it.

The problem: my site was running fine, then about two weeks ago every page of my site started saying "robots.txt unreachable." My sitemap is loaded and active in Webmaster Tools. Since I want all my pages crawled, I created a robots.txt file as follows:

User-agent: *
Allow: /

I have nothing after the "/" symbol. Do I need to list each page of my site after the "/"? I saved the robots.txt file and uploaded it to my main directory in Go Daddy. Some of the pages cleared, but my main (index) page still says "robots.txt unreachable." What am I doing wrong?

I did see somewhere on here that this is the correct version:

User-agent: *
Disallow:

(with nothing after "Disallow:"), but Google automatically gave me that code in my download.

Also, why did this just start happening when my site has been indexed for over four years?
This may have happened because Google made changes to Webmaster Tools. Resubmit your sitemap in Webmaster Tools and check a few hours later to see whether the error has cleared.
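To answer the other part: as far as I know you do not need to list each page after the "/". Both versions you posted (Allow: / and an empty Disallow:) tell Google it may crawl everything, so the lines inside the file are probably not the problem. "Unreachable" usually means Googlebot could not fetch the file at all (server error, timeout, DNS trouble), not that the rules in it are wrong.

A quick way to check is to request the file directly and look at the status code. Rough sketch, with example.com standing in for your actual domain:

curl -I http://www.example.com/robots.txt

If that comes back with 200 OK, the file is reachable on the Go Daddy side and the error should clear once Google re-fetches it; if you get a 5xx error or a timeout, that is what Google is tripping on, and Go Daddy support would be the place to ask.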