I searched this forum but could not find an answer to my question, and in other forums all I get is a bunch of crap that does not answer it.

The problem: I had a site running fine, then about two weeks ago every page of my site started saying "robots.txt unreachable." My sitemap is loaded and active in Webmaster Tools.

Since I want all my pages crawled, I created a robots.txt file with exactly this content (nothing after the "/"):

User-agent: *
Allow: /

I saved the robots.txt file and uploaded it to my main directory on GoDaddy. Some of the pages have since cleared, but my main (index) page still says "robots.txt unreachable."

What am I doing wrong???
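Edit: in case it helps anyone answering, here is the quick reachability check I can run from my end. It is just a sketch in Python, and example.com is a placeholder for my actual domain:

import urllib.request

# "example.com" stands in for my real domain; adjust http/https to match the site.
url = "https://example.com/robots.txt"

# Fetch the file the same way a crawler would and show what comes back.
with urllib.request.urlopen(url, timeout=10) as resp:
    print(resp.status)            # expect 200 if the file is reachable
    print(resp.read().decode())   # should print the two directives above

If this prints 200 and the file contents, the file itself is reachable and the problem is likely on the crawler's side; anything else (404, timeout, redirect to an error page) would point at the upload or the hosting setup.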