Robots.txt Kicking My A$$ Please Help!

Discussion in 'robots.txt' started by rjd1265, Feb 2, 2012.

  1. #1
    I searched the forum but could not find an answer to my question... and in other forums all I get is a bunch of crap that does not answer it.

    Problem.
    I had a site running fine, then about 2 weeks ago all the pages of my site started saying "robots.txt unreachable".
    My sitemap is loaded and active in Webmaster Tools.

    I created a robots.txt file as follows, since I want all my pages crawled:
    User-agent: *
    Allow: /

    I have nothing after the "/" symbol. Do I need to list each page of my site after the /?

    I saved the robots.txt file and uploaded it to my main directory on Go Daddy.

    Some of the pages cleared, but my main (index) page still says "robots.txt unreachable".

    What am I doing wrong???

    I did see somewhere on here that this is correct:
    User-agent: *
    Disallow: (no "/" after)
    ...but Google automatically gave me that code in my download.
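    [Editor's note: both variants quoted above allow everything, so neither needs a per-page list. A minimal sketch with Python's standard `urllib.robotparser` that checks this; `example.com` is a placeholder URL:]

```python
# Sketch: confirm that both robots.txt variants discussed above
# permit crawling of any URL, using Python's standard library.
from urllib.robotparser import RobotFileParser

def allows_everything(lines):
    """Return True if the given robots.txt lines permit any URL for any bot."""
    rp = RobotFileParser()
    rp.parse(lines)
    return rp.can_fetch("*", "http://example.com/any/page.html")

# The "Allow: /" form uploaded to the site:
print(allows_everything(["User-agent: *", "Allow: /"]))    # True

# The "Disallow:" form (nothing after it) that Google hands out:
print(allows_everything(["User-agent: *", "Disallow:"]))   # True
```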

    Also, why did this just happen when my site has been indexed for over 4 years?
     
    Last edited: Feb 2, 2012
    rjd1265, Feb 2, 2012 IP
  2. kar76 (Greenhorn)
     #2
    This may have happened because Google made changes to Webmaster Tools. Resubmit your sitemap in Webmaster Tools and check after a few hours to see whether the error has cleared.
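    [Editor's note: if resubmitting doesn't help, it's worth confirming that the robots.txt file is reachable over plain HTTP at all, since that is what the "unreachable" error is about. A minimal sketch with Python's standard library; the site URL below is a placeholder:]

```python
# Sketch: check whether a site's /robots.txt responds over HTTP.
# Substitute the real site URL for the placeholder.
from urllib.request import urlopen
from urllib.error import URLError

def robots_reachable(site, timeout=10):
    """Return (ok, detail) for a GET of {site}/robots.txt."""
    try:
        with urlopen(site.rstrip("/") + "/robots.txt", timeout=timeout) as resp:
            return resp.status == 200, "HTTP %d" % resp.status
    except URLError as exc:
        # Covers DNS failures, timeouts, and HTTP error statuses.
        return False, str(exc.reason)

print(robots_reachable("http://example.com"))
```

    A `(False, ...)` result here means the problem is on the hosting side (Go Daddy), not in the file's contents.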
     
    kar76, Feb 5, 2012 IP
  3. tiffanywilliams12i2 (Peon)
     #3
    Do you have an email address? We can talk more about this.
     
    tiffanywilliams12i2, Apr 10, 2012 IP