Keep getting robots.txt error in Google Webmaster Tools for sitemap

Discussion in 'Google Sitemaps' started by domainer_10, Jan 11, 2009.

  1. #1
    Three times in a row I'm getting an error when submitting my sitemap:


    "Network unreachable: robots.txt unreachable
    We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely."



    At first I didn't even have a robots.txt file and got the error, so I decided to add one to see what happens, but I'm still getting the error. Thoughts?
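    For reference, a bare-bones robots.txt that allows everything and points Google at the sitemap looks roughly like this (example.com stands in for the real domain):

        User-agent: *
        Disallow:

        Sitemap: http://www.example.com/sitemap.xml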
     
    domainer_10, Jan 11, 2009 IP
  2. Tropicsforme

    #2
    Just delete the robots.txt file from your site. This file is used to exclude pages from indexing spiders. Remove it and you should be fine.
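    For example, a robots.txt that keeps spiders out of a couple of folders but leaves the rest of the site open looks something like this (the folder names are just examples):

        User-agent: *
        Disallow: /cgi-bin/
        Disallow: /private/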
     
    Tropicsforme, Jan 13, 2009 IP
  3. domainer_10

    #3
    Thanks, but it's working now. For some reason when I checked the next morning it worked.
     
    domainer_10, Jan 13, 2009 IP
  4. sampathsl

    #4
    The robots.txt file is essential for excluding addon domains from the main domain. Is there any solution without removing the robots.txt file?
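    The sort of file in question just disallows the addon-domain folders while leaving the rest of the main domain crawlable, along these lines (the folder names are placeholders for the actual addon folders):

        User-agent: *
        Disallow: /addonsite1/
        Disallow: /addonsite2/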
     
    sampathsl, Jan 13, 2009 IP