Three times in a row I'm getting an error when submitting my sitemap: "Network unreachable: robots.txt unreachable. We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely." At first I didn't even have a robots.txt file and still got the error, so I decided to add one to see what happens, but I'm still getting the error. Thoughts?
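One quick way to see what Google is hitting is to fetch the robots.txt yourself and look at the response. Here's a minimal Python sketch, assuming your site root is at example.com (a placeholder, swap in your own domain):

```python
import urllib.request
import urllib.error

# Placeholder URL; replace with your actual site root (assumption).
ROBOTS_URL = "https://example.com/robots.txt"

def check_robots(url: str) -> None:
    """Fetch robots.txt the way a crawler would and report what happens."""
    request = urllib.request.Request(
        url,
        # Some servers reject requests without a common User-Agent header.
        headers={"User-Agent": "Mozilla/5.0 (robots.txt reachability check)"},
    )
    try:
        with urllib.request.urlopen(request, timeout=10) as response:
            body = response.read().decode("utf-8", errors="replace")
            print(f"HTTP {response.status}: robots.txt downloaded OK")
            print(body)
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses are the kind of thing that shows up as "unreachable".
        print(f"HTTP error {err.code}: {err.reason}")
    except urllib.error.URLError as err:
        # DNS failures, timeouts, and firewall blocks end up here.
        print(f"Network error: {err.reason}")

if __name__ == "__main__":
    check_robots(ROBOTS_URL)
```

If this script gets a timeout or an error status, that's likely the same thing Google's crawler is seeing when it tries to download the file.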
Just delete the robots.txt file from your site. That file is only used to tell indexing spiders which pages to exclude. Remove it and you should be fine.
The robots.txt file is essential for excluding addon domains from the main domain. Is there any solution that doesn't involve removing the robots.txt file?
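If the goal is just to keep addon-domain folders out of the main domain's index, the robots.txt itself can stay, as long as it downloads cleanly. A rough sketch, using Python's built-in parser and hypothetical folder names (addon-site-one, addon-site-two) to show that blocking those folders doesn't block the rest of the site or the sitemap:

```python
import urllib.robotparser

# Hypothetical rules: block a couple of addon-domain folders under the main
# docroot while leaving everything else, including the sitemap, crawlable.
ROBOTS_TXT = """\
User-agent: *
Disallow: /addon-site-one/
Disallow: /addon-site-two/

Sitemap: https://example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The addon-domain folders are excluded...
print(parser.can_fetch("Googlebot", "https://example.com/addon-site-one/page.html"))  # False
# ...but the main site's pages and the sitemap stay fetchable.
print(parser.can_fetch("Googlebot", "https://example.com/sitemap.xml"))  # True
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))   # True
```

The point is that "robots.txt unreachable" is about Google being unable to download the file at all (server errors, timeouts, blocks), not about what the rules inside it say, so rules like these shouldn't by themselves cause that error.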