I uploaded a robots.txt file and a sitemap to my sites. I used this robots.txt:

User-agent: *
Disallow: /images/
Disallow: /App_Themes/
Disallow: /bin/
Disallow: /CommImages/
Disallow: /ProImages/

User-agent: Googlebot-Image
Disallow: /*.gif$
Disallow: /*.jpeg$
Disallow: /*.jpg$
Disallow: /*.bmp$

I get this error in Google Webmaster Tools when I submit the sitemap:

Network unreachable: robots.txt unreachable
We were unable to crawl your Sitemap because we found a robots.txt file at the root of your site but were unable to download it. Please ensure that it is accessible or remove it completely.

I removed the robots.txt file and I still get this error. Any ideas?
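One quick way to see what Google is running into is to fetch the file yourself and look at the HTTP status. Here's a rough sketch in Python; the domain is a placeholder, so swap in your own site:

import urllib.request
import urllib.error

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder, use your domain

try:
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        print("HTTP status:", resp.status)          # you want to see 200 here
        print(resp.read().decode("utf-8", "replace"))
except urllib.error.HTTPError as e:
    # 403/404 usually points to a permissions or path problem on the server
    print("Server returned an error:", e.code)
except urllib.error.URLError as e:
    # failures like DNS errors, firewalls, or the host blocking bots
    print("Could not reach the server:", e.reason)

If this prints 200 and the file contents but Google still says "unreachable", the host may be blocking Googlebot's requests specifically (firewall or bot filtering), which is worth asking your hosting company about.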
I'm a noob so I'll try to explain what I did LOL. It's a basic HTML site, so I just FTP'd it into the folder with the index.html, pics, etc. As for permissions, I didn't do any of that; I just copied the robots.txt from a thread on DP, saved it in Notepad, and FTP'd it to the folder with the website files in it.
What do you mean by "basic html"? If you saved the file as robots.txt and uploaded it to the root folder, it should be working. To check the permissions, in your FTP client you should be able to right-click the uploaded file and set them to 644.
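If your FTP client doesn't have a permissions dialog, here's a minimal sketch doing the same thing with Python's ftplib. The host, login, and filename are placeholders, and SITE CHMOD only works if the server supports it (most Unix-based hosts do):

from ftplib import FTP

HOST = "ftp.example.com"   # placeholder
USER = "username"          # placeholder
PASSWORD = "password"      # placeholder

ftp = FTP(HOST)
ftp.login(USER, PASSWORD)

# Confirm robots.txt actually landed in the root folder
ftp.retrlines("LIST robots.txt")

# Make the file world-readable (644); not every server accepts SITE CHMOD
print(ftp.sendcmd("SITE CHMOD 644 robots.txt"))

ftp.quit()

644 just means the owner can read/write and everyone else (including the web server) can read it, which is what Googlebot needs to download the file.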