So I submitted a sitemap to be crawled by Google, but when it tried crawling it I got this error: "We can't currently access your home page because of a robots.txt restriction." I don't think my robots.txt is set up wrong... or perhaps it is? Am I doing anything wrong as far as restricting the Googlebot goes?

User-agent: *
# disallow all files in these directories
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /stats/
Disallow: /styles/
Disallow: /*.css$
Disallow: /*.ini$
Disallow: /*.js$
Disallow: /*.wmv$
Disallow: /*.png$
Disallow: /*.gif$
Disallow: /*.jpg$
Disallow: /*.cgi$

# allow google image bot to search all images
User-agent: Googlebot-Image
Allow: /*

# disallow archiving site
User-agent: ia_archiver
Disallow: /

# disable duggmirror
User-agent: duggmirror
Disallow: /
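If it helps, here is a rough way to sanity-check whether those rules block the home page for Googlebot, using Python's standard-library robots.txt parser. Two caveats: the stdlib parser does not implement Google's wildcard extensions (the /*.css$ style lines are treated as literal path prefixes), and example.com is just a placeholder domain, so treat this as an approximation rather than exactly what Google does.

from urllib.robotparser import RobotFileParser

# The posted rules, minus the wildcard lines the stdlib parser can't
# interpret the way Google does.
ROBOTS_TXT = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
Disallow: /stats/
Disallow: /styles/

User-agent: Googlebot-Image
Allow: /

User-agent: ia_archiver
Disallow: /

User-agent: duggmirror
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# "Googlebot" falls into the "User-agent: *" group above.
for url in ("http://example.com/", "http://example.com/images/logo.png"):
    print(url, parser.can_fetch("Googlebot", url))
# Expected: the home page "/" comes back allowed, /images/... disallowed.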
I suggest you try putting only this in the robots.txt file:

User-agent: *
Disallow:

and then see if you still get the error or not.
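After uploading the stripped-down file, you can re-check it the same way with the stdlib parser, this time reading the live file (www.example.com is again just a placeholder for your domain):

from urllib.robotparser import RobotFileParser

parser = RobotFileParser("http://www.example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt
print(parser.can_fetch("Googlebot", "http://www.example.com/"))  # should print True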