I have recently created a new site, which seems to be taking longer than usual to get indexed by Google. I created and submitted a sitemap, but Google Webmaster Tools is showing an error saying a robots.txt file was found but couldn't be downloaded. I don't have a robots.txt file, and when I checked the robots.txt analysis tool in Webmaster Tools it returns "Googlebot allowed". Has anyone else ever had a problem like this? Thanks, Alan
No problem here - I updated my sitemap today as well and it was verified within an hour. How did you submit the sitemap, and is the format correct?
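If you want to double-check the format, a minimal sitemap following the sitemaps.org protocol looks something like this (the URL and date are just placeholders, not your real pages):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2008-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>

Only the loc element is required for each URL; lastmod, changefreq and priority are optional.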
http://xmlsitemap.com/create-sitemap/ - use that link to create free XML sitemaps. As for the robots.txt question, it might have been a glitch. Get some high-quality backlinks and you will get indexed.
Thanks, I will try that sitemap generator - I was using a different site to generate mine. I do have some good backlinks, I've Dugg it etc., and Yahoo has indexed it with no problems, but Google seems to be delaying. I will try out that new sitemap. I have also now created a robots.txt file allowing all bots, but I will need to wait until tomorrow before it all gets checked again by Google. Thanks a lot.
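In case it helps anyone else, the robots.txt I created is just the standard allow-everything file, i.e. something like:

User-agent: *
Disallow:

The empty Disallow line means nothing is blocked, so every crawler (including Googlebot) is allowed to fetch the whole site.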
OK... problem again. Everything seemed to go fine at first - Google checked the sitemap and reported it as all OK. But now, after Google has tried to crawl my site, it has returned "robots.txt unreachable" again, and as a result the pages won't be added to the index. The robots.txt file is reachable if I type the address into the address bar, and I have checked the header and it returns HTTP/1.1 200 OK. Any advice from the Digital Point gurus?
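If anyone wants to reproduce the header check, here is a quick Python 3 sketch that fetches robots.txt and prints the status line, the response headers and the file body (the domain is just a placeholder, not my actual site):

import urllib.request

# Placeholder domain - swap in your own site before running.
url = "http://www.example.com/robots.txt"

# Send a browser-like User-Agent so the server handles the request normally.
req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})

with urllib.request.urlopen(req) as resp:
    # Status line, e.g. "200 OK"
    print(resp.status, resp.reason)
    # Full response headers
    for name, value in resp.getheaders():
        print(name + ": " + value)
    # The robots.txt body itself
    print(resp.read().decode("utf-8", errors="replace"))

If that prints 200 OK from an outside connection but Google still reports robots.txt as unreachable, the server may be blocking or timing out on Googlebot specifically, which is worth checking with your host.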