Hello, I'm new to this forum and to robots.txt. I placed a robots.txt file in the root directory of my site, but when I log in to my Google Sitemaps account, it reports errors in my robots.txt file:

    The specified CGI application misbehaved by not returning a complete set of HTTP headers. The headers it did return are:
    syntax error at D:\hshome\damien66\syfitness.com\robots.txt line 5, near "agent:"
    Search pattern not terminated at D:\hshome\damien66\syfitness.com\robots.txt line 10.

My robots.txt file is exactly this:

    # Robots.txt file created by http://www.webtoolcentral.com
    # For domain: http://syfitness.com

    # All robots will spider the domain
    User-agent:*
    Disallow:

    # Disallow directory /cgi-bin/
    User-agent:*
    Disallow: /cgi-bin

Can someone please help? I don't think Google is crawling my site.
I believe you have two sets of rules that contradict each other: the first group says all robots may spider all content, and the second says all robots should ignore /cgi-bin/. Since both groups address the same user agent (*), I would suggest you remove the first group and leave only the second, i.e.:

    # Robots.txt file created by http://www.webtoolcentral.com
    # For domain: http://syfitness.com

    # Disallow directory /cgi-bin/
    User-agent: *
    Disallow: /cgi-bin

(Note the space after "User-agent:" — that's the conventional form.) Now your robots.txt will allow all spiders to index everything except the /cgi-bin/ directory. Hope this helps!
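As a sanity check, here is one way to confirm the trimmed file behaves as intended, using Python's standard urllib.robotparser module. The domain and paths are just the ones from this thread; the robots.txt contents are pasted into a string rather than fetched from the server:

```python
# Sketch: verifying a robots.txt locally with Python's stdlib parser.
from urllib.robotparser import RobotFileParser

# The suggested robots.txt, as a string (normally fetched from the site root).
robots_txt = """\
# Robots.txt file created by http://www.webtoolcentral.com
# For domain: http://syfitness.com

# Disallow directory /cgi-bin/
User-agent: *
Disallow: /cgi-bin
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A normal page should be crawlable by any bot...
print(rp.can_fetch("*", "http://syfitness.com/index.html"))        # True
# ...while anything under /cgi-bin is blocked.
print(rp.can_fetch("*", "http://syfitness.com/cgi-bin/script.pl"))  # False
```

If the parser raises no errors and can_fetch answers as expected, the file itself is syntactically fine, and the problem is more likely in how the server is serving it.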