I'm a bit confused about robots.txt! Is it best to write this:

User-agent: *
Disallow:

or this:

User-agent: *
Allow: /

Is it okay to do this:

User-agent: *
Disallow:
Allow: /

I understand that "Disallow:" (with nothing after it) will allow all search engines to crawl the site and all the pages within the domain, and "Disallow: /" will stop them all. However, I'm not sure how "Allow:" and "Allow: /" work, because when I test in Google Webmaster Tools it tells you to write "Allow: /" into the robots.txt, which then gets a 200 (Success) status. When I test "Disallow:" in Google Webmaster Tools, Googlebot reports the following:

Allowed by line 6: Disallow:
Detected as a directory; specific files may have different restrictions

Can someone explain?

Thanks
Gary
If you are using WordPress, just remove this file, go to Settings, and check the option to make the site visible to all search engines; WordPress will automatically produce a new robots.txt for you. If you're not on WP, let me know and I'll find a good snippet for you.
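For reference, the virtual robots.txt that a recent WordPress install typically serves looks something like this (the exact rules can vary by version and by plugins):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```

This keeps crawlers out of the admin area while still allowing the AJAX endpoint that some front-end features rely on.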
Thanks shan, but this doesn't really answer my question, as I want the answer not only for blogs but for websites in general on any domain. Any other comments would be helpful.

Gary