I have a robots.txt file that is supposed to allow all bots and have no disallows. When I load up the address, it displays:

User-agent: *
Allow: /

In Google Webmaster Tools, it says:

Line 0: http://www.xyz.com/robots.txt robots.txt file does not appear to be valid

Then it says the text of xyz.com's robots.txt is:

?User-agent: *
Allow: /

Why does it have this question mark, and why isn't it accepting this robots.txt? Does it think there is a space there?
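To see what Google is actually reading, I can dump the raw bytes of the file; an invisible leading character such as a UTF-8 byte order mark (EF BB BF) would show up as a stray "?" in some tools. A minimal sketch, assuming Python is available (the xyz.com URL is just the placeholder from above):

import urllib.request

# Fetch the live robots.txt and inspect its raw bytes (placeholder domain).
url = "http://www.xyz.com/robots.txt"
with urllib.request.urlopen(url) as resp:
    raw = resp.read()

# Show the first bytes exactly as served; a UTF-8 BOM would appear as b'\xef\xbb\xbf'.
print(repr(raw[:20]))
if raw.startswith(b"\xef\xbb\xbf"):
    print("File starts with a UTF-8 BOM; re-save it as plain ASCII/UTF-8 without BOM.")
Code (markup):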
The correct format in this particular case is:

User-agent: *
Disallow:
Code (markup):

This allows all bots to index your complete site.
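If you want to sanity-check the file before resubmitting it, here is a rough sketch using Python's standard urllib.robotparser (the xyz.com URLs are just placeholders); with an empty Disallow, every user agent should be allowed to fetch every path:

import urllib.robotparser

rp = urllib.robotparser.RobotFileParser()
# Feed the recommended two-line file directly to the parser.
rp.parse(["User-agent: *", "Disallow:"])

# An empty Disallow means no path is blocked for any bot.
print(rp.can_fetch("Googlebot", "http://www.xyz.com/any/page.html"))  # True
print(rp.can_fetch("*", "http://www.xyz.com/"))                       # True
Code (markup):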
there is no "Allow" command in robots.txt. if you whant to "allow all" you should remove the robots.txt or use the next code User-agent: * Code (markup):
Could this potentially cause search engines to crawl my site incorrectly, or might they ignore the site altogether?