Hello SEO experts, I have an interesting question that has really been confusing me. It is about robots.txt: can anyone tell me the difference between these two files?

(1)
User-agent: *
Disallow:

(2)
User-agent: *
Allow: /

Can any SEO expert explain the difference?
(1) says "disallow nothing" for all user agents, and (2) says "allow everything" for all user agents, so both permit full crawling. If you want to block bots from your site, you need Disallow: / (with the slash).
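To make that concrete, an empty Disallow value disallows nothing, so an actual block-everything robots.txt needs the slash:

```
User-agent: *
Disallow: /
```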
Both of the snippets you listed mean "allow all bots to crawl all of your pages". Difference: as I said, there is no functional difference between the two, but the second form uses the "Allow" directive, which the major search engines announced support for around 2008 as an extension to the original Robots Exclusion Protocol (REP). Since then many people write (Allow: /) instead of the empty (Disallow: ). Most of the major search engines understand the newer directive, so either form is safe, but some prefer the explicit second form.
There is no "Allow" directive in the original Robots Exclusion Standard. In fact, that standard has only two directives, "User-agent" and "Disallow", and later efforts to extend it with additional directives have not met with much support. Individual spiders, notably the Google and Microsoft crawlers, have added support for all sorts of additional directives, but there is no real guarantee that any given spider will read or understand any of them. You can reliably expect that any spider visiting your site will either understand "User-agent" and "Disallow", or not care about your robots.txt at all. (A spider does not even have to read robots.txt, let alone obey it. Most major spiders do, but anyone could write one that doesn't.) So only the first declaration is strictly standard. However, since it is in every way equivalent to not having a robots.txt file at all, one must wonder why you would bother.
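You can check the equivalence yourself with Python's standard-library robots.txt parser. This is just a sketch using a made-up example URL; it shows that both variants from the question parse to "everything is fetchable", while the slash version blocks everything:

```python
from urllib.robotparser import RobotFileParser

def can_fetch(robots_txt: str, url: str) -> bool:
    """Parse a robots.txt string and ask whether a generic bot may fetch url."""
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("*", url)

variant_1 = "User-agent: *\nDisallow:"    # empty Disallow: nothing is blocked
variant_2 = "User-agent: *\nAllow: /"     # explicit Allow: everything permitted
blocking  = "User-agent: *\nDisallow: /"  # the slash is what actually blocks

print(can_fetch(variant_1, "http://example.com/page"))  # True
print(can_fetch(variant_2, "http://example.com/page"))  # True
print(can_fetch(blocking, "http://example.com/page"))   # False
```

Note that Python's parser is one of the implementations that does understand the nonstandard "Allow" line, which is why the first two give identical results.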