Can anybody explain to me what this robots.txt code means?

User-Agent: Googlebot
Disallow: /*/*/*
Allow: /

Thanks in advance!
This looks contradictory at first, but Googlebot resolves conflicting rules by applying the most specific (longest) matching rule. "Disallow: /*/*/*" matches any URL whose path contains at least three slashes, i.e. pages three directory levels deep or deeper, and it is a longer pattern than "Allow: /". So URLs like /a/b/page.html are blocked for Googlebot, while everything shallower (/, /page.html, /a/page.html) is crawled.
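To make the precedence concrete, here is a rough sketch (not Google's actual code) of the longest-match rule resolution, using a regex to emulate the "*" wildcard. The rule list mirrors the robots.txt in the question; "$" end-anchors are not handled.

```python
import re

# The Googlebot group from the robots.txt in the question.
RULES = [
    ("disallow", "/*/*/*"),
    ("allow", "/"),
]

def pattern_matches(pattern, path):
    # '*' matches any run of characters ('$' end-anchors are not handled here).
    regex = ".*".join(re.escape(part) for part in pattern.split("*"))
    return re.match(regex, path) is not None

def is_allowed(path):
    # Collect every matching rule; the longest pattern wins, and on a
    # tie Allow beats Disallow (Google's documented tie-break).
    best_len, best_verdict = -1, "allow"  # no matching rule => allowed
    for verdict, pattern in RULES:
        if pattern_matches(pattern, path):
            if (len(pattern), verdict == "allow") > (best_len, best_verdict == "allow"):
                best_len, best_verdict = len(pattern), verdict
    return best_verdict == "allow"

print(is_allowed("/page.html"))      # True  - one level deep, Allow: / wins
print(is_allowed("/a/page.html"))    # True  - two levels, /*/*/* does not match
print(is_allowed("/a/b/page.html"))  # False - three levels, Disallow: /*/*/* wins
```

Running it shows that only URLs three levels deep or deeper are blocked, which matches the explanation above.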
I am having the same problem with Blogger. I can't get my posts indexed. Can anybody tell me how to change the robots.txt file so that search engines can index my blog?
I haven't got my answer yet! Please help me understand this. This code exists on my website and I'm thinking of removing it, but I'm not sure. Can I have a clear answer from anybody? Thanks in advance!
If you want your entire site indexed, then either 1) remove the robots.txt file above from your site, OR 2) change it so that it contains ONLY the following, so that all crawlers will crawl your entire site:
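The standard allow-all robots.txt for option 2 is:

```
User-agent: *
Disallow:
```

An empty Disallow value blocks nothing, so every crawler may fetch every URL.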
Without a robots.txt file, crawlers will index the whole website anyway. robots.txt is mainly used to restrict spiders from crawling certain files or directories.
Hello there, here is a quick explanation of robots.txt. It is a file that tells search engine bots what to crawl and what to index. It is also useful if you don't want a search engine bot to list your downloads in its search results; that is exactly the kind of case where you would use robots.txt. If you need more information about robots.txt, you can find more here: http://search-engine-optimization-starter.blogspot.com/2009/12/robotstxt-what-is-robotstxt.html I hope this helps. Regards
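For example, assuming your downloadable files live under a hypothetical /downloads/ directory, a robots.txt that keeps them out of search results could look like this:

```
User-agent: *
Disallow: /downloads/
```

This blocks all crawlers from any URL starting with /downloads/ while leaving the rest of the site crawlable.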