I want the robots to be able to crawl all of my site; there are no secrets on it. I also want the search engines to index as many of my pages as possible. Would it be better for me to list all my pages in robots.txt, or just to write "*" or "all"?
You can list all your pages in robots.txt, but it is not necessary. The file is used to tell search engine robots which pages of your website they are not allowed to crawl.
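As a rough sketch (www.example.com and the /private/ path are just placeholders), a robots.txt that lets every crawler fetch everything looks like this; the empty Disallow line means nothing is blocked:

User-agent: *
Disallow:

If you later want to keep crawlers out of part of the site, you would add a path after Disallow, for example Disallow: /private/.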
robots.txt is mainly for when you need to disallow access to a crawler, not to allow it. If you want to have all your pages listed, create a sitemap and put a link to it in robots.txt; in that case robots.txt only needs to contain this line:

Sitemap: http://www.example.com/sitemap.xml

To learn how to create a sitemap, see the official site: http://www.sitemaps.org. But an XML sitemap isn't needed if you already have a normal HTML sitemap and/or well-linked inner pages (internal links).
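For reference, a minimal XML sitemap following the sitemaps.org protocol looks roughly like this (the URL and date are placeholders; only the loc element is required, lastmod is optional):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one url entry per page you want listed -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>

Upload it to your site root as sitemap.xml and reference it from robots.txt as shown above, and crawlers that support the Sitemap directive will find it.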