The robots.txt is a plain text file that resides in your site's root directory. Its primary purpose is to tell search engine spiders which files and directories they may crawl. For example, if you have an admin subdirectory that you do not want spiders to crawl, you can disallow it. You can also (attempt to) block a certain spider/bot from crawling any of your site. I say attempt because not all spiders will 'obey' the robots.txt file. A robots.txt is not required, but it is a very good idea to have one, especially if you have subdirectories or files you do not want indexed.
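Here's a minimal example of what I mean (the /admin/ path and the 'BadBot' name are just placeholders, not a real bot):

User-agent: BadBot
Disallow: /

User-agent: *
Disallow: /admin/

The first block tells the bot identifying itself as BadBot to stay out of the whole site, and the second tells every other bot to skip /admin/ while allowing the rest.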
I wouldn't use it to block the admin directory - listing a path in robots.txt doesn't protect anything, it just advertises the location to anyone who reads the file. I'd use a password protected directory (HTTP authentication, the standard browser password prompt) instead. Then again, I'm not fond of using /admin/ for my admin area anyway. For more information visit www.robotstxt.org
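If you're on Apache, a minimal sketch of that would be an .htaccess file inside the admin directory (the .htpasswd path is just a placeholder for wherever you keep yours, ideally outside the web root):

# Prompt for a username/password before serving anything in this directory
AuthType Basic
AuthName "Admin Area"
AuthUserFile /home/user/.htpasswd
Require valid-user

You create the password file itself with the htpasswd command that ships with Apache.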
You can check http://forums.digitalpoint.com/showthread.php?t=451003 - that guy is asking the same question as you.
This is a very good explanation. I only want to add that another reason why you should have it is that you can direct search engine crawlers to your sitemap with it.
So if you add the URL of some of your pages to the robots.txt, they will be crawled? And if you don't have anything in that file, does it mean that your pages will not be crawled?
You don't link to your pages from robots.txt, but you can link to your sitemap. The format to use is Sitemap: http://yourwebsite.com/sitemap.xml Note that the full URL is required. And no, an empty (or missing) robots.txt doesn't stop crawling - by default, spiders are allowed to crawl everything. For more information on sitemaps with robots.txt, see this post.
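For example, a complete robots.txt that allows everything and points crawlers at the sitemap would look like this (yourwebsite.com standing in for your own domain):

User-agent: *
Disallow:

Sitemap: http://yourwebsite.com/sitemap.xml

An empty Disallow line means nothing is blocked, and the Sitemap line can go anywhere in the file since it isn't tied to any User-agent group.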