It all depends on which URLs and paths you want to restrict and which spiders etc. you want to deny; take a look at Robotstxt. The most useful thing robots.txt does for me is let you include the URL of your sitemap:

User-agent: *
Disallow:
Sitemap: http://www.example.com/sitemap.xml
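If you want to check how a crawler would interpret your rules, a quick sketch with Python's standard-library robot parser (the rules and URLs below are just illustrative, assuming a hypothetical /private/ path):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: allow everything except /private/,
# and advertise the sitemap location.
rules = """
User-agent: *
Disallow: /private/
Sitemap: http://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether a given user-agent may fetch a given URL.
print(parser.can_fetch("*", "http://www.example.com/index.html"))  # True
print(parser.can_fetch("*", "http://www.example.com/private/page.html"))  # False
```

Note that robots.txt is advisory: well-behaved spiders honor it, but nothing actually blocks a crawler that chooses to ignore it.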
The robots.txt file tells search-engine spiders which parts of your site they may and may not crawl, so your pages get indexed properly. It's very important, and anyone who has had indexing problems knows its value. The rest is covered above.