A robots.txt file tells search engine crawlers which directories and files on your site they should index, and which ones they should stay out of. See http://en.wikipedia.org/wiki/Robots.txt for more information.
Robots don't all behave the same way. Some of them will ignore your robots.txt restrictions even if you forbid them from crawling certain pages. On the other hand, the file is almost useless for making bots crawl more of your pages. Crawlers are active enough already, so you don't have to encourage them to crawl your site (sometimes you actually have to restrain them, or they'll visit so often they just consume your bandwidth). If I understand you correctly and you want better crawlability for your site, then an empty robots.txt file is good enough for a brand-new site that has not been crawled yet. For an older site, a robots.txt file isn't necessary at all, unless you want to restrict some robots from crawling parts of your site.
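To illustrate the difference, here is a sketch of the two cases: an allow-everything file (equivalent to an empty one) versus one that keeps crawlers out of a directory. The /private/ path is just a made-up example, not from the discussion above:

```
# Case 1: allow all robots to crawl everything
# (same effect as an empty robots.txt)
User-agent: *
Disallow:

# Case 2: keep all robots out of one directory
User-agent: *
Disallow: /private/
```

Note that `Disallow:` with an empty value means "nothing is disallowed", while `Disallow: /` would block the whole site.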
Thanks. My site is new, so I'll go with an empty file. Are there any "bad robots" I should restrict anyway?
Yes - you can read more about it here: http://diveintomark.org/archives/20...pybots_and_tell_unwanted_robots_to_go_to_hell But I've never actually had to do any of that myself. The article is worth reading, just don't ban every questionable bot the way that author does. Build your site and deal with robot problems as they arise.
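If you do end up needing to ban a specific bot, the usual pattern is a per-bot rule followed by a catch-all. "BadBot" here is a hypothetical user-agent name, not a real crawler (and remember from the earlier reply that misbehaving bots may ignore this anyway):

```
# Tell one specific (hypothetical) bot to stay away entirely
User-agent: BadBot
Disallow: /

# Everyone else may crawl freely
User-agent: *
Disallow:
```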