Just select NO. Blogger already applies robots.txt settings that are widely recommended and fit any Blogger-powered blog, so there is rarely a reason to customize the robots.txt file when a recommended configuration is applied by default. I would suggest you go through this link, which explains robots.txt officially from Google: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=156449&from=35237&rd=1
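For reference, the robots.txt that Blogger generates by default typically looks something like the following (the exact sitemap URL will be your own blog's address; this is a sketch of the common default, not a guarantee of what your blog serves):

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://yourblog.blogspot.com/sitemap.xml
```

It blocks crawlers from the /search pages (label and search result pages, which are duplicate content) while allowing everything else, and points crawlers at the sitemap. You can check what your blog actually serves by visiting yourblog.blogspot.com/robots.txt.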
In order to allow all robots access, you would enter:

User-agent: *
Disallow:
Sitemap: http://www.domainname.com

If you are new to robots.txt, don't play with the custom field; leave it as it is, and as jose mentioned, do look at the Webmaster link for more information on robots.txt.