We all know about the robots.txt file, a text file we include to tell search engines how to crawl our web pages. How do I write and create this file, and what code is required to generate it? All suggestions would be very welcome!
No, it's not required to tell crawlers to crawl; it's used to tell them where not to crawl. It's unclear exactly what you want. Read http://www.robotstxt.org/ to learn how the file works; it isn't difficult.
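For reference, here is a minimal robots.txt sketch. The directory names are only placeholders; substitute the paths on your own site that you want to keep out of crawling:

    # Apply the rules below to all crawlers
    User-agent: *
    # Keep crawlers out of these (example) directories
    Disallow: /admin/
    Disallow: /private/

Anything not matched by a Disallow rule remains crawlable by default.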
Creating and using robots.txt depends entirely on your requirements. It is used to restrict crawlers from accessing parts of your blog that you don't want to appear in search engines.
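Since the question also asks what code is required to generate the file, here is a rough Python sketch that writes one out. The blocked paths are assumptions for illustration; adjust them to the sections of your own site:

    # Hypothetical paths to block; replace with the parts of your blog
    # you do not want indexed.
    blocked_paths = ["/admin/", "/drafts/"]

    # Rules apply to every crawler, then one Disallow line per blocked path.
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path in blocked_paths]

    # Write the file; it must be uploaded to the root of your domain,
    # e.g. https://example.com/robots.txt, since crawlers only look there.
    with open("robots.txt", "w") as f:
        f.write("\n".join(lines) + "\n")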