I want to know the purpose of the robots.txt file on websites: how can we use it, and what benefits can we get from it?
Robots.txt is basically a settings or config file that tells web crawlers how to crawl your site. Please read: http://en.wikipedia.org/wiki/Robots.txt
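To make this concrete, here is a sketch of how a well-behaved crawler consults robots.txt before fetching a page, using Python's standard `urllib.robotparser` module. The rules and URLs below are made-up examples, not from any real site:

```python
# A polite crawler parses the site's robots.txt and checks each URL
# against it before fetching. The rules below are a hypothetical example.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Pages outside the disallowed path may be fetched; pages inside may not.
print(rp.can_fetch("*", "https://example.com/public/page.html"))    # True
print(rp.can_fetch("*", "https://example.com/private/secret.html")) # False
```

Note that robots.txt is advisory: well-behaved crawlers perform exactly this kind of check, but nothing technically stops a misbehaving bot from ignoring it.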
Robots.txt is basically used to tell search-engine crawlers (or any other crawlers) not to visit private sections of your website that you do not want to show them.
In the robots.txt file you can give crawlers instructions for accessing your website, such as which folders/files should be indexed and more. This article will be helpful for you: Create Robot.txt File
The robots.txt file helps search-engine bots crawl your website and makes indexing it easier. Since there are also many bad robots, it is recommended to use robots.txt and list the folders you want to disallow from crawling and indexing.
The robots.txt file is used to tell search engines which files and folders should be crawled/indexed and which should not. Always place the robots.txt file at the root folder/level of your site.
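As an illustration, a minimal robots.txt placed at the site root (e.g. `https://example.com/robots.txt`) might look like this; the paths here are hypothetical:

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of these folders
Disallow: /admin/
Disallow: /private/

# Optionally point crawlers at your sitemap
Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` block targets a crawler by name (`*` means all), and each `Disallow` line blocks a path prefix; an empty `Disallow:` would allow everything.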