It's basically a file that tells search engine bots which parts of your website you don't want crawled. If you want your entire website to be crawled, then you don't really need one.
In simple words: if you want to keep files or folders away from search engine crawlers, you write those instructions in the robots.txt file.
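To illustrate, here's a minimal robots.txt sketch — the folder names are just placeholders, not anything specific to your site:

```
# Applies to all crawlers
User-agent: *
# Block these example folders from being crawled
Disallow: /private/
Disallow: /tmp/
# Everything else remains crawlable
```

The file lives at the root of your domain (e.g. www.example.com/robots.txt), and each `Disallow` line names a path crawlers should skip.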
www.robotstxt.org has all the information you need to get started. I strongly suggest you read it.
Hello Friend, the robots file is an invitation to all search engines to crawl our site; with it we call all the search engines to crawl our site.
This is 100% incorrect! Robots.txt is used to BLOCK search engines and spiders from crawling certain parts of the site that you don't want indexed.