Robots.txt is a plain text file uploaded to the root of the site. You can write it with any plain text editor such as Notepad. Its purpose is to tell search engine crawlers which files and folders they may crawl, along the lines of "please crawl only the files and folders I mention". Note that it is only a request to well-behaved crawlers, not a security mechanism.
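For illustration, a minimal robots.txt might look like the following (the paths are hypothetical examples, not part of any standard):

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/
    Allow: /

This asks every crawler (`User-agent: *`) to skip the /admin/ and /tmp/ folders but permits everything else.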
robots.txt is a file that guides search engine spiders on which pages to crawl and which to skip. Hope this helps!
Robots.txt is a permissions file that can be used to control which webpages of a website a search engine indexes. The file must be located in the root directory of the website for a search engine's website-indexing program (spider) to reference it.
The Robots.txt file is a convention created to direct the activity of search engine crawlers or web spiders. The file tells the crawlers which parts of a website to crawl and which parts to leave alone, distinguishing between what is meant to be viewable by the public and what is intended only for the creators of the website. Search engines frequently consult the Robots.txt file when categorizing and archiving web pages, and webmasters use it to keep unfinished or private sections of a site out of crawlers' reach.
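To see the convention in action, Python's standard library ships a robots.txt parser, `urllib.robotparser`. The sketch below feeds it a hypothetical rule set (the `/private/` path is an assumption for illustration) and checks which URLs a crawler would be allowed to fetch:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, supplied as lines instead of
# being fetched from a live site.
rules = [
    "User-agent: *",
    "Disallow: /private/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# A well-behaved crawler checks can_fetch() before requesting a URL.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

In real use you would call `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()` to load the live file, then consult `can_fetch()` for each URL before crawling it.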