Yes, robots.txt is just a small plain-text file (you can create it in Notepad, for example). If you want to allow everything, you write a "User-agent: *" rule with an empty Disallow line, save the file as robots.txt and upload it to your root directory. This tells the search engine crawlers that they are free to crawl all of your pages.
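For example, a minimal "allow everything" robots.txt is just these two lines (the empty Disallow means nothing is blocked):

User-agent: *
Disallow: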
As WestJohn pointed out, robots.txt tells the search engine spiders which pages they are allowed to access and which are off-limits. You can also use robots.txt to tell unwanted bots not to crawl your site at all (well-behaved bots will respect it, though truly malicious ones tend to ignore it), plus a few other things that aren't as important. Sitemap.xml, on the other hand, is basically a list of all your web pages, so the search engine can find every page in one place. This means your site may potentially be indexed faster. You can use a sitemap generator to create it. Robots.txt has to sit in your root directory, and the sitemap is normally put there too, so the URLs are example.com/robots.txt and example.com/sitemap.xml. If they aren't at the root, the search engines won't find them. All the best
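To give a rough idea (the bot name and URLs below are just placeholders, swap in your own), a robots.txt that blocks one unwanted crawler and also points spiders to your sitemap could look like this:

User-agent: BadBot
Disallow: /

User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml

And a bare-bones sitemap.xml is simply an XML list of your page URLs:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>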
Robots.txt and a sitemap both help Google index your pages more thoroughly. You can create both with an online generator; a quick Google search will turn up plenty of them.