robots.txt is an instruction file that tells search engine bots not to crawl (visit) and index the pages mentioned in the file. Note that it does not remove a page that has already been indexed. Below are some links with more details about robots.txt: https://www.hallaminternet.com/the-importance-of-a-robots-txt-file/ https://seositecheckup.com/articles/the-importance-of-a-robotstxt-file-for-your-seo
By using robots.txt, website owners can control how web robots access their website. They can give instructions specifying which bots are allowed, and which are disallowed, to visit any page or the whole site.
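As a minimal sketch of such instructions (the paths here are just examples, not from any real site), a robots.txt placed at the root of the domain might look like this:

```
# Block all bots from an example admin area
User-agent: *
Disallow: /admin/

# Allow one specific crawler full access
User-agent: Googlebot
Disallow:
```

Each `User-agent` line names a bot (`*` means all bots), and the `Disallow` lines beneath it list the paths that bot should not crawl; an empty `Disallow:` means nothing is blocked.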
It's a simple way to keep private directories/pages out of Google's listings. Keep in mind, though, that robots.txt is itself publicly readable, so anyone can see which paths you have disallowed; directories that must stay truly private should be protected by authentication, not just hidden from crawlers.
A robots.txt file gives instructions to web robots about the pages the website owner doesn't want crawled. Before a search engine crawls your site, it looks at your robots.txt file for instructions on what it is allowed to crawl (visit) and index (save) for the search engine results. You can also point crawlers to your sitemap with the `Sitemap` directive, and remove any page you don't want shown from that sitemap.
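Putting this together, a robots.txt combining a `Sitemap` directive with a `Disallow` rule could look like the following sketch (the domain and paths are placeholders, not real URLs):

```
User-agent: *
# Hypothetical path you don't want crawled
Disallow: /private-page/

# Tell crawlers where the sitemap lives; pages removed
# from this sitemap are less likely to be discovered
Sitemap: https://www.example.com/sitemap.xml
```

The `Disallow` rule stops compliant bots from crawling the page, while removing its URL from the sitemap avoids actively advertising it to search engines.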