If you want to keep certain pages of your website out of Google's crawl, list them in robots.txt with a Disallow rule and Googlebot will skip them. Keep in mind that robots.txt controls crawling, not indexing: a blocked URL can still show up in results if other sites link to it, so add a noindex meta tag if you need a page kept out of search entirely.
Got an admin area or some other secured section on your website that you don't want listed in search engines like Google, Yahoo or MSN? Just create a robots.txt file in your site root and add a Disallow rule for it. (Worth noting: robots.txt is publicly readable, so it hides nothing by itself; keep the real protection on the server side.)
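For instance, a minimal robots.txt that asks all crawlers to stay out of an admin directory might look like this (the /admin/ path is just an example, swap in your own folder name):

User-agent: *
Disallow: /admin/

Save it as robots.txt in the root of your site so it's reachable at yoursite.com/robots.txt, which is the only place crawlers look for it.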
robots.txt can block crawlers from reaching certain pages of your site, so you should know what you're doing before you mess with it. It's also a good idea to check it if you suspect one of your plugins is blocking your pages from being crawled.
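The classic thing to look for (some plugins and "maintenance mode" settings add it without telling you) is a blanket rule that shuts every crawler out of the whole site:

User-agent: *
Disallow: /

You can see what's live right now by opening https://yoursite.com/robots.txt in a browser (substitute your own domain; that's simply the standard location of the file).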
Why it matters: robots.txt gives you more control over how search engines move around your website. A single accidental Disallow rule can stop Googlebot from crawling your entire site, but used carefully it is handy in plenty of common situations (a few are sketched in the example after this list):
- Prevents server overload.
- Helps keep sensitive areas out of the crawl (though it's no substitute for real access control).
- Stops crawl budget from getting wasted.
- Prevents crawling of duplicate content.
- Keeps unnecessary files (e.g. images, video, PDFs) from being crawled.
- Helps keep sections of your website private (e.g. a staging site).
- Prevents crawling of internal search results pages.
Cheers!!!
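As a rough sketch, here's how a few of those cases might look in one robots.txt. All of the paths below are made-up examples, not anything standard; adjust them to whatever your own site actually uses:

User-agent: *
# internal search results pages (assuming the site uses /search/ or a ?s= query)
Disallow: /search/
Disallow: /*?s=
# a staging copy kept under a hypothetical /staging/ folder
Disallow: /staging/
# keep crawlers away from PDF files
Disallow: /*.pdf$

One caveat on the last rules: the * and $ wildcards are understood by Google and Bing but aren't part of the original robots.txt standard, so smaller crawlers may ignore them.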