Can anyone tell me how to block some of my website's URLs and pages with a robots.txt file? I have duplicate URLs on my site. If anyone knows, please share your knowledge so I can resolve my problem. Thanks.
Block or remove pages using a robots.txt file

A robots.txt file restricts access to your site by search engine robots that crawl the web. These bots are automated, and before they access a site's pages, they check whether a robots.txt file exists that prevents them from accessing certain pages. All respectable robots will respect the directives in a robots.txt file, although some may interpret them differently. However, a robots.txt file is not enforceable, and some spammers and other troublemakers may ignore it. For this reason, we recommend password-protecting confidential information.
If you have duplicate pages on your website and want to block them through robots.txt, add the following directives to your robots.txt file (each directive goes on its own line, and "Disallow" is the standard spelling):

User-agent: *
Disallow: /filepath
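Before deploying your robots.txt, you can check which URLs it actually blocks using Python's standard-library robotparser. This is a minimal sketch; the "/duplicate-page/" path and example.com domain are placeholders for your own duplicate URLs:

```python
from urllib import robotparser

# Hypothetical rules blocking a duplicate section of the site.
rules = """\
User-agent: *
Disallow: /duplicate-page/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A URL under the disallowed path is blocked for all user agents.
print(rp.can_fetch("*", "https://example.com/duplicate-page/old.html"))  # False
# Other pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/index.html"))               # True
```

Testing the rules this way catches typos (such as a misspelled path) before crawlers ever see the file.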