Hi all, please reply to my question. I'm working on a friend's new site, and he asked me to add a robots.txt file to the home page. I don't know how to create a robots.txt file or how to use it. Could anyone tell me how to do that? I'd be grateful. Waiting for your replies. Thanks.
You can create your own robots.txt file using Notepad. Just open a new text file and paste the following:

User-agent: *
Disallow:

Then save the file as robots.txt and upload it to the root of your website. This will allow all search engine spiders to crawl all your pages. If you want to limit that, have a look at robotstxt.org/robotstxt.html for more information; a more restrictive example follows below.
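For instance, if you wanted the opposite behaviour and needed to keep all well-behaved spiders out of the entire site (say, while it is still under construction), the file would just add a slash after Disallow:

# Block all crawlers from the whole site
User-agent: *
Disallow: /

The empty Disallow: in the first example means "nothing is blocked", while Disallow: / means "everything from the root down is blocked".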
You can find a comprehensive guide at Google: http://www.google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
A robots.txt file is a permissions file you can use to control which pages of a website a search engine crawls and indexes. If you have private files or folders on your site and you don't want them to show up in search engine results pages, you can exclude them with robots.txt like this:

User-agent: *
Disallow: /your-private-file-or-folder
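As a concrete sketch, suppose the site had an /admin/ folder and a page called drafts.html that you wanted kept out of the index (those names are just hypothetical examples, substitute your own paths); the file would then look like:

# Ask all crawlers to skip the admin folder and the drafts page
User-agent: *
Disallow: /admin/
Disallow: /drafts.html

Keep in mind that robots.txt is only a request to well-behaved crawlers, not real protection, so anything genuinely sensitive should be password-protected rather than just disallowed.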