Hello friends, please tell me why we use robots.txt and how I can use it on my web page. Thanks!
robots.txt is there to stop search engine spiders from indexing parts of your site. Say you have www.example.com/index/adminlogin.php and Google has indexed that page, so it shows up in the search results: that looks A) unprofessional and B) leaves your site vulnerable to attack. The robots.txt file tells the Google, Yahoo and MSN bots to index www.example.com/index.htm but not www.example.com/index/adminlogin.php.
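For that case, a minimal sketch of the file (assuming it sits at the root of the site, i.e. www.example.com/robots.txt, and that the path matches the example above) would be:

    User-agent: *
    # Ask all crawlers to stay away from the admin login page
    Disallow: /index/adminlogin.php

Keep in mind it is only a request to well-behaved crawlers; it keeps the page out of search results but does not password-protect anything.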
Just open up Google, search for robots.txt and voilà: http://www.robotstxt.org/orig.html. Five seconds, including typing this post.
You can use robots.txt if you only want certain spiders or crawlers to access your site, or to block specific directories and files from being indexed, or even to block your whole site.
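A sketch of what those options look like in one robots.txt file (the bot name and the paths here are made up for illustration):

    # Shut out one particular crawler completely
    User-agent: BadBot
    Disallow: /

    # For every other crawler, keep certain directories and files out of the index
    User-agent: *
    Disallow: /private/
    Disallow: /cgi-bin/

To block your whole site from all crawlers you would instead put Disallow: / under User-agent: *.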
It's also very good to use this if you sell on ClickBank, to keep your download page out of the search results; it's very easy to get CB products for free from poorly set up sites that don't use robots.txt properly.
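For instance, if your download page lived under a directory like /thankyou/ (a made-up path, yours will differ), the entry would just be:

    User-agent: *
    # Keep the download page out of search results
    # Note: robots.txt is publicly readable, so don't rely on it as your only protection
    Disallow: /thankyou/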