According to Google: a robots.txt file is a text file located in the root directory of your website, written to instruct search engine robots and spiders where they are allowed to crawl. robots.txt files can be bot-specific or written to cover all robots that honour robots.txt.
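To show what a bot-specific file might look like (Googlebot here is just an example bot name, and /private/ a made-up folder):

User-agent: Googlebot
Disallow: /private/

User-agent: *
Disallow:

That would block only Googlebot from the /private/ folder while telling every other bot it may crawl everything (an empty Disallow means nothing is blocked).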
Have a read of http://www.robotstxt.org/ - it is basically a way to tell search engine bots what they should NOT crawl.
I have this on one of my sites:

User-agent: *
Disallow: /data/my.css
Disallow: /images/
Some time ago I wrote a little robots.txt tutorial that you may find useful. This is the link: Rauru.com | Why you should have a robots.txt in your sites. Hope that helps.
http://www.robotstxt.org/ is a very useful site. A quick Google search for robots.txt would have given you the answer.
User-agent: * = applies to all bots
Disallow: /data/my.css = do not crawl/index my file (in this case it's a CSS file)
Disallow: /images/ = do not crawl/index my images folder and its contents.
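If you want to check how a well-behaved bot would read those rules, Python's standard library ships a robots.txt parser. Here's a small sketch using the example rules above (example.com is just a placeholder domain):

```python
from urllib import robotparser

# The same rules as in the example file above
rules = [
    "User-agent: *",
    "Disallow: /data/my.css",
    "Disallow: /images/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A page not covered by any Disallow line is crawlable
print(rp.can_fetch("*", "http://example.com/index.html"))    # True
# Anything under /images/ is blocked for all bots
print(rp.can_fetch("*", "http://example.com/images/a.png"))  # False
```

Note that Disallow only matches by path prefix, which is why everything under /images/ is blocked but the rest of the site stays open.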
A robots.txt file is used to inform web crawlers/spiders which files or folders they should or should not index.