Isn't disallowing bots from crawling your website a bad thing? Or do people do it to conserve bandwidth? I have a forum and my robots.txt is empty, which causes no problems. Of course a lot of bandwidth gets eaten, but I can afford that.
If people don't want their images listed in Google, for example, they disallow the images/ directory. Many people disallow their cgi-bin too.
Ah, that's a good idea. I always wondered how I could hide my images from image searches. If I were to disallow my images folder, would that totally stop my images from appearing in image search engines, or not?
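For what it's worth, here's a minimal robots.txt sketch covering the cases mentioned above (images/ and cgi-bin are just example directory names — adjust them to your own layout):

```
# Applies to all crawlers that honor the robots exclusion standard
User-agent: *
Disallow: /images/
Disallow: /cgi-bin/
```

Keep in mind robots.txt is only advisory: well-behaved crawlers like Googlebot will respect it, but it doesn't actually block access, and images already indexed may take a while to drop out of search results.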
The Web Robots Pages - www.robotstxt.org/
The Web Robots FAQ - www.robotstxt.org/wc/faq.html
Robots Exclusion - www.robotstxt.org/wc/exclusion.html
A Standard for Robot Exclusion - www.robotstxt.org/wc/norobots.html
It really depends on your purpose. I've had a lot of sites made for internal training purposes, so there's no need for search engines to crawl them.