Quick question for anyone who is more familiar with robots.txt than I am. Let's say I wanted to allow only a few spiders, but give the ones I do allow a few specific disallows. Would this work?

```
User-agent: Googlebot
User-agent: Slurp
User-agent: msnbot
User-agent: askjeeves
Disallow: /images/
Disallow: /purchase.php
Disallow: /go.php

User-agent: *
Disallow: /
```

(Note the blank line between the two records — robots.txt groups are separated by blank lines.)
I think so; it looks a lot like the one in the article here: http://www.search-marketing.info/meta-tags/robots-meta.htm
Yeah, as an FYI, it seems to be working: the lame bots are not continuing after reading the robots.txt file, but so far only one of the allowed bots (Yahoo) has visited, and it continued spidering after reading the robots.txt file.
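If you don't want to wait for bots to visit to confirm the grouping works, you can sanity-check the file locally with Python's standard-library `urllib.robotparser`. This is just a sketch: the domain `example.com` and the sample paths are placeholders, and real crawlers may interpret the file slightly differently than Python's parser does.

```python
import urllib.robotparser

# The robots.txt from the question, verbatim.
# The blank line separates the two records.
ROBOTS_TXT = """\
User-agent: Googlebot
User-agent: Slurp
User-agent: msnbot
User-agent: askjeeves
Disallow: /images/
Disallow: /purchase.php
Disallow: /go.php

User-agent: *
Disallow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Allowed bots may crawl most pages, but not the listed paths...
print(parser.can_fetch("Googlebot", "http://example.com/index.html"))   # True
print(parser.can_fetch("Slurp", "http://example.com/purchase.php"))     # False

# ...while every other bot is shut out by the catch-all record.
print(parser.can_fetch("SomeRandomBot", "http://example.com/index.html"))  # False
```

Multiple consecutive `User-agent` lines sharing one set of `Disallow` rules is valid per the robots.txt convention, and `robotparser` handles that grouping as shown.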