How risky is user agent delivery? I know it's a form of cloaking, but how risky is it? Here's what I mean: you have content you want spidered and indexed to generate traffic, but you don't want to force users to register before they can read it. So you set up .htaccess/mod_rewrite to serve the article to a user agent such as Googlebot, redirect regular users to a registration page, and prevent Google from caching those pages. WMW does something similar. Worth the risk or not? I don't think it's worth the risk!
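For clarity, a minimal sketch of the kind of mod_rewrite rules being described, assuming hypothetical paths (/articles/, /registration.html) and matching only on Googlebot, would look something like this:
---------------------------
RewriteEngine On
# Let Googlebot (and requests for the registration page itself) through untouched;
# send every other visitor to the registration page instead of the article.
RewriteCond %{HTTP_USER_AGENT} !Googlebot [NC]
RewriteCond %{REQUEST_URI} !^/registration\.html$
RewriteRule ^articles/ /registration.html [R=302,L]
---------------------------
The "don't cache" part would be handled separately, e.g. with a <meta name="robots" content="noarchive"> tag on the article pages.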
Create a robots.txt:
---------------------------
User-Agent: *
Disallow: /registration.html
---------------------------
and put it in your root folder as /robots.txt. Find out more about robots.txt at robotstxt.org. Or you can use this instead:

<a href="registration.html" rel="nofollow">go to registration</a>

and the robots won't follow the link.