Am I going to be in trouble with Google if I serve 301 redirects to Googlebot (but not to users), pointing at the original content? I have a genuine need for duplicate content. I'm not trying to spam; I want to host the same content on different domains (with different partners). I can nofollow/noindex the duplicate content, but that doesn't tell Google which page should actually be indexed. What's wrong with taking off the nofollow, leaving the noindex, and serving a 301 redirect only when the request comes from Googlebot? Thanks
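To make it concrete, here's roughly what I have in mind (a Node/Express sketch; the domain is made up, and I know the bare User-Agent check is the naive version, since Google's own guidance is to verify Googlebot with a reverse-DNS lookup of the requesting IP because the header is easy to spoof):

```typescript
import express from "express";

const app = express();

// Hypothetical canonical home of the content.
const CANONICAL_ORIGIN = "https://example-original.com";

// Rough check: Googlebot announces itself in the User-Agent header.
// The string can be spoofed, so a real setup would confirm the request
// with a reverse-DNS lookup before trusting it.
function looksLikeGooglebot(userAgent: string | undefined): boolean {
  return !!userAgent && userAgent.toLowerCase().includes("googlebot");
}

app.use((req, res) => {
  if (looksLikeGooglebot(req.get("user-agent"))) {
    // Send the crawler a 301 to the original copy of this page.
    res.redirect(301, CANONICAL_ORIGIN + req.path);
    return;
  }
  // Regular visitors get the duplicate content as-is.
  res.send("<html><body>Partner copy of the content</body></html>");
});

app.listen(3000);
```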
That could also be called "cloaking". Personally, I find cloaking a necessity for some clients. What about a robots.txt file? You could disallow the duplicate pages so they're never crawled at all. In essence, you could do what you're suggesting. It sounds like a lot of work, especially if you're going to set it up on several different domains, but if you must, you must. And really, you'll probably get a little link juice from having a few 301s pointing at the original content piece. Another option is to put the content on a page that won't get indexed, by loading it with AJAX/JavaScript; see the sketch below.
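If you go the AJAX route, a minimal client-side sketch looks like this (the endpoint name is hypothetical, and keep in mind it only hides the content from crawlers that don't execute JavaScript):

```typescript
// The page ships with no duplicate content in its HTML; this script pulls
// it in after load, so a crawler that doesn't run JavaScript never sees it.
async function loadPartnerContent(): Promise<void> {
  const response = await fetch("/api/partner-content"); // hypothetical endpoint
  const html = await response.text();
  const container = document.getElementById("content");
  if (container) {
    container.innerHTML = html;
  }
}

window.addEventListener("DOMContentLoaded", () => {
  void loadPartnerContent();
});
```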