Dealing with duplicate content with redirects

Discussion in 'Search Engine Optimization' started by seobrien, Aug 30, 2007.

  1. #1
    Am I going to be in trouble with Google if I serve 301 redirects to Googlebot (but not to users), pointing it at the original content?

    I have a genuine need for duplicate content. I'm not trying to spam; I need to host the same content on different domains (with different partners). I can nofollow/noindex the duplicate copies, but that doesn't tell Google which page should actually be indexed.

    What's wrong with taking off the nofollow, leaving the noindex, and serving a 301 redirect only to requests coming from Googlebot's IP?
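    For what it's worth, the logic you're describing would boil down to something like this sketch (the `response_for` helper and URLs are hypothetical, and detecting the bot by User-Agent string alone is naive — Google recommends verifying crawler IPs via reverse DNS). Note up front that serving crawlers a different response than users is exactly what Google defines as cloaking:

    ```python
    # Hypothetical sketch: send Googlebot a 301 to the canonical copy,
    # serve the page normally to everyone else. This is cloaking under
    # Google's guidelines -- shown only to illustrate the question.

    def response_for(user_agent, canonical_url):
        """Return (status_code, headers) for a request to a duplicate page."""
        if "Googlebot" in user_agent:
            # Crawler: permanent redirect to the original content
            return 301, {"Location": canonical_url}
        # Regular visitor: serve the duplicate page as-is
        return 200, {"Content-Type": "text/html"}

    status, headers = response_for(
        "Mozilla/5.0 (compatible; Googlebot/2.1)",
        "https://example.com/original-article",
    )
    # status == 301, headers["Location"] points at the original
    ```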

    Thanks
     
    seobrien, Aug 30, 2007 IP
  2. Voasi

    Voasi Active Member

    Messages:
    1,054
    Likes Received:
    43
    Best Answers:
    0
    Trophy Points:
    88
    #2
    What you're describing could also be called "cloaking". :) Personally, I find cloaking a necessity for some clients.

    What about a robots.txt file? You could use it so the duplicate page isn't crawled at all.
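    A minimal sketch of that robots.txt idea, checked with Python's standard-library parser (the domain and the `/duplicate/` path are made up for illustration). One caveat: blocking crawling this way keeps the copy out of the crawl, but it doesn't pass any signal about which page is the original:

    ```python
    # Block all crawlers from the duplicate copies via robots.txt,
    # and verify the rule with Python's stdlib robots.txt parser.
    from urllib.robotparser import RobotFileParser

    robots_txt = """\
    User-agent: *
    Disallow: /duplicate/
    """

    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())

    # The duplicate copy is blocked; the original remains fetchable.
    print(parser.can_fetch("Googlebot", "https://example.com/duplicate/article.html"))  # False
    print(parser.can_fetch("Googlebot", "https://example.com/original/article.html"))   # True
    ```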

    In essence, you could do what you're suggesting. It sounds like a lot of work, especially if you're going to set it up on several different domains, but if you must, you must. And you'll probably even get a little link juice from having a few 301s pointing at the original content piece.

    Another alternative: load the content with AJAX or JavaScript, so it sits on a page that won't get indexed in the first place.
     
    Voasi, Aug 30, 2007 IP