I would like to include articles in my site that would be considered duplicate content by Google. Can anyone tell me the BEST way to ensure that Google does not spider these pages? I had originally thought that using the rel="nofollow" attribute on the links was the only way to go, but now understand that the "robots.txt" file may be the correct way to accomplish this? I would really appreciate any input!
In my opinion, a robots.txt file is much preferred over 'nofollow' links. A nofollow attribute only applies to the individual link it is attached to, so you would have to remember to add it to every link pointing at those articles, whereas robots.txt tells crawlers to stay away from the pages themselves. If you are not sure how to create a robots.txt file, this page might help: http://en.wikipedia.org/wiki/Robots.txt
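For example, assuming your duplicate articles all live under a directory such as /duplicate-articles/ (that path is just an illustration; use whatever directory they actually sit in on your site), a minimal robots.txt would be:

    # Keep all crawlers out of the duplicate-article pages
    User-agent: *
    Disallow: /duplicate-articles/

Just save it as robots.txt in the root directory of your site (so it is reachable at yoursite.com/robots.txt), since that is the only location crawlers look for it.

Mark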