Best way to stop search engines from spidering certain links...

Discussion in 'Search Engine Optimization' started by apdfranklin, Jun 18, 2007.

  1. #1
    I would like to include articles on my site that would be considered duplicate content by Google. Can anyone tell me the BEST way to ensure that Google does not spider these links?

    I had originally thought that using rel="nofollow" on the links was the only way to go, but now understand that a robots.txt file may be the correct way to accomplish this?

    I would really appreciate any input!
     
    apdfranklin, Jun 18, 2007 IP
  2. #2

    Raz2133 Peon

    Messages: 65
    Likes Received: 5
    Best Answers: 0
    Trophy Points: 0
    In my opinion, a robots.txt file is much preferable to 'nofollow' links.
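
    To see the difference: rel="nofollow" has to be added to every individual link, something like this (example.com is just a placeholder URL):

        <a href="http://www.example.com/article.html" rel="nofollow">Article</a>

    Any other link pointing at the same page that isn't nofollowed can still lead a spider there, which is why handling it per-link is fragile.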

    If you are not sure how to create a robots.txt file, the page linked below might help.

    http://en.wikipedia.org/wiki/Robots.txt
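
    As a quick sketch, if the duplicate articles all lived under one directory, say /articles/ (a made-up path, substitute your own), the robots.txt at your domain root would just be:

        User-agent: *
        Disallow: /articles/

    "User-agent: *" means the rule applies to all robots, and the Disallow line blocks any URL whose path starts with /articles/. The file has to sit at the root of the domain (http://www.yoursite.com/robots.txt) or crawlers won't find it.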

    Mark
     
    Raz2133, Jun 18, 2007 IP