Advice about robots.txt and duplicate content

Discussion in 'robots.txt' started by itsallwhite, Aug 23, 2009.

    #1
    I have a site that consists of a WordPress front end with 3 Zen Cart stores.

    The WordPress side of the site is completely unique, but the Zen Cart stores have duplicate content (they sell the same products, just in different countries).

    I would like to allow the WordPress pages and 1 Zen Cart store but restrict the other two; the problem is that all 3 stores have already been indexed by Google. Or maybe I should just allow the WordPress pages?

    My current robots.txt file is below - is this correct?

    How do I get these pages removed from Google's index? I think the duplicate content might be affecting my rankings.

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /uk_store/
    Disallow: /usa_store/
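
    I've also been reading about noindex meta tags. Would adding something like the tag below to the <head> of every page in the duplicate stores be the right way to get them dropped from the index? This is just a sketch on my part (which template file to edit depends on the Zen Cart theme), and I'm not sure whether the Disallow lines above would stop Google from ever re-crawling the pages and seeing the tag.

    <!-- Sketch: tells crawlers not to index this page but still follow its links -->
    <meta name="robots" content="noindex, follow">
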
    My site URL is here
     
    itsallwhite, Aug 23, 2009 IP