How to stop crawling of multi language site

Discussion in 'robots.txt' started by hvwebseo, Jun 5, 2012.

  1. #1
    Hi

    I have a website in multiple languages, but Google's crawler indexes all of my language pages, and Google Webmaster Tools reports duplicate content errors.

    How can I get Google to crawl only the English version of my site?
    e.g.

    google crawl

    abc.com
    abc.com/en
    abc.com/jp
    abc.com/de
    abc.com/ru

    It crawls every language's webpages, and those pages are dynamic. Any suggestions welcome.

    Thanks
     
    hvwebseo, Jun 5, 2012 IP
  2. RoseHosting

    RoseHosting Well-Known Member

    Messages:
    230
    Likes Received:
    11
    Best Answers:
    11
    Trophy Points:
    138
    #2
    Create a /robots.txt file with the following content:

    User-Agent: *
    Disallow: /de/
    Disallow: /jp/
    Disallow: /ru/
    Code (markup):
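    You can sanity-check rules like these before deploying them with Python's standard-library robots.txt parser. This is just a quick local check: the rules are supplied inline here (with trailing slashes so a path like /design is not accidentally caught by a bare /de prefix); on the live site you would fetch https://abc.com/robots.txt instead, using the example domain from this thread.

    ```python
    # Verify robots.txt rules locally with Python's stdlib parser.
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-Agent: *
    Disallow: /de/
    Disallow: /jp/
    Disallow: /ru/
    """

    parser = RobotFileParser()
    # parse() takes the file content as a list of lines.
    parser.parse(rules.splitlines())

    # English pages stay crawlable; the other language trees are blocked.
    print(parser.can_fetch("Googlebot", "https://abc.com/en/page.html"))   # True
    print(parser.can_fetch("Googlebot", "https://abc.com/de/seite.html"))  # False
    ```

    Keep in mind this only stops compliant crawlers from fetching the pages; it does not remove already-indexed URLs from Google.
    
    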
     
    RoseHosting, Jun 5, 2012 IP