Robots.txt file for Blog Page

Discussion in 'Search Engine Optimization' started by monalikedare, Jun 28, 2013.

  1. #1
    Hello,
    My blog is hosted on HubSpot CMS. My blog URL looks like http://blog.example.com. There are some duplicate content issues with the tag pages.
    e.g.
    http://blog.example.com/blog/?Tag=Quality
    http://blog.example.com/blog/?Tag=HIE

    HubSpot CMS does not let me add canonical tags, so I have tried to block these pages in robots.txt:
    User-agent: *
    Disallow: /blog/?Tag=Quality
    Disallow: /blog/?Tag=HIE
    But it has not had any effect so far.
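    Since robots.txt Disallow rules are prefix matches, a single rule like Disallow: /blog/?Tag= should cover every tag value instead of listing each tag URL separately. A minimal sketch of how to sanity-check such a rule offline with Python's standard-library urllib.robotparser (the rule text and URLs below are the examples from this thread; note that urllib.robotparser does plain prefix matching and does not support the * and $ wildcards Googlebot understands):

    ```python
    from urllib import robotparser

    # Hypothetical rules: one prefix Disallow covers every value
    # of the Tag parameter.
    RULES = """\
    User-agent: *
    Disallow: /blog/?Tag=
    """

    rp = robotparser.RobotFileParser()
    rp.parse(RULES.splitlines())

    # Tag pages match the prefix and are blocked...
    print(rp.can_fetch("*", "http://blog.example.com/blog/?Tag=Quality"))  # False
    print(rp.can_fetch("*", "http://blog.example.com/blog/?Tag=HIE"))      # False
    # ...while the blog index itself stays crawlable.
    print(rp.can_fetch("*", "http://blog.example.com/blog/"))              # True
    ```

    Keep in mind that a Disallow only stops crawling; URLs that are already indexed can take a while to drop out.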

    Please tell me how I can get rid of these pages?
    Thanks in advance for your help.
     
    monalikedare, Jun 28, 2013 IP
  2. #2
    iony Greenhorn
    Why do you need to block these pages? Just leave them as they are. Google knows better than anyone what they are and whether they are duplicates.
     
    iony, Jun 28, 2013 IP