Edit robots.txt for my blog

Discussion in 'robots.txt' started by tarakesh.rao, Mar 19, 2009.

  1. #1
    Hi,
    I recently moved my blog to my new domain
    and added it to Google Webmaster Tools.
    I found that more than 150 pages were not indexed or failed to be indexed,
    so I checked robots.txt:
    http://www.blackjackbox.info/robots.txt
    --------------------------------------------------------------------------

    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /search

    Sitemap: http://www.blackjackbox.info/feeds/posts/default?orderby=updated

    --------------------------------------------------------------------------
    Can I edit this file to allow the restricted pages to be indexed?
    Please help.
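    As a quick check, Python's standard-library robots.txt parser can show which URLs those rules block; a minimal sketch (the sample URLs below are made up for illustration):

    ```python
    # Parse the posted robots.txt rules with the standard-library parser
    # and check a couple of example URLs (the URLs are hypothetical).
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /search
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Ordinary crawlers (Googlebot included) are kept out of /search ...
    print(rp.can_fetch("Googlebot", "http://www.blackjackbox.info/search/label/poker"))       # False
    # ... but regular post pages are allowed
    print(rp.can_fetch("Googlebot", "http://www.blackjackbox.info/2009/03/some-post.html"))   # True
    ```

    So if most of the 150 missing pages live under /search, this rule is the likely cause.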
     
    tarakesh.rao, Mar 19, 2009 IP
  2. hans

    hans Well-Known Member

    #2
    1. First step:
    validate your robots.txt using your Google Webmaster account; see
    http://www.google.com/support/webmasters/bin/answer.py?answer=35237

    2.
    Your block

    User-agent: Mediapartners-Google
    Disallow:

    is actually valid as written: an empty Disallow means "disallow nothing", so the AdSense crawler (Mediapartners-Google) may fetch everything. Adding a slash, i.e.

    User-agent: Mediapartners-Google
    Disallow: /

    would do the opposite and block that crawler from your whole site, so do NOT add the /.

    Besides that you exclude nothing - except the folder /search, which appears to be identical/duplicate to your domain root content.
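    The difference between an empty Disallow and Disallow: / can be confirmed with Python's stdlib parser; a small sketch:

    ```python
    # Contrast an empty Disallow with "Disallow: /" using the stdlib parser.
    from urllib.robotparser import RobotFileParser

    allow_all = RobotFileParser()
    allow_all.parse(["User-agent: Mediapartners-Google", "Disallow:"])

    block_all = RobotFileParser()
    block_all.parse(["User-agent: Mediapartners-Google", "Disallow: /"])

    url = "http://www.blackjackbox.info/"
    print(allow_all.can_fetch("Mediapartners-Google", url))  # True  - empty Disallow permits everything
    print(block_all.can_fetch("Mediapartners-Google", url))  # False - "/" blocks the whole site
    ```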

    Actually I see only ONE page;
    all other internal links appear to go into your /search and seem to have identical content.

    Allowing Google to index all those identical pages may result in a penalty for duplicate content .... may!
     
    hans, Mar 30, 2009 IP
  3. linkdealer

    linkdealer Active Member

    #3
    Yes, you can edit the robots.txt file.
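    If you do want those /search pages crawled, one possible edit (a sketch only; keep the duplicate-content warning above in mind) is to leave every Disallow empty, which restricts nothing:

    ```
    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow:

    Sitemap: http://www.blackjackbox.info/feeds/posts/default?orderby=updated
    ```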
     
    linkdealer, Jun 9, 2009 IP