Incorrect Robots.txt File Resulting in Google Indexing Mess!

Discussion in 'Search Engine Optimization' started by Think-Big, Feb 12, 2009.

  1. #1
    Hello All,

    I am running a self-hosted WordPress blog, and I found out I submitted an incorrect robots.txt file which "Allowed" everything to be indexed. Instead of indexing the post pages, Google is indexing the "tag" pages; see: site:http://www.dealmakersblog.com. As you can see, most of the indexed pages are "tag" pages with poor SEO.

    I fixed the robots.txt by disallowing the pages I don't want indexed and pinged the site.
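    For anyone in the same situation, a minimal sketch of what such a fix might look like for a WordPress blog (the exact paths are assumptions; adjust them to match your permalink structure):

    ```
    # robots.txt — example only; paths below are assumed, not the OP's actual file
    User-agent: *
    # Block the tag archive pages from being crawled
    Disallow: /tag/
    # Common WordPress archive/admin paths people often also exclude
    Disallow: /wp-admin/
    # Leave posts and pages crawlable (everything not disallowed is allowed)
    ```

    Note that Disallow only stops crawling; pages already in the index may linger until Google recrawls, which is why the removal request option in Webmaster Tools can help.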

    Questions: 1. Will Google reindex the site according to the newly updated robots.txt file?
    2. What can I do about the 30+ posts that I have already published?
    3. What can I do to speed up the process?

    Thanks for all your help!!!
     
    Think-Big, Feb 12, 2009 IP
  2. proxywhereabouts

    proxywhereabouts Notable Member

    Messages:
    4,027
    Likes Received:
    110
    Best Answers:
    0
    Trophy Points:
    200
    #2
    I think you can email or contact Google to exclude your site, and it will be reindexed.
    Try looking for this option in Google Webmaster Tools.
     
    proxywhereabouts, Feb 12, 2009 IP
  3. jitendraag

    jitendraag Notable Member

    Messages:
    3,982
    Likes Received:
    324
    Best Answers:
    1
    Trophy Points:
    270
    #3
    You can use Webmaster Tools to easily remove individual pages or have the whole site deindexed. Your posts will be indexed as well; just give Google some time :)
     
    jitendraag, Feb 12, 2009 IP