Getting Google To Remove Pages From Index

Discussion in 'Search Engine Optimization' started by bigbluesky2006, Sep 24, 2008.

  1. #1
    I have a site which has around 1200 pages indexed in Google.

    The problem is that many of these pages contain duplicate titles and descriptions. It isn't really feasible to fix all the page titles and descriptions.

    I have created a robots.txt file which excludes all the offending pages.

    Will Google drop these pages from the index now that the robots.txt file excludes them from being spidered?

    I also generated a new sitemap and submitted it via Google Webmaster Tools.

    If this won't work can anyone suggest exactly how I should deal with this problem?

    There are only around 200 pages I really want in the index. :confused:
     
    bigbluesky2006, Sep 24, 2008 IP
  2. ashein

    ashein Banned

    #2

    Then why did you create the other 1,000 pages if you don't want them indexed by Google? :eek:

    Robots.txt is a good idea, and you could also take those pages offline entirely.
     
    ashein, Sep 24, 2008 IP
  3. Monalisha

    Monalisha Guest

    #3
    You can use the remove URL option in your Google Webmaster Tools account. This facility is available under the Tools section.
     
    Monalisha, Sep 24, 2008 IP
  4. bigbluesky2006

    bigbluesky2006 Active Member

    #4
    The problem with the additional 1000 pages is that they duplicate the page title and meta description. I don't want to have to manually edit 1000 pages for obvious reasons.

    The pages simply display different graphics and so there is no need for them to be in the Google index. As long as the first page in each category is in the index that is all that matters as visitors will browse through from the main page until they find the graphic they want.

    I will take a look at the URL removal option in GWT to see what it provides. If I have to sit and enter 1,000 URLs one at a time, I'd imagine that would take ages. Also, do you know how long it takes Google to remove pages?
     
    bigbluesky2006, Sep 25, 2008 IP
  5. Sillysoft

    Sillysoft Active Member

    #5
    No, I don't believe Google will drop those pages from the index on its own. Blocking them in robots.txt stops Google from crawling them going forward, but pages that are already indexed can stay there for some time. I believe that's why they put the removal option into Webmaster Tools, so you can take them out of the index yourself.

    On a side note, with that many pages, perhaps you should look into automating your site? Updating the metadata would then be a lot faster. That's assuming all the pages are static right now.
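    If the pages are static HTML, that metadata edit can be scripted rather than done by hand. Here is a minimal sketch in Python (the directory layout, and the assumption that every page has a lowercase <head> tag, are mine, not from the thread) that stamps a noindex robots meta tag into each page:

```python
# Sketch: bulk-add a "noindex" meta tag to static HTML pages.
# Assumes pages use a plain lowercase <head> tag; adjust to taste.
import pathlib

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def add_noindex(html: str) -> str:
    """Insert the noindex tag right after <head>, unless it is already there."""
    if NOINDEX_TAG in html:
        return html
    return html.replace("<head>", "<head>\n" + NOINDEX_TAG, 1)

def process_site(site_dir: str) -> int:
    """Rewrite every .html file under site_dir; return how many were changed."""
    changed = 0
    for page in pathlib.Path(site_dir).rglob("*.html"):
        original = page.read_text(encoding="utf-8")
        updated = add_noindex(original)
        if updated != original:
            page.write_text(updated, encoding="utf-8")
            changed += 1
    return changed
```

    Running process_site("public_html") once would tag every .html file under that folder, and Google drops the pages as it recrawls them. Note that for the tag to work, the pages must stay crawlable: a robots.txt block would stop Google from ever seeing the noindex.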
     
    Sillysoft, Sep 25, 2008 IP
  6. zurpit.com

    zurpit.com Peon

    #6
    In Google Webmaster Tools it says that having duplicate meta titles and descriptions won't penalize you, but if you want to remove those pages anyway, you can do it through Webmaster Tools.
     
    zurpit.com, Sep 25, 2008 IP
  7. catanich

    catanich Peon

    #7
    Verify that your robots.txt file has the pages coded correctly. Go to WMT, open Tools, and select the URL Removal Tool. Fill it out, and after about three days they will be gone.
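    One way to sanity-check the robots.txt rules before submitting removals is to run them through Python's standard-library robots parser. A quick sketch (the rules and URLs are placeholders, not the OP's actual site):

```python
# Sketch: verify robots.txt rules locally with Python's stdlib parser.
import urllib.robotparser

rules = """User-agent: *
Disallow: /graphics/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Pages under /graphics/ should be blocked; everything else allowed.
print(rp.can_fetch("Googlebot", "http://example.com/graphics/page1.html"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))           # True
```

    If can_fetch returns True for a page you meant to block, fix the Disallow pattern before filing the removal request.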
     
    catanich, Sep 25, 2008 IP
  8. SearchBliss

    SearchBliss Well-Known Member

    #8
    Why is there so much duplicate content? Are they dynamic pages?
     
    SearchBliss, Sep 25, 2008 IP