Need to Get WordPress Pages Like site.com/page/2 Deindexed

Discussion in 'WordPress' started by jacky8, Jul 6, 2009.

  1. #1
    Please help.

    I saw today that I have about 150 pages indexed in Google when I search
    site:mysitename.com/page

    I need to get them all deindexed. Please let me know what PHP code to place, and where to place it, to get all such pages deindexed from Google and fix the duplicate content issue.
    Also, can I handle this directly in Google Webmaster Tools?

    What else should a WordPress blog do to get such things deindexed from Google? Please help.

    Thanks.
     
    jacky8, Jul 6, 2009 IP
  2. PressGuy

    PressGuy Banned

    Messages:
    247
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    0
    #2
    You need to put a removal request in Google Webmaster Tools for them to be de-indexed.

    Then install a WordPress plugin that prevents duplicate content.
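    Since jacky8 asked for actual PHP: something like this in your active theme's functions.php should do it. A rough sketch only — is_paged() and the wp_head hook are core WordPress, but the function name here is just made up, and I haven't tested it on your exact setup:

```php
<?php
// Rough sketch: print a "noindex,follow" meta tag on paginated pages
// (/page/2/, /page/3/, ...) so Google drops them from the index but
// still follows the links on them. Goes in the active theme's
// functions.php. is_paged() is true on page 2 and beyond of any
// listing (home, category, archive).
function dp_noindex_paged_archives() {
    if ( is_paged() ) {
        echo '<meta name="robots" content="noindex,follow" />' . "\n";
    }
}
add_action( 'wp_head', 'dp_noindex_paged_archives' );
```

    Already-indexed /page/ URLs only drop out after Google recrawls them and sees the tag, so expect it to take a while.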
     
    PressGuy, Jul 6, 2009 IP
  3. jacky8

    jacky8 Active Member

    Messages:
    1,416
    Likes Received:
    23
    Best Answers:
    0
    Trophy Points:
    80
    #3
    How can I put in that request?

    I wish there was a paid Google customer support. I would gladly pay them to sort out issues like this and answer certain queries...
     
    jacky8, Jul 6, 2009 IP
  4. ~kev~

    ~kev~ Well-Known Member

    Messages:
    2,866
    Likes Received:
    194
    Best Answers:
    0
    Trophy Points:
    110
    #4
    Add a line to your robots.txt file to block the pages from being indexed. But it could take 2-3 weeks for Google to finally remove those pages from the search results.

    If you are submitting a sitemap, be sure those pages are not included in it.
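    For reference, the line ~kev~ means looks like this — assuming your pagination URLs all live under /page/, which is the WordPress default with pretty permalinks:

```
User-agent: *
Disallow: /page/
```

    robots.txt just sits in the root of the site (mysite.com/robots.txt); you don't upload it anywhere in Webmaster Tools, Google fetches it from there on its own.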
     
    ~kev~, Jul 6, 2009 IP
  5. jacky8

    jacky8 Active Member

    #5
    What code should I add to robots.txt if I want to remove all pages like mysite.com/page/1?
    There are hundreds of such pages. Do I need to put the code in Google Webmaster Tools or elsewhere?
    Sorry, I am a bit ignorant about such technical stuff.
     
    jacky8, Jul 9, 2009 IP
  6. Kerosene

    Kerosene Alpha & Omega™ Staff

    Messages:
    11,366
    Likes Received:
    575
    Best Answers:
    4
    Trophy Points:
    385
    #6
    This is the default robots.txt file I use on most of my WP-powered sites.
    It seems to work for me.

    The Disallow: /page/ line is the one you need.

    User-agent: Googlebot
    Disallow: /*/feed/$
    Disallow: /*/feed/rss/$
    Disallow: /*/trackback/$
    
    User-agent: *
    Disallow: /wp-
    Disallow: /feed/
    Disallow: /trackback/
    Disallow: /rss/
    Disallow: /comments/feed/
    Disallow: /page/
    Disallow: /date/
    
    Code (markup):
    If all you want to do is block pages, then try:

    User-agent: *
    Disallow: /page/
    Code (markup):
    I have this running on several of my *high ranking* sites, and it seems to work fine. BUT if anyone sees any issues with it, PLEASE let me know! :p
     
    Kerosene, Jul 9, 2009 IP
  7. jacky8

    jacky8 Active Member

    #7
    Thanks a lot, Kerosene. That really helped. I have saved the robots.txt file and will use it on my blogs.

    I haven't seen site.com/date/ before.
    Is it for the date-based archive pages?


    Also, will this work fine?
    User-Agent: Googlebot
    Disallow: /page/
    Allow: /
     
    jacky8, Jul 9, 2009 IP
  8. websol

    websol Active Member

    Messages:
    45
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    93
    #8
    Hi

    I am confused. Why do you want to de-index these pages, when the rest of the world is dying to get their pages indexed?

    Prasaad
     
    websol, Jul 30, 2009 IP
  9. Designstrike

    Designstrike Peon

    Messages:
    110
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    0
    #9
    He said it's because of duplicate content. You can see what duplicate content you have in your GWT account, so check it often.
     
    Designstrike, Jul 31, 2009 IP