HUGE chunk of a site lost all PR/SERPs. What would you do?

Discussion in 'Websites' started by 1-script.com, Jan 28, 2006.

  1. #1
    Hello everyone,
    Hope the gods of Google are treating your sites better today than they are treating mine.

    A huge 500K+ page chunk of one of my sites lost all PR and, consequently, its SERPs last night. I guess that does not qualify as a PR update - you can lose PR any day of the week, but you only gain PR when there really is an update.

    Anyway, the rest of the site, including the homepage and some other sections, did not change; only this section, which happens to be a forum, got demoted this way. Coincidentally (or not), due to some programming issues this was the only part of the site included in the Google Sitemap for that site. The section's pages are still indexed in Google; they just dropped to PR0 and got pushed off reasonably high SERPs. I used to have PR0 pages in that section that showed up on the first page of the SERPs, but not anymore.

    I am open to any suggestions about what can be done to improve the situation, as this section represents about 3/4 of the site's size as well as its traffic and revenue.

    Would you go to such an extreme as renaming the directory the section is in, thus changing the URLs, in the hope that G will re-discover and re-evaluate that section of the site?

    In any event, how do you go about finding out exactly what happened? A direct inquiry to G seems silly, since they are not going to answer any individual webmaster's requests - at least none of the few I sent in the past ever came back answered. If it were the whole site, I would consider it banned. What do you call a "partial ban" like this?

    Has anyone gotten a site's PR "repaired" in a situation like this?

    Once again, any suggestion is greatly appreciated.

    Cheers!
     
    1-script.com, Jan 28, 2006 IP
  2. enQuira

    enQuira Peon

    Messages:
    1,584
    Likes Received:
    250
    Best Answers:
    0
    Trophy Points:
    0
    #2
    When did this happen? It could be a temporary issue.
     
    enQuira, Jan 28, 2006 IP
  3. Smyrl

    Smyrl Tomato Republic Staff

    Messages:
    13,740
    Likes Received:
    1,702
    Best Answers:
    78
    Trophy Points:
    510
    #3
    I would sit tight and not do a thing until after the next pagerank backlink update.

    Good luck.

    Shannon
     
    Smyrl, Jan 28, 2006 IP
  4. 1-script.com

    1-script.com Well-Known Member

    Messages:
    805
    Likes Received:
    46
    Best Answers:
    0
    Trophy Points:
    120
    #4
    I certainly hope it IS temporary. It started around 18:00 Pacific on 01/27/2006, so it's been a full day now. Still, I'd want to react to this pretty quickly, so any troubleshooting/rectification suggestions are appreciated even if it turns out to be temporary. Also, I have to admit I've never had a temporary issue of this magnitude before, so I'm a bit skeptical that this is a temporary problem.
     
    1-script.com, Jan 28, 2006 IP
  5. Sharpseo

    Sharpseo Peon

    Messages:
    653
    Likes Received:
    52
    Best Answers:
    0
    Trophy Points:
    0
    #5
    Is the content all original? It seems like Google has been heavily refining its duplicate content filters.
     
    Sharpseo, Jan 28, 2006 IP
  6. Smyrl

    Smyrl Tomato Republic Staff

    Messages:
    13,740
    Likes Received:
    1,702
    Best Answers:
    78
    Trophy Points:
    510
    #6
    I have had a site that had been stable since 2002 disappear from certain datacenters. I am waiting, as I suggested above. Have you used McDar's tool to see if you are still indexed on some datacenters?

    http://www.mcdar.net/dance/index.php

    Is it indexed on BigDaddy datacenters?

    Shannon
     
    Smyrl, Jan 28, 2006 IP
  7. 1-script.com

    1-script.com Well-Known Member

    Messages:
    805
    Likes Received:
    46
    Best Answers:
    0
    Trophy Points:
    120
    #7
    Well, out of 500K pages, some content was original and some was not. The thing is, the entire section is out.

    By the way, should I suspect a 302 redirect hijacking?
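
    If you want to rule that out, here is a minimal sketch of what a check might look like, assuming Python with the "requests" library; the candidate URLs and the domain are placeholders you would fill in with suspicious external URLs found ranking for your own titles or snippets:
    import requests

    MY_DOMAIN = "www.yourdomainhere.com"  # placeholder

    # Placeholder list; in practice these would be external URLs spotted in the
    # SERPs that seem to rank with your content attached to them.
    CANDIDATE_URLS = [
        "http://example.com/redir?id=123",
    ]

    for url in CANDIDATE_URLS:
        resp = requests.get(url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        if resp.status_code == 302 and MY_DOMAIN in location:
            print("possible 302 hijack: %s -> %s" % (url, location))
    Code (markup):
    A 302 pointing at your pages from a URL you don't control is the pattern that the "302 hijack" worry refers to.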
     
    1-script.com, Jan 28, 2006 IP
  8. nddb

    nddb Peon

    Messages:
    803
    Likes Received:
    30
    Best Answers:
    0
    Trophy Points:
    0
    #8
    I've lost almost that many pages from the index fairly recently. If you didn't change anything to cause it, then beyond MAYBE doing some minor tuning, I wouldn't touch it. But that's just me... Googlebot is like a frightened animal: it seems to run away and take a while to come back. Just my humble opinion.
     
    nddb, Jan 28, 2006 IP
  9. 1-script.com

    1-script.com Well-Known Member

    Messages:
    805
    Likes Received:
    46
    Best Answers:
    0
    Trophy Points:
    120
    #9

    http://64.233.179.104 shows 16,000+ pages for the entire site, whereas all the others I've checked so far show 1,100,000 pages. I don't actually have that many pages; it should be more like 700,000. But who knows how they count, especially after that shouting match with Yahoo about whose (index) is bigger back in August!

    Also, running an allinurl: query on the section in question yields no results, even though the pages still have a cache. I obviously did not check all 500,000+ pages, but the main page of the section, other major pages, and randomly picked individual pages all have a cache.
     
    1-script.com, Jan 28, 2006 IP
  10. classifieds

    classifieds Sopchoppy Flash

    Messages:
    825
    Likes Received:
    51
    Best Answers:
    0
    Trophy Points:
    150
    #10
    Do the results from site: show any of your pages in the supplemental index?

    Also try this:
    Normal site:
    http://www.google.com/search?q=site%3Awww.yourdomainhere.com&start=0&ie=utf-8&oe=utf-8&client=firefox-a&rls=org.mozilla:en-US:official
    Code (markup):
    Then do this:
    Site: from the Google Alert
    http://www.google.com/search?ie=utf8&oe=utf8&num=10&q=site:www.yourdomainhere.com&lr=lang_en
    Code (markup):
    The second site: command will usually show you a lower number, one that does not include many of the pages that are in the supplemental Db.
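
    As a rough illustration of that comparison (a sketch only, assuming Python with the "requests" library; the "of about N" text it scrapes is simply how Google reported totals at the time, so treat the parsing as illustrative):
    import re
    import requests

    DOMAIN = "www.yourdomainhere.com"  # placeholder, as in the URLs above

    NORMAL_QUERY = ("http://www.google.com/search?q=site%3A" + DOMAIN +
                    "&start=0&ie=utf-8&oe=utf-8")
    ALERT_QUERY = ("http://www.google.com/search?ie=utf8&oe=utf8&num=10"
                   "&q=site:" + DOMAIN + "&lr=lang_en")

    def reported_count(url):
        """Fetch a results page and pull out the reported total, if present."""
        html = requests.get(url, headers={"User-Agent": "Mozilla/5.0"}).text
        match = re.search(r"of about <b>([\d,]+)</b>", html)
        return int(match.group(1).replace(",", "")) if match else None

    normal_total = reported_count(NORMAL_QUERY)
    alert_total = reported_count(ALERT_QUERY)
    print("normal site: count:", normal_total)
    print("alert site: count:", alert_total)
    if normal_total and alert_total:
        print("rough supplemental estimate:", normal_total - alert_total)
    Code (markup):
    The difference between the two totals is only a rough estimate of how many pages sit in the supplemental Db, but it is a quick way to watch the trend over time.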


    I had this exact same thing happen to one of my sites during Jagger (650k pages moved into the supplemental Db), and there were at *least* two causes: 1) near-duplicate pages, and 2) excessive internal linking with keyword-stuffed URLs.
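
    For the near-duplicate part, a minimal way to spot pages that are too similar is to compare word shingles between pages. A sketch, assuming Python (the page texts are placeholders; in practice they would come out of your own database or a crawl):
    def shingles(text, k=5):
        """Return the set of k-word shingles for a piece of text."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

    def jaccard(a, b):
        """Jaccard similarity of two shingle sets (0.0 = disjoint, 1.0 = identical)."""
        if not a and not b:
            return 1.0
        return len(a & b) / float(len(a | b))

    # Placeholder page texts keyed by URL.
    pages = {
        "/forum/topic-1.html": "example body text of the first page ...",
        "/forum/topic-2.html": "example body text of the second page ...",
    }

    THRESHOLD = 0.8  # arbitrary cutoff; tune it against pages you know are distinct
    urls = list(pages)
    for i in range(len(urls)):
        for j in range(i + 1, len(urls)):
            sim = jaccard(shingles(pages[urls[i]]), shingles(pages[urls[j]]))
            if sim >= THRESHOLD:
                print("near-duplicates (%.2f): %s <-> %s" % (sim, urls[i], urls[j]))
    Code (markup):
    Pairwise comparison obviously will not scale to 500K pages; a minhash/simhash-style fingerprint is the scalable version of the same idea, but the cutoff logic is the same.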

    I fixed all the problems that I could find and then submitted a reinclusion request.

    This particular site is just now beginning to show back up in the SERPs, the indexed pages are on the rise again, and most of the pages have more than PR0.

    Hopefully in your case it’s just a Db hiccup.
     
    classifieds, Jan 29, 2006 IP
    1-script.com likes this.
  11. idolw

    idolw Peon

    Messages:
    158
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #11
    idolw, Jan 29, 2006 IP
  12. 1-script.com

    1-script.com Well-Known Member

    Messages:
    805
    Likes Received:
    46
    Best Answers:
    0
    Trophy Points:
    120
    #12
    I think excessive interlinking is the problem. I had a system whereby a keyword would get converted into a link to a message about the same subject, similar to what Shawn is doing here at the bottom of the page. However, the anchor text was just the keyword (phrase) itself, nothing more. I guess you can call it stuffing if there is nothing more than the keyword itself in the link. This actually brings up an important question: how do you cross-promote your pages without tripping the "excessive internal linking" filter? I guess it is a very important question that deserves its own thread here, so I will do some research and maybe start a new one.
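
    If the auto-linker is the suspect, one conservative change is to cap how many keyword links get injected per page and to link only the first occurrence of each phrase. A sketch, assuming Python (the keyword map, cap, and function name are made up for illustration, not taken from the actual system):
    import re

    # Illustrative keyword-to-thread map; a real one would come from the forum Db.
    KEYWORD_LINKS = {
        "widget cleaning": "/forum/widget-cleaning-thread.html",
        "widget repair": "/forum/widget-repair-thread.html",
    }
    MAX_AUTO_LINKS_PER_PAGE = 3  # arbitrary cap; the point is simply to have one

    def auto_link(html_body):
        """Turn the first occurrence of each known phrase into a link, up to the cap."""
        links_added = 0
        for phrase, url in KEYWORD_LINKS.items():
            if links_added >= MAX_AUTO_LINKS_PER_PAGE:
                break
            pattern = re.compile(r"\b(%s)\b" % re.escape(phrase), re.IGNORECASE)
            # count=1 links only the first occurrence, not every mention on the page.
            new_body, n = pattern.subn(r'<a href="%s">\1</a>' % url, html_body, count=1)
            if n:
                html_body = new_body
                links_added += 1
        return html_body

    print(auto_link("Tips on widget cleaning and widget repair, plus more widget cleaning."))
    Code (markup):
    A real version would also need to skip text that is already inside a link, but the cap and the single-occurrence rule are what keep the cross-promotion from looking stuffed.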

    Another important Q: how do you request re-inclusion for part of a site rather than the whole site? Do you use the same form?
     
    1-script.com, Jan 29, 2006 IP
  13. mad4

    mad4 Peon

    Messages:
    6,986
    Likes Received:
    493
    Best Answers:
    0
    Trophy Points:
    0
    #13
    I assume that all your pages have different titles and h1 tags?

    This may seem basic, but I know of several sites that are getting fewer and fewer pages indexed in G, despite adding more and more pages, simply because the pages are being classed as too similar.
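
    A quick way to sanity-check that is to pull the <title> and <h1> from a sample of the affected URLs and count duplicates. A sketch, assuming Python with the "requests" library (the URL list is a placeholder):
    import re
    from collections import Counter

    import requests

    # Placeholder sample; in practice this would be a few hundred URLs from the affected section.
    SAMPLE_URLS = [
        "http://www.yourdomainhere.com/forum/topic-1.html",
        "http://www.yourdomainhere.com/forum/topic-2.html",
    ]

    def first_match(pattern, html):
        """Return the first regex capture from the page, or an empty string."""
        m = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
        return m.group(1).strip() if m else ""

    titles, h1s = Counter(), Counter()
    for url in SAMPLE_URLS:
        html = requests.get(url).text
        titles[first_match(r"<title>(.*?)</title>", html)] += 1
        h1s[first_match(r"<h1[^>]*>(.*?)</h1>", html)] += 1

    for label, counter in (("title", titles), ("h1", h1s)):
        for value, count in counter.items():
            if count > 1:
                print("duplicate %s on %d pages: %r" % (label, count, value))
    Code (markup):
    If a large share of the sampled titles or h1s collapse into a handful of values, that is exactly the "too similar" signal described above.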
     
    mad4, Jan 29, 2006 IP