Google Sending Me Straight To Hell!

Discussion in 'Google' started by slitmywrists, May 6, 2009.

  1. #1
    I launched a site in late January with about 17,000 very LSI friendly pages featuring all our own funny text. It was about 3 years worth of work.

    Since then we've added about 1,000 pages with consistent regularity. The site is a 3 year old domain and was designed to be SEO optimal in every facet...titling, metas, robots, SEF URLs, etc.

    The day we launched, we had in place a backlinking relationship with a major website (PR 5) -- 10 year old domain and very trusted.

    Massive backlinking campaign was undertaken to get PR.

    Ok, so with the Mar/Apr PR update...tons and tons of our pages had PR...literally hundreds ranging from 2 - 4.

    Traffic was up to 4,000 uniques a day, and we expected 20,000 based on the revised PageRank.

    Instead, about a week later...all our traffic started dying. It took me a while to realize that ALL of our now 18,000 pages had been moved to Supplemental Results.

    Worse, crawl rates and indexed pages through Webmaster Tools are dropping like flies.

    Why would Google reward us with tons of Page Rank and then a week later start eradicating us from the index? I figured it was the new algo.

    We're a totally legit site with relevant content that should show up in the SERPs. We have ridiculous longtail (had).

    According to Webmaster Tools, we now only have 22 pages with backlinks. That number should be well over 10,000.

    Look, I'm fairly savvy about SEO. So please don't respond with some novice suggestions.

    I'm not going to post or talk about the URL. I'm mulling some ideas to try to jumpstart back on the right track.
     
    slitmywrists, May 6, 2009 IP
  2. Camay123

    Camay123 Well-Known Member

    Messages:
    3,423
    Likes Received:
    86
    Best Answers:
    0
    Trophy Points:
    160
    #2
    Maybe the massive backlink campaign raised a red flag.
     
    Camay123, May 6, 2009 IP
  3. slitmywrists

    slitmywrists Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Yeah, I certainly agree...especially coupled with the domain age.

    Nevertheless, it's weird that we get tons of page rank and a week later start dropping from the SERPs.

    I know that the new algo was trying to weed out blackhatters abusing backlinks. So not sure if we just got caught in that or what.

    We haven't been de-indexed. It just seems that Google has less trust for our pages at this point, even though a bunch of links are from an unbelievably trustworthy source. And it seems that the only way to get out of this mess is more backlinks. Kind of a scary proposition since that may have been the cause.
     
    slitmywrists, May 6, 2009 IP
  4. RightMan

    RightMan Notable Member

    Messages:
    8,294
    Likes Received:
    450
    Best Answers:
    0
    Trophy Points:
    205
    #4
    You surely got good results in the form of PR for several of your pages due to your huge backlinking effort, but that seems to have undone your SERPs.

    Always try to maintain a good balance in your efforts so they appear legitimate and genuine. Do not go overboard with such activity, for it is bound to have negative repercussions!

    Regards,

    RightMan
     
    RightMan, May 6, 2009 IP
    web_hunk likes this.
  5. slitmywrists

    slitmywrists Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
    I guess I'm just wondering if the basic issue was blasting onto the scene with 18,000 pages. Did that, in and of itself, cause the algo to barf during the recent index update?

    I can beat myself up about coulda, shoulda, woulda.

    Ultimately, just trying to assess the best way out of this without compounding the problem. We still add tons of content daily, but what's the sound of a tree falling in the woods? Traffic = money, and we can't continue for months and months by being indirectly/inadvertently blackballed.

    I've got PR4 pages that are buried at the bottom of the SERPs. And with oodles of relevant, LSI friendly content.
     
    slitmywrists, May 6, 2009 IP
  6. SuPrAiCeR69

    SuPrAiCeR69 Peon

    Messages:
    216
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #6
    Without looking into your site it's hard to come to a definitive conclusion, although I'll do my best to rule out some obvious causes.

    First off, how unique is your content? Have you seen copies of your content on other sites which may have been indexed quicker than yours at the time of launch, on domains with more trust?
    Have you played around with anything in GWT (specifically geo-targeting if it's a generic TLD), on-page, or server-end? (Go back from a week before to the day everything started going wrong.)
    Have you submitted a request for re-inclusion, as it could be a penalty for 'paid links'? Again, I cannot be definite as I cannot see your site or the PR5 site you gained links from.
     
    SuPrAiCeR69, May 6, 2009 IP
  7. slitmywrists

    slitmywrists Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #7
    I don't think duplicate content is an issue.

    Everything was trending great from Jan - Mar. Page Rank update came out in the beginning of April and then on April 7 the wheels came off the SERPs.

    Lots of talk in the forums about the Google dance at that time, and I was reading the forums about 26 times a day.

    Now it's almost comical how Google is shunning our pages. I don't see the point in submitting for re-inclusion because we haven't been de-indexed. I just assume all our pages are in Supplemental. And according to a Google spokesman, the best way out of supplemental is backlinks.

    I also changed our default view from 5 articles to 10, so that will shake up literally everything.
     
    slitmywrists, May 6, 2009 IP
  8. SuPrAiCeR69

    SuPrAiCeR69 Peon

    Messages:
    216
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #8
    This brings me to my next point: you don't have to be de-indexed to submit a reconsideration request. Loss of rankings could be due to an incurred penalty. Go through Google's guidelines and check everything is still in order. Loss of links and a drop in rankings sounds like 'paid' links, which Google has now devalued.

    Pay attention to the last 2 headings in the guidelines doc and submit for reconsideration.
     
    SuPrAiCeR69, May 6, 2009 IP
  9. slitmywrists

    slitmywrists Peon

    Messages:
    5
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #9
    If there were issues with any links, you'd think that we would not have gotten PageRank for so many pages.

    Submitting for re-inclusion basically says, "Hey, I think I did something wrong, but I'm not sure. So please have a human stick their nose into everything and remove all doubt." Sounds like IRS tax amnesty.
     
    slitmywrists, May 6, 2009 IP
  10. lindamood1

    lindamood1 Active Member

    Messages:
    1,705
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    78
    #10
    You need to check for duplicate titles; duplicate title and meta tags will put you in supplemental results.
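    One quick way to act on this advice is to scan your pages for TITLE tags that repeat across URLs. A minimal Python sketch; the URLs and HTML below are made-up stand-ins for pages you'd actually fetch from your own site:

    ```python
    import re
    from collections import Counter

    def extract_title(html):
        """Pull the contents of the first <title> tag, or None if absent."""
        m = re.search(r"<title[^>]*>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
        return m.group(1).strip() if m else None

    def find_duplicate_titles(pages):
        """pages: dict of url -> html. Returns {title: [urls]} for any
        title used by more than one URL."""
        titles = {url: extract_title(html) for url, html in pages.items()}
        counts = Counter(t for t in titles.values() if t)
        return {t: [u for u, pt in titles.items() if pt == t]
                for t, n in counts.items() if n > 1}

    # Hypothetical pages standing in for fetched HTML:
    pages = {
        "/jokes/1": "<html><head><title>Funny Jokes</title></head></html>",
        "/jokes/2": "<html><head><title>Funny Jokes</title></head></html>",
        "/about":   "<html><head><title>About Us</title></head></html>",
    }
    dupes = find_duplicate_titles(pages)
    # "Funny Jokes" is shared by two URLs, so it is reported as a duplicate.
    ```

    Any title that shows up with more than one URL is a candidate for the duplicate-title problem described above.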
     
    lindamood1, May 6, 2009 IP
  11. morkat

    morkat Peon

    Messages:
    208
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #11
    If you do exactly what Google's guidelines ask, you can't achieve anything; the only thing you have to pay attention to is the needs of your visitors.
     
    morkat, May 6, 2009 IP
  12. .SR

    .SR Well-Known Member

    Messages:
    1,089
    Likes Received:
    52
    Best Answers:
    0
    Trophy Points:
    140
    #12
    I am confused about your comment. :confused:

    .SR
     
    .SR, May 6, 2009 IP
  13. Traffic-Bug

    Traffic-Bug Active Member

    Messages:
    1,866
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    80
    #13
    Once Google has lost its trust in your site, it is very difficult to convince Google otherwise. The best you can do is submit your site again to Google, put in a 'reconsideration request', and hope it takes effect.
     
    Traffic-Bug, May 6, 2009 IP
  14. SuPrAiCeR69

    SuPrAiCeR69 Peon

    Messages:
    216
    Likes Received:
    3
    Best Answers:
    0
    Trophy Points:
    0
    #14
    slitmywrists,

    If you're 99% sure about your links, then we'll move on. However, I still cannot understand why you won't reveal your URL. If you come on here for help, be prepared to share your URL so users can investigate.

    1. Create an XML Sitemap - a sitemaps.org-style XML file.
    2. Add Custom Page Titles. Don't use a universal header/footer across the site, including META tags. Create unique TITLE tags.
    3. Add Custom META Descriptions. Populate custom META description tags.
    4. Fix 404 Headers. Google might be seeing your 404s as legitimate pages (200s). Once fixed, it will start removing bad pages from the index, and results should be noticeable within 2 weeks.
    5. Create Data-not-found 404s. Modify non-existent pages to return a 404 so spiders can disregard them.
    6. Add robots.txt. If your site is dynamic, the index may be carrying many duplicates of some pages (e.g. same page, slightly different URL). Purge printable versions of pages (links with "?print=1", for example). Notice results within two weeks, much like the 404s.
    7. Add NOODP, NOYDIR tags. Will help with Yahoo, possibly Google.
    8. Create Shorter, Friendlier URLs.
    9. Reveal More Data to Spiders.
    10. Change the Home-page Title. Go through the index; it may occur to you that all top 10 sites are using the same initial word (e.g. "Funny"). Flip the word order in the home-page TITLE tag to shake things up a little more.
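    Points 4 and 5 above (the "soft 404" problem) can be sanity-checked with a small script: request a deliberately bogus URL on your site and see whether the status line says 404 or 200. A minimal Python sketch of the classification step; the error-marker strings are assumptions you'd tailor to your own error template:

    ```python
    def is_soft_404(status_code, body):
        """True if a URL answers 200 OK but the body looks like an error page.
        Google indexes such 'soft 404s' as real (often near-duplicate) pages
        instead of dropping them from the index."""
        if status_code != 200:
            return False  # real 404/410 responses are handled correctly by spiders
        # Hypothetical markers; replace with phrases from your own error page:
        error_markers = ("page not found", "no results found", "data not found")
        return any(marker in body.lower() for marker in error_markers)

    # Simulated responses (in practice, fetch something like
    # /this-page-should-not-exist and inspect the returned status line):
    print(is_soft_404(200, "<h1>Page Not Found</h1>"))  # soft 404: needs a real 404 header
    print(is_soft_404(404, "<h1>Page Not Found</h1>"))  # already correct
    print(is_soft_404(200, "<h1>Funny Jokes</h1>"))     # a normal page
    ```

    Any URL flagged this way should be changed server-side to return a genuine 404 status, per points 4 and 5.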
     
    SuPrAiCeR69, May 6, 2009 IP
  15. morkat

    morkat Peon

    Messages:
    208
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #15
    I just wanted to say, break the norm or you can't achieve what you want.
    :D
     
    morkat, May 6, 2009 IP
  16. Michaelr

    Michaelr Peon

    Messages:
    535
    Likes Received:
    5
    Best Answers:
    0
    Trophy Points:
    0
    #16
    How did you determine your pages went to a supplementary index?
     
    Michaelr, May 6, 2009 IP
  17. tonsblogger

    tonsblogger Active Member

    Messages:
    330
    Likes Received:
    2
    Best Answers:
    0
    Trophy Points:
    53
    #17
    I believe SuPrAiCeR69's guideline above is very useful; try it.

     
    tonsblogger, May 6, 2009 IP