Greetings, Looking for a bit of help. We have lost most of our Google traffic since Feb. 2nd. To add insult to injury, we lost another portion of it in the update that is going on right now. It is really frustrating. Our site has been online since 1999 and has enjoyed good placements in the SERPs. Yahoo and MSN still love our site: on Yahoo and MSN we rank in the top ten for terms where we rank 800 or worse on Google.

Here is my question. The only thing I can think might be wrong with our site is the link pages we have created for our articles. We have over 20,000 articles on the site. We have pages that just list the articles in alphabetical order, with not much more on the page. Each page has fewer than 100 links; in all, there are about 350 of these link pages. Could these be causing a penalty of some type? We also have other pages that list the articles with a short description. These are not in alphabetical order; there are about 700 of these pages. All of the articles get spidered and indexed with no problem.

Should we get rid of the alphabetical pages? Could they be causing a penalty because they contain only links? Looking for some expert opinion. I am also wondering if this is causing a penalty because of too much repetition of the same anchor text. Each article has the following pages linking in:
- A page with a short description of the article, using the title of the article.
- An alphabetical listing of the article - just the title.
- Articles on the same subject are also linked together, so the navigation of each article links out to other articles.

Therefore, it is possible that the anchor text for each article is repeated three times. Any thoughts? Should we remove the alphabetical pages? Those are the ones we could do without. Thanks.
I'd go up to 250 links per page now, not 100. I'd go with the version that has the short descriptions and if possible group them on a theme basis not abc's. Have you considered a tree structure?
Dominic, Thanks for the reply. What is the tree structure? I will Google it. I should probably get it down to one way of linking everything together. Thanks.
This is a tree structure: http://www.xperts.ro/temp/tree.JPG Sorry for the bad drawings. Made it in Paint in 20 seconds. Recently had Corel problems.
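In case the drawing doesn't make it clear, a rough text sketch of the same idea (the theme names here are just made-up placeholders): the home page links to a handful of theme hub pages, each hub page carries the short-description listings for its own articles, and articles cross-link only within their theme.

```
Home
  +-- Theme hub: "Gardening" (listing page with short descriptions)
  |     +-- Article: "Pruning Roses"
  |     +-- Article: "Soil pH Basics"
  |     +-- ...
  +-- Theme hub: "Cooking"
  |     +-- Article: "Bread Starters"
  |     +-- ...
  +-- ...
```

With ~20,000 articles split across themed hubs, no single page has to carry hundreds of bare alphabetical links, and the anchor text pointing at each article comes from fewer, more topical pages.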
It's possibly not even a link structure issue - two things come to mind immediately: 1. Do you have a good spread of decent backlinks to the site? 2. Is your content actually unique and useful?
Hi, Brian - The site has all natural backlinks in Google. We have never purchased a link. We have many links from good sources: CNN, About.com, WIKIPEDIA, PCWorld, DMOZ, Yahoo, etc... I think the current count is 2740. All of the content is unique. We have spent a great deal of money locating content and moving it to the web. Nothing system generated. I will check the tree structure... I appreciate everyone's thoughts and ideas. Thanks.
I don't think you have done anything wrong. You are not unique in having this problem. I'm of the firm opinion that Google is badly messed up and broken. Everyone seems to still think that there is some grand plan or master intelligence behind everything Google does. But the sign of any grand plan is consistency and reproducibility, and Google fails on both counts at the moment. Don't change a thing! There is no guarantee that it will make any difference in your Google placement, and it just might screw up your Yahoo and MSN placement.
What other possible explanation or conclusion can one come to? There is no logic. Their results change wildly from day to day. Their datacenters are frequently miles out of sync for days at a time. Their reported backlinks and cached pages vacillate between current dates and old dates. I have a small content site with 128 pages. They have reported as many as 115 pages indexed, but the count vacillates daily between lows of 50 and highs of 115. See McDar's daily reports for evidence of this. The only conclusion one can draw from this is that they are so broken they can't even count.

Now the Google apologists say that they still deliver relevant SERPs, but there are millions of pages out there that are relevant for most searches or subjects, so they would have trouble not delivering relevant pages. In my opinion, though, they are not offering the most relevant information, nor - with their sandbox trick - the most current information on any subject.

I think it is totally irrational to justify or explain the current results as an anti-SEO campaign. Even if it were anti-SEO, we should still see consistency from day to day. If they don't like a site because they think it has been over-optimized, then demote it or remove it from the SERPs, but don't jerk it back and forth from one moment to the next like they are doing.

My current SEO strategy is to totally ignore Google. I'm doing what I think will get my sites recognized by MSN and Yahoo -- and with some remarkable results -- but I'm totally ignoring Google until I see some evidence of them being able to produce consistent and reproducible results.
Compar - I think you are correct... I am moving to this camp. We tried all the stuff mentioned as possible issues:
- 301 redirect from non-www to www.
- Contacted sites to remove 302 redirects pointing to our site.
- Checked all relative URLs.
- etc...

Nothing has made a difference. Realistically speaking, the average person who puts up a website has no idea about 301s, 302s, relative vs. non-relative URLs, etc... They put together a page in FrontPage, find the cheapest web hosting on a shared IP address - and they rank! I am figuring out that there is really nothing you can do. I am going to spend more time figuring out what Yahoo and MSN like.
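For anyone else wanting to try the non-www to www 301 redirect mentioned above, here is a minimal sketch of the usual .htaccess rule. This assumes an Apache host with mod_rewrite enabled, and example.com is just a placeholder for your own domain:

```
# .htaccess - send a permanent (301) redirect from example.com to www.example.com
RewriteEngine On
# Match requests whose Host header is the bare domain (case-insensitive)
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
# Redirect to the same path on the www hostname; R=301 makes it permanent, L stops further rules
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

You can verify it is working by fetching http://example.com/somepage.html and checking that the server answers with a "301 Moved Permanently" status and a Location header pointing at the www version, so the engines consolidate both hostnames onto one.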