Like many webmasters, I am unhappy with the effects to date of the Jagger update on my site's rankings. I am planning to submit an idea to Google for a screen that would give webmasters feedback on the reasons for their sites' good or not-so-good rankings. I previously posted this idea as a comment on one of Matt Cutts' blog entries (http://www.mattcutts.com/blog/more-info-on-updates/), but I have decided to spread the idea around a few other places to gauge reaction and suggestions. I will then submit the idea to Google through their feedback mechanism, just to see what, if any, reaction there will be...

**************************

Before I start, let me say that I would not have to write this but for the massive success of Google as a search engine. In my own case, about 80% of search engine traffic comes from Google, or at least it used to...

I have concerns about the way in which this latest Google update is being carried out. First of all, the secrecy about what is being done. If Google are hitting reciprocal links heavily, or text ads (e.g. Co-op) or link exchanges (e.g. Link Vault), why don't they simply come out and say so, officially? In my opinion, the secrecy surrounding the changes, far from helping legitimate sites, actually plays into the hands of Black Hat SEOs. They can now say, "Well, you tried honest SEO, and look where it got you. Come over to our side and let us show you how it's done." The secrecy surrounding this update will help to enrich Black Hat SEOs who promise short-term solutions to website ranking problems.

Secondly, I don't agree in principle with the idea of nailing webmasters who openly participate in schemes like Co-op (if this is what is being done). We all know that Google seems to like links contained within a paragraph of text, but how difficult is it to get those? How many bloggers will say, "I found this great site called widgetworld that sells widgets"? A webmaster who is starting out will never rank for anything if, as it now seems, only links within a paragraph of text have any value at all. This also seems to mean that directory links are practically worthless as well (unless it's from DMOZ, which is a topic in itself).

I have an axe to grind here because I set up a content-driven site in early 2004, covering an obscure business topic. I worked hard at building content, hoping that someday it might be considered an authority or hub site. It's nothing to do with mortgages, real estate, online pharmacy, or any of the other spammer favourites. The homepage is currently down about 700 places. I don't know if it will make its way back up or not at this stage.

I do think that the Google algo needs to be adjusted in some way to cut some slack for what I would call 'labour of love' sites. You know the sort of thing: niche site, webmaster frequently adding content, tweaking design, posting blog entries, and so on. Google's engineers are much smarter than I am; no doubt they could figure out a way to do it. This is important because it seems that sites subject to regular changes have been hit heavily by this update, without reference to whether those changes are simply improvements to content or layout. The current situation, where adjustments seem to be applied with all the precision of carpet bombing, is doing nothing to endear Google to the wider web community.

I suggest that Google develop an openly available feature where the issues which most affect ranking for each site would be displayed on a graph.
The graph could show in a simple layout the positive and negative factors influencing rank. This would bring the whole issue of what is or is not affecting rank out into the open, and would serve as a way of undermining sellers of Black Hat SEO. Such a move would also help to resolve the endless speculation in the webmaster community about what does or does not influence ranking.

Now, I am aware that Google's algorithms use massive quantities of data and calculations to produce SERPs. I'm not suggesting that every single influencing factor be included in a graph. However, the principal issues could be graphed, such as:

- Quantity of inbound links
- Quality of inbound links
- Separate category for Google's measurement of non-natural links
- Quality of outgoing links (any bad neighbourhood links?)
- Degree of content 'uniqueness'
- Quality of layout (use of headings, alt text for images, etc.)
- Age of site

...and so on. Each factor could be shown as influencing ranking in a positive or negative fashion so that webmasters have some idea what to fix (see the rough sketch at the end of this post).

The concept of 'Trust Rank' has been mentioned in a few places around the web: an overall assessment of how 'dependable' a site is from a quality point of view. I don't know if Google use this or not. The Google Graph idea is, if you like, a sort of visual representation of trust. Unlike the visible PageRank indicator on the Google toolbar, the Google Graph would show the factors influencing the result rather than just the result itself. I've looked into the 'Trust Rank' issue a little, and it is suggested that a site can improve its ranking by being approved by certain third-party agencies (e.g. TRUSTe or Verisign for commerce sites, and something like ValidatedSite.com for content sites). I'd gladly pay for the relevant assessment, but I just don't know if it would actually make any difference.

In the various discussion forums at the moment there are many cases where sites with similar characteristics are being treated completely differently. This inconsistency is adding to the frustration of webmasters. My opinion as a webmaster is that a public display of ranking factors is the only way to resolve the many uncertainties, and to restore the confidence of webmasters who do not use underhand tactics to gain SERP position. White Hat webmasters are getting a raw deal at the moment. It's time to put it right.
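To make the proposal a little more concrete, here is a rough sketch of what such a per-site feedback report might look like if it were expressed as data and rendered as a simple text "graph". Everything in it is my own illustration: the factor names are taken from the list above, but the -10 to +10 scoring scale, the RankingFactor structure, the render_report() helper and the example values are all hypothetical assumptions, not anything Google has published or confirmed.

# Hypothetical sketch of the proposed "Google Graph" feedback report.
# The -10..+10 scale and all example values are illustrative assumptions only;
# Google provides no such data or API.

from dataclasses import dataclass

@dataclass
class RankingFactor:
    name: str
    score: int  # negative = hurting rank, positive = helping rank (assumed scale: -10..+10)

def render_report(site: str, factors: list[RankingFactor]) -> str:
    """Render the factors as a simple text 'graph' a webmaster could read at a glance."""
    lines = [f"Ranking feedback for {site}"]
    for f in factors:
        bar = ("+" if f.score >= 0 else "-") * abs(f.score)
        lines.append(f"{f.name:<38} {f.score:+3d} {bar}")
    return "\n".join(lines)

# Invented example: a site with decent content but some detected non-natural links.
example = [
    RankingFactor("Quantity of inbound links", 4),
    RankingFactor("Quality of inbound links", 2),
    RankingFactor("Non-natural links detected", -6),
    RankingFactor("Quality of outgoing links", 1),
    RankingFactor("Content uniqueness", 5),
    RankingFactor("Layout quality (headings, alt text)", 3),
    RankingFactor("Age of site", 1),
]

if __name__ == "__main__":
    print(render_report("example-widget-site.com", example))

Even something as crude as this would tell a webmaster which factor is dragging the site down, which is exactly the feedback that is missing at the moment.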