I agree - it goes beyond Co-op or Link Vault. This update seems to be related to linking, based on what I am seeing for my sites.
I don't think so. I have 2 sites that have links only from Link Vault, and they are now ranking in the top 40 (one was not in the top 300 before this shakeup; the other was).
Well, who knows with Google? But again, from what I can tell from MY sites, there seems to be a relationship between the number of links and the drop in rankings. I know that not everyone will see this correlation, though. But I think I am leaning towards Caryl's pre-scoring theory.
The above-the-fold/below-the-fold postulation is bordering on ridiculous. Firstly, CSS makes it very difficult to determine where something appears on screen - unless Google renders every page in every common browser and version when it crawls, which I doubt very much. Secondly, people use different monitor resolutions, so what's above the fold for me could be well below the fold for you.
That may be true, but what about using the text surrounding a link to determine relevancy? I don't really think there's any sort of penalty involved here; I think it's more along the lines of the relevancy of where the link is placed in relation to the body of text. A bare list of links to unrelated sites won't carry as much weight as, say, a paragraph about charities with an embedded link and targeted anchor text. Just my .02.
Just out of curiosity, how can you use CSS to make the links appear at the top even though they are at the bottom?
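For what it's worth, here is a minimal sketch of one common way to do it (my own illustration - the class name and URLs are made up): the links sit at the bottom of the HTML source, so a crawler reading source order sees them last, but absolute positioning draws them at the top of the rendered page:

```html
<!-- The nav links come LAST in the source order,
     but CSS pins them to the top of the rendered page. -->
<style>
  body       { position: relative; padding-top: 3em; }
  .top-links { position: absolute; top: 0; left: 0; }
</style>
<p>All of the page copy appears first in the source order.</p>
<div class="top-links">
  <a href="http://example.com/one">One</a>
  <a href="http://example.com/two">Two</a>
</div>
```

The same visual swap can be done with floats or negative margins; the point is simply that source position and screen position are independent.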
Not sure I understand your bottom-line point. Everything you say about CSS is 100% true, and I agree. But are you saying that Google has no way of knowing or determining what is prime space on a monitor? What I am now starting to see makes sense to me, even though my rankings went from #7 to #40. Generally mapping out the real estate on monitors in terms of importance (prime vs. not) actually makes sense, IMO. Information and links that appear in these areas alongside relevant text MAY be deemed more important to users and to Google. I am sure there is more to it. Might this be an added, new way for Google to assign 'the score' to a webpage... the position (location) of links on a page?
The search engines still cannot really tell when a page is using hidden text, and I don't believe they can reliably work out which information is above the fold or in quality real estate. A PowerBook, PC, PDA, phone, WebTV, laptop, tablet PC, or projector; 10", 12", 14", 15", 21", 30", or 40" screens; multi-screen or monochrome displays; browsers, resolutions, screen sizes, background images, CSS, HTML formatting, cloaking, language, geotargeting, and probably a whole load more factors make it difficult to determine what a user is seeing. So, based on that, it wouldn't be wise to implement something based on screen positioning.
Google knows about the heat map. However, if I run at 1280x1024 resolution and you run at 800x600, then the 'fold' is at a vastly different point on my monitor compared with yours.
I see my page returning to its old position. I have been getting visitors from Google since 7:50 p.m. (German time). For the last two days, I had zero visitors from Google. Is anyone here seeing the same?
Sorry to get off topic from the ranking experiment, but it's bound to happen in a thread this long, so I'll keep it quick. It's not about links above the fold, below the fold, screen resolution, or anything in that arena. We know that surrounding text matters. We know MSN (I think) did a whole bunch of research on block-level analysis to pick out what navigation looks like, what footers look like, and what content looks like, regardless of table or CSS manipulation. Five outbound links lined up next to each other are pretty easy to spot. As far as I know, content higher up in the code gets scored higher than content lower in the code, but for scoring link importance, I've got to believe the engines are using much more sophisticated methods.
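To make the "five links lined up next to each other are easy to spot" point concrete, here is a hypothetical sketch of one such heuristic. This is my own illustration, not any engine's actual method; the class name, threshold, and example markup are all invented. It flags runs of adjacent outbound links with no real text between them:

```python
# Hypothetical heuristic: flag runs of adjacent links with no
# intervening text - the kind of bare "link list" block the post
# says is easy for an engine to spot. All names here are invented.
from html.parser import HTMLParser

class LinkRunDetector(HTMLParser):
    def __init__(self, min_run=5):
        super().__init__()
        self.min_run = min_run
        self.current_run = 0
        self.in_anchor = False
        self.runs = []          # lengths of each detected link run

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_anchor = True
            self.current_run += 1

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_anchor = False

    def handle_data(self, data):
        # Substantial text BETWEEN links breaks the run;
        # anchor text inside a link does not.
        if not self.in_anchor and data.strip():
            self._close_run()

    def _close_run(self):
        if self.current_run >= self.min_run:
            self.runs.append(self.current_run)
        self.current_run = 0

    def close(self):
        super().close()
        self._close_run()

page = "<p>Charity news and analysis.</p>" + "".join(
    f'<a href="http://site{i}.example/">site {i}</a>' for i in range(6)
)
detector = LinkRunDetector()
detector.feed(page)
detector.close()
print(detector.runs)  # one run of 6 adjacent links -> [6]
```

A link embedded mid-paragraph never accumulates a run, which lines up with the earlier point that a link surrounded by relevant text looks different from a raw list of links.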
Absolutely, that's the theory I fully support for the latest Google update! You may wish to read this. In fact, a while ago a person posted an in-depth opinion on how Google can read JavaScript (old news, I know); I wish I could find the post. Most of the feedback from other DPers was very negative... no WAY, are you nuts, etc. He went on to say CSS is also being considered by Google engineers. I think it would be silly to assume that CSS is immune to such analysis, IMO. To me, this is the most clever update I have seen from Google; it really seems to leave the entire SEO community up in arms. Many people are suggesting that we should sit tight and watch; personally, I am more proactive and am already taking steps to (hopefully) regain my good SE positioning.
These updates are always the same. People try to figure out what Google likes now and quickly change their sites even before the entire update is rolled out (it is going to come in a few waves). Any short-term changes you make now probably won't be seen for weeks, and you won't even be 100% sure your changes caused your rise in the SERPs, because there are one or two more waves of updates coming in the next few weeks. Honestly, I'd really sit tight and not make any on-page modifications. I also don't 100% agree with the post you referenced at SEOChat, especially his comment about building 5-10 links a week to prevent sandbox penalties. I do agree, however, with his theory on links: have them embedded in an article or something along those lines for more "worth", if you will.
I was just wondering how this test was going, so I looked it up. For the keyword 'charity', this site is now in the 80 to 90 range. It fell below 800 during Jagger, so it's up from then, but not as high as I would have expected pre-Jagger. There are sites with PR 0 and no backlinks (as reported by Google) that rank higher (example: http://www.apple.com/hotnews/articles/2005/12/charitychecks/ is currently ranked 39). I just don't know what Google is doing anymore. Things seem very inconsistent.
Backlinks and PR are only updated periodically, so they won't show for new results, and PR is not a measure of a site's ranking or rankability.