Yep, that pretty much sums the whole thing up nicely. Like I said, it is up to Google to change what they like. They can do whatever they want with their own engine - it belongs to them. Build a good site and get traffic from a variety of sources so that you minimize the effect of these updates. Not trying to be anti-Google - just saying what I see. You're entitled to an opinion just as much as I am.
Were I a betting girl I would put my money on a multi-stage process:
Step 1 - Clear the SERP spam by backing the index up to the palmy days of 2000, when SEO and SE spam barely existed.
Step 2 - Using a relatively sophisticated set of filters, let newer sites into the index. (As part of this, pay attention to anchor text, natural links, non-reciprocal links.) Keep the rest of the sites in a modified sandbox at, say, less than top 100.
Step 3 - As the newer sites are released, watch for obvious spam red flags - hidden text, blogspot addresses, link surges - and kick flagged sites back into the sandbox. Essentially Google is attempting to sort the net and, in the process, get a handle on spam.
Step 4 - Once the index is cleaned up, re-order first the top 100 and then the top 1000 sites for particular keywords, weighing freshness.
Or, hey, it could be that Google just threw in a few random "like old sites" variables... but I doubt it.
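Just to make that speculation concrete, here is a toy sketch of how those steps might fit together. Everything in it - the field names, the thresholds, the spam flags - is invented for illustration; this is my guess at a process, not anything Google has published.

```python
# Toy sketch of the staged re-inclusion process speculated above.
# All field names, thresholds, and flags are made up for illustration.

SPAM_FLAGS = {"hidden_text", "blogspot_address", "link_surge"}

def classify_site(site):
    """Decide whether a site ranks normally or sits in the sandbox."""
    # Step 1/2: older, established sites skip the sandbox entirely.
    if site["first_indexed_year"] <= 2000:
        return "ranked"
    # Step 2: newer sites must pass link-quality filters to leave the sandbox.
    looks_natural = (
        site["anchor_text_variety"] > 0.5       # not every link uses the same anchor text
        and site["reciprocal_link_ratio"] < 0.3  # mostly non-reciprocal links
    )
    if not looks_natural:
        return "sandbox (below top 100)"
    # Step 3: obvious spam red flags send a released site straight back.
    if SPAM_FLAGS & set(site["flags"]):
        return "sandbox (flagged)"
    # Step 4: survivors get re-ordered, with freshness weighed elsewhere.
    return "ranked"

example = {
    "first_indexed_year": 2004,
    "anchor_text_variety": 0.7,
    "reciprocal_link_ratio": 0.1,
    "flags": ["hidden_text"],
}
print(classify_site(example))  # -> "sandbox (flagged)"
```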
I am pretty sure this will be referred to as the "HERE" update in the future. See, it seems that Google wants to get back to its legitimate roots and really value behavior from joe schmo webmaster, not some SEO'd rank-checking freak like so many of us are. So Google is giving huge weight to all links with the anchor text of "Here". "Click Here" is a distant second in terms of power, but "Here" is here to stay. See, it actually shows that you are communicating a link as part of providing information. Like: Now as to how it determines relevancy and things like that, it is supposedly top-secret. If you don't believe me, check this out Here, or Click Here, Here, Here!
That'd be odd. Google changed the way the internet was browsed when it gave weight to "clickable names" such as "miserable failure" and actual words. Going back to "click here" or some such trash is not logical. It's a step back, not forward. Besides, it's not accurate at all, nor is it what is happening.
It says right in the algo patent that the 'freshness' of links is on a decaying timeline. I think this is, at the moment, playing a big factor. That's why you see pages with sitewide footer links doing well. It appears as though generating lots of new links counts, and it doesn't really matter where they're coming from so long as it's family friendly.
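To put a rough number on what a "decaying timeline" could mean, here is a tiny sketch where a link's contribution falls off exponentially with its age, so only a steady stream of new links keeps total weight up. The formula and the half-life are my own guesses for illustration, not anything taken from the patent.

```python
def link_weight(age_in_days, half_life_days=180):
    """Hypothetical freshness weight: a link loses half its value every
    half_life_days. The half-life is a guess, not a figure from the patent."""
    return 0.5 ** (age_in_days / half_life_days)

# A week-old sitewide footer link vs. a link earned two years ago.
print(round(link_weight(7), 3))    # ~0.973 -- a new link still counts almost fully
print(round(link_weight(730), 3))  # ~0.060 -- a 2003-era link has mostly decayed
```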
I asked this question in another thread. Can you post where it says that? I read all over the place that old links are best, but I'm wondering if links work on a curve... they suck for a while, then they are good, then after a while they max out, then they decline again (a bell curve)... so you need to keep acquiring links. Resting on links made in 2003 or 2004 will get you nowhere.
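For what it's worth, here's what that bell-curve idea could look like as a formula. The shape (a Gaussian around a peak age) and every number in it are purely my guesses; the point is only that a link would be weak when new, strongest after some months, and fading afterwards, which would indeed force you to keep acquiring links.

```python
import math

def link_weight(age_in_days, peak_days=270, spread_days=150):
    """Hypothetical bell-curve value of a link: weak when brand new, best around
    peak_days, declining afterwards. All numbers are illustrative guesses."""
    return math.exp(-((age_in_days - peak_days) ** 2) / (2 * spread_days ** 2))

for age in (30, 270, 730):
    print(age, round(link_weight(age), 2))
# 30  -> ~0.28  (too new to count for much)
# 270 -> 1.0    (in its prime)
# 730 -> ~0.01  (a 2003 link resting on its laurels)
```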
The rate at which your site acquires links will be king. If you think about it, it's one of the hardest things to fake. A good site will continually increase in links naturally. A poor/irrelevant site with an aggressive SEO will try to get links from everywhere, but with just one person behind the wheel, the rate at which that one person can get links to his/her site will always be about the same. The rate at which a good quality site gets links should increase over time as more and more people find it relevant and link to it. I think this is suggested by the data we have all seen, where sites with a lot of backlinks just don't seem to be doing as well. As others have commented, "freshness" matters. And I believe this poorly described "freshness" has to do with a site's ability to generate an increasing rate of backlinks over time.
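If the acquisition rate really is the signal, a crude way to picture it is to compare month-over-month growth in new backlinks: a genuinely popular site accelerates as more people find it, while one person building links by hand tends to plateau. The monthly counts below are entirely invented toy numbers; only the shape of the two curves is the point.

```python
# Toy illustration of the "rate of link acquisition" idea above.
natural_site = [5, 9, 16, 30, 55, 100]   # new backlinks per month, compounding as word spreads
seo_driven   = [40, 42, 38, 41, 39, 40]  # one person building links at a roughly constant pace

def growth_rates(monthly_new_links):
    """Month-over-month growth factor in the rate of new links."""
    return [round(b / a, 2) for a, b in zip(monthly_new_links, monthly_new_links[1:])]

print(growth_rates(natural_site))  # consistently > 1: the rate itself keeps increasing
print(growth_rates(seo_driven))    # hovers around 1: hard for one person to fake acceleration
```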