Is it possible that Google runs two different algos? I have been noticing for a long time that things vary to a large degree between data centers. So much so that the differences do not look like a lack of current data, or an older version of an algo. I am wondering if Google runs two differing algos on various data centers just to mess with SEOs and keep them on their toes? This would certainly explain the odd instances of an entire site being supplemental vs. another not. Thoughts?
My belief is getting stronger day by day that Google is now planning to kill the over-SEO thing. And all these wonky results are part of its strategy to shake up the minds of SEO ppl. Google is becoming more conventional and adopting the preferences of offline life.
My belief is that Google needs to stop thinking "offline" and start thinking "online". I also believe that if Google had real competition we would not see a Google 9/11 with sites falling, since Google would need to figure out how to keep its index clean without wiping out clean, white-hat sites. Google can go into the offline ad world, but it should keep thinking "online ads" for anything to do with the search engine.
It does make sense. Mong, yes, I've heard SEOs complain that Google develops new algorithms to fight SEO.
The biggest problem now is the collateral damage, i.e. all the legit sites getting killed in the wake. What a PITA.
I think Google runs at least two different algos. One seems to weight on-site characteristics more heavily; the other is more about off-site factors (inbound links). I think it's pretty smart of them to do it. The results suck sometimes, but it makes it harder to pin down exactly what to do to rank high for everything. Once your site reaches a certain age and a certain number of (good) links, ranking high in Google is a piece of cake, so they had to do something.