What are people with terrible sites who are still selling links doing to get around Google Panda, and conversely, what does Google Panda do to tackle awful websites?
The main aim of Google Panda is to remove sites, or bring down the rankings of sites, that have awful links and are doing nothing to improve them.
Google is now looking at how sticky a site is, which I think is a very clever way of getting rid of the trash. If people click in and leave quickly, the site is not relevant. Google is also looking at what people do on the site, e.g. Likes and shares, which gives it an idea of the visitor's experience on the site.
It is making SEO tougher, but I still believe in sticking to the basics: quality backlinks, good keywords and keyword phrases, and quality, regularly updated content.
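To illustrate the "stickiness" idea above, here is a minimal sketch of how you might compute a bounce rate and average time on site from your own analytics logs. This is only an illustration of the concept, not Google's actual method; the session fields and the 10-second threshold are assumptions made up for the example.

```python
# Rough sketch of "stickiness" signals, computed from hypothetical session
# records. Field names and the 10-second bounce threshold are assumptions
# for illustration, not anything Google has published.
from dataclasses import dataclass

@dataclass
class Session:
    pages_viewed: int       # pages the visitor opened in this session
    seconds_on_site: float  # total time before leaving

def stickiness_report(sessions):
    """Return bounce rate and average time on site for a list of sessions."""
    if not sessions:
        return {"bounce_rate": 0.0, "avg_time_on_site": 0.0}
    bounces = sum(1 for s in sessions
                  if s.pages_viewed <= 1 and s.seconds_on_site < 10)
    return {
        "bounce_rate": bounces / len(sessions),
        "avg_time_on_site": sum(s.seconds_on_site for s in sessions) / len(sessions),
    }

sessions = [Session(1, 4.0), Session(5, 220.0), Session(2, 95.0)]
print(stickiness_report(sessions))  # a high bounce rate suggests a "non-sticky" site
```

A high bounce rate on its own does not prove a page is bad, but tracking it over time is a reasonable way to spot the pages visitors find least useful.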
Panda is designed to prevent spam. Google wants to remove duplicate content, and that's why it made the Panda update.
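As a rough picture of what "duplicate content" detection can look like, here is a simple shingle-based similarity check between two pages. This is a generic near-duplicate technique (word shingles plus Jaccard similarity), not Google's actual implementation, which is not public; the shingle size and the 0.8 threshold are arbitrary assumptions.

```python
# Minimal sketch of near-duplicate detection using word shingles and
# Jaccard similarity. Illustration only; Panda's real duplicate-content
# handling is not public. Shingle size k=3 and the 0.8 threshold are
# arbitrary choices for this example.
def shingles(text, k=3):
    words = text.lower().split()
    return {tuple(words[i:i + k]) for i in range(max(len(words) - k + 1, 1))}

def jaccard(a, b):
    sa, sb = shingles(a), shingles(b)
    return len(sa & sb) / len(sa | sb) if sa | sb else 0.0

page_a = "cheap widgets for sale buy cheap widgets online today"
page_b = "cheap widgets for sale buy discount widgets online today"
similarity = jaccard(page_a, page_b)
print(f"similarity = {similarity:.2f}, near-duplicate = {similarity > 0.8}")
```

The same idea can be run across your own pages to find sections of boilerplate or copied text worth rewriting before an algorithm flags them.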
Well, Google Panda is quite efficient at finding shallow websites. Since it's an algorithm, there are ways to fool it, which is why it sometimes misses a few websites. This blog post describes the after-effects of Panda very well: http://www.keywordcountry.com/blog/...nkings-changed-surviving-google-panda-update/ The algorithm simply pushes low-quality websites down.
Google started manually reviewing only the high-PR and high-SERP sites; it's totally impossible to observe and diagnose every site available online. That may be the only reason the worst sites with low PR and poor SERP positions are still doing alright.
Panda is bringing down the PR of the sites that are awful. It is also trying to remove duplicate content.
In the recent Google Panda update on November 9th, many sites suffered due to the algorithm tweaks. These updates are made precisely so that spammy sites, and sites with doubtful-looking content and links, fall back in the SERPs.