Hi, I have been thinking of ways to minimize the supplemental results on my sites. If I add one or more RSS feeds, each configured to pull in a different news feed, would that by any chance do the trick? Nik
I found all my feeds were going supplemental and was advised to use a robots.txt file to exclude the feeds and reduce the supplemental pages.
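In case it helps, the rule I was given was roughly this (assuming the feeds live under a /feeds/ path; adjust the path to wherever your feed URLs actually sit):

    User-agent: *
    Disallow: /feeds/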
Dammit... there has to be some way to overcome this problem. If anyone can help, I promise big green reps.
Do you mean pull the news feeds into your content to make it less "samey" and therefore reduce the duplicate content level? That would probably help. Also make sure you have the basic SEO done: unique titles, meta tags, headings. Since supplemental pages are revisited less often by the robots, you may need to throw them a helping hand or two to get the content updated: add a link in from a well-indexed page, and ideally one from another site.
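By unique titles and meta tags I mean something written per page rather than copied site-wide, along these lines (the wording here is just placeholder text):

    <head>
      <title>Page-specific title - Your Site Name</title>
      <meta name="description" content="A short summary written specifically for this page.">
    </head>
    <body>
      <h1>Page-specific heading</h1>
      ...
    </body>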
They can be supplemental results because the main criterion for supplemental results is PR. If you don't have strong backlinks, every page on your site can go supplemental.
trichnosis is right. Think of supplemental results as pages that Google has found but can't give a high enough score to bother recrawling frequently, or even presenting in the main SERPs. Getting a link in is an indicator of quality to Google and will help. If links are proving hard to find, consider reworking the site-wide links (designer's credits, blogroll, etc.) on your site to reduce the outgoing PR. Making each page more unique within the site (as mentioned above) will also help.
My site is PR5 and has only 100 inner pages, no more than two levels deep. Most of those pages have PR. I am thinking of adding an RSS feed block to each of those pages. In addition I will work on the meta descriptions, keywords, and titles, and place some H1 headings as well. I just want to kick out the supplementals, cuz they're hurting my traffic.
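For the feed block, here is a rough sketch of what I have in mind (Python, just to illustrate the idea; the feed URL is a placeholder, and in practice I'd cache the output rather than fetch the feed on every page view):

    import html
    import urllib.request
    import xml.etree.ElementTree as ET

    def fetch_rss_items(feed_url, limit=5):
        # Pull an RSS 2.0 feed and return (title, link) pairs for the first few items.
        with urllib.request.urlopen(feed_url, timeout=10) as resp:
            root = ET.parse(resp).getroot()
        items = []
        for item in root.findall("channel/item")[:limit]:
            title = item.findtext("title", default="").strip()
            link = item.findtext("link", default="").strip()
            items.append((title, link))
        return items

    def items_to_html(items):
        # Render the items as a simple list to drop into the page body.
        entries = "".join(
            '<li><a href="%s">%s</a></li>' % (html.escape(link), html.escape(title))
            for title, link in items
        )
        return "<ul>%s</ul>" % entries

    # Example with a placeholder feed URL:
    # print(items_to_html(fetch_rss_items("https://example.com/news/rss.xml")))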