I've been using a tool that checks link popularity, and it reported:

Pages in Google Supplemental Index: 89
Pages in Google Main Index: 0

I don't understand what the Google Supplemental Index means?
It's the number of indexed pages from your site that are 'supplemental results' -> http://www.seobook.com/archives/001545.shtml
A few pitfalls to avoid:

- the same <title> on multiple pages
- the same <meta keywords> on multiple pages
- the same <meta description> on multiple pages
- not enough unique copy across pages

This is nothing new or major, but it still surprises me how many people do this. Get a few deep links too: find some deep-linking directories and put a few links to various pages within the site.
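As a sketch of what "different on every page" means in practice, each page's <head> carries its own tags; the page names and wording below are invented purely for illustration:

```html
<!-- about.html: title and metas written for this page only -->
<title>About Acme Widgets - Our History</title>
<meta name="description" content="How Acme Widgets grew from a garage workshop into a widget maker.">
<meta name="keywords" content="acme widgets, company history">

<!-- contact.html: not a copy of the above with 2-3 words swapped -->
<title>Contact Acme Widgets - Phone, Email and Address</title>
<meta name="description" content="Ways to reach the Acme Widgets support and sales teams.">
<meta name="keywords" content="acme widgets contact, support">
```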
When you say not the same meta description, for example: will it count as the same if you only change 2-3 words? Like if only the name changes.
That's just something you saw someone post on a forum... I tested it for myself, and changing titles and metas does not get your pages out of supplemental. Deep links are the only way to do it.
Start working on getting backlinks for the site, and spend less time worrying about a link popularity tool. You don't have to worry about your meta tags at all; Google wants to see that your site has relevance on the internet, and the only way to achieve that is to get more backlinks.
The main criterion for supplemental results is the page's PageRank. You must build backlinks to stay out of the supplemental results.
I agree, and I'm doing it at the moment: changing to unique content and meta tags doesn't get you out! Try posting articles and directory submissions with deep links to any supplemental pages you have!
Absolutely, deep links will help get you out, but remember that the same titles or meta tags and not enough unique content across several pages will help put you in there in the first place.

A link is an indicator to Google of a page's quality, and can therefore help pop a page out of Supplemental. But if the pages have content that's too similar, it's possible for another page to drop into Supplemental instead, as Google now thinks, "oh, this one with the link is the good content, let's put this other page that looks pretty much the same into Supplemental instead!" So get a deep link, AND make sure you have unique content / titles / meta tags.

If a site's pages tend towards Supplemental, it's probably time to rethink the navigation and internal interlinking, as being in the Supplemental index is an indicator that your pages don't have enough PageRank to share around.

"Why I Love the Google's Supplemental Index"
http://www.seobook.com/archives/002201.shtml
Matt Cutts
http://www.mattcutts.com/blog/google-hell/
SEOMoz - Whiteboard Friday - "Supplementary My Dear Watson"
http://www.seomoz.org/blog/whiteboard-friday-supplementary-my-dear-watson
No, none of that is true. Duplicate content has absolutely nothing to do with the supplemental index. -Michael
Google has recently turned off the query used to check for Supplemental pages:

site:www.site.com *** -lkjpoi

Unfortunately that doesn't work right now. Hopefully Google will add the functionality to Webmaster Central instead. You can still see which pages are NOT Supplemental like so:

site:www.site.com/*
As it happens, I never said anything about "duplicate content". Your pages in the Supplemental index are those that Google has some issue with. That issue could be that a page isn't being passed enough internal link love, hence my further comment suggesting a rethink of the navigation and internal interlinking. But too much similar content in titles and meta tags can also give Google a problem: it doesn't know which is the correct page to show for a search, so it selects one of the options, and the other can become Supplemental. Furthermore, Aaron Wall of SEOBook says:
Okay, I was being pedantic. I believe several factors are at play, the strongest being any external links in, then internal PageRank distribution, and then the uniqueness of each page. With all else being equal (no external links), why would some pages on a blog go supplemental and others not, when the pages get the same internal link love from the home page and archive sections? Also, similar content is a problem for search engines trying to decide which page to serve, and shoving some pages off to the Supplemental index seems too neat a solution for Google not to use, if you ask me!
Be that as it may, it is not the case. It doesn't matter how much sense it makes. You strongly believe it because it has been reinforced in your head time and time again. However, it has always been a misconception, and one that Google themselves confirmed recently. No two pages have exactly the same link strength; multiple factors go into it, and it can actually vary from one day to the next. Google has a perfectly fine means of dealing with duplicate content: a duplicate content, or similar pages, filter... but it is applied at search time. It has to do with whether the &filter=0 parameter exists in the query string or not. -Michael
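For anyone who wants to see that search-time filter in action: appending the parameter to an ordinary results URL turns the similar-pages filtering off. The search term here is just a placeholder:

```
http://www.google.com/search?q=example+term&filter=0
```

With filter=0 you'll see results that would otherwise be collapsed or omitted as "very similar" to ones already shown.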
I totally agree with liamvictor that getting into supplemental is based purely on duplicate text, titles, keywords, etc. Links have nothing to do with going supplemental. Links, however, MIGHT help you get out of supplemental after fixing up the errors. I reiterate: backlinks have nothing to do with going supplemental. Oh gosh.
Why do you continue to spout stuff you have no clue about? I'm not trying to be a dick here, but you keep doing it... Direct quote from Google: http://googlewebmastercentral.blogspot.com/2007/06/duplicate-content-summit-at-smx.html The only way it's even a problem is if you have too many pages, leaving you with lower PageRank on each page; but in that case it doesn't matter whether they are exact duplicates or completely different. Please stop giving bad advice and spreading misinformation. Oh gosh. -Michael