Sorry, I don't totally agree. They are pages with some duplicate content. You need to change the title, description and tags to get them out of supplemental results. The more individual pages with different content, the better.
They aren't just pages with duplicate content - there are other factors too, and it's not as simple as changing the title, description and tags to get them out. If it were that easy, most of us wouldn't be sitting in the supplemental index.
Do a search for supplemental results in the forum and you will get to know what it is. No need to wait for others to reply here.
By designing for it from day one, supplemental results can be avoided. As Google goes through our site they are getting fewer and fewer. As I have stated, supplemental results can trigger Google's duplicate filter, and this will cause you issues.
The problem I have is that my vBulletin forum has numerous pages (thousands) that are supplemental. Is there a tool that allows you to create unique page titles, descriptions and tags for each dynamic page, post or thread that loads? I have some capability from within the forum admincp, but I have noticed that doing it from there doesn't make much of a difference. If I'm doing something wrong, or if there is an easier way, please advise. Thanks, TheCrescent
Turfsniffer is correct. Supplemental changed radically after Big Daddy and all the subsequent Big Daddy patches. It is definitely no longer about duplicate content.
I agree with both of you. A site:www.mattcutts.com search shows tons of supplemental results. He writes his own content (as far as we know), the pages are unique due to the amount of comments he gets, and the site has a bunch of PageRank.
Supplemental links are like second-class links: for a given search, supplemental links are shown after the other links.
Absolutely. I have gone through and UNIQUELY generated meta tags, titles, a lot of content, etc. I have been sitting in supplemental since Big Daddy. It appears to be totally different now. In a lot of cases, the pages with the most duplicate content relative to other pages on my site are the ones that are NOT supplemental. Try to figure that out. I think it means you don't have enough incoming links to support all your pages being indexed. But that's just a guess.
I can cite multiple examples, including search results for my own sites, where supplemental pages show up in the top 10 for terms that yield thousands of non-supplemental pages AFTER the supplemental ones.
There are some bad replies on this thread, and it's no wonder that people new to SEO get confused. Turf and Minstrel are right, but I'd like to add that looking at site: in Google will show supplemental results for nearly every website you can think of. When you search in Google for a page that is supplemental, it actually appears in the search results and isn't highlighted as supplemental, so there's obviously something wrong with the site: command right now - this seemed to get worse around the 17th of August.
There's been something wrong with the site: query since Big Daddy - one of the few things that Google has actually acknowledged.
Don't forget you must look at duplicate URLs and not just the duplicate on-page content. There are different variations of duplicate URLs, and they can be very easily missed, especially if you have a large site that has been built on a CMS like a forum or a blog. Different variations of dupe URLs can be http vs https, www. vs non-www., .com vs .com/index.htm, and /dir/ vs /dir (without the trailing slash). There are other variations leading to the same pages, but those above are the main ones to watch out for.

If you haven't checked dupe URLs, then I would suggest downloading GSiteCrawler and letting the bot crawl your site. This will find your dupe URLs, or what the bot sees as duplicate content. I also suggest using Xenu's Link Sleuth, which basically does the same trick. Those two programs will also help find any bad links on your site that you might have missed. (There's a rough sketch of the kind of URL-variant check I mean at the end of this post.)

John, you couldn't have stated a better example than Matt Cutts' site - as you said, he writes his own content, as far as we know. First, Matt's blog has plenty of incoming links, and I would think some really valuable ones, so I doubt backlinks are even an issue here. Now, WordPress seems to be an issue for spewing out dupe URLs - a URL with a trailing slash and without. Take a URL from his supplemental results: www.mattcutts.com/blog/more-info-on-updates < notice this URL has no PageRank and shows as supplemental. Now try: www.mattcutts.com/blog/more-info-on-updates/ < it has PageRank. I am not aware if WordPress has a plugin to fix the trailing slash, but it would be useful if there was... I personally have gone looking for it.

So what about a normal HTML site that has been hit with supplemental? If you have used the two tools mentioned above and everything is A-OK, and you have also cleaned up your meta and page titles, then to be honest there isn't much you can do at this point. Google is remaining very tight-lipped about this, either because they don't know the answers or because they want to keep it a trade secret in their crusade against spam.

Now the issue with Big Daddy, as minstrel mentioned, is still ongoing, and Google have given clues that it will be ongoing until the end of September, so we are all going to see our sites jump in and out of the index as if they were performing in a circus act. If you have followed the above steps on the dupes etc., I advise not touching your site anymore, as it may not be at fault, and as the saying goes, "if it's not broke, don't fix it". I personally think that we will only know if our sites are totally screwed up after the end of September, and if they are still dropped down the SERPs after September then it's time to hit the panic button and run for the hills. Time for a beer.

*edit* Darren/minstrel, I am only seeing your two posts - yes, the site: command is totally banjaxed, but I am not sure how reliable the site: command on gfe-eh.google.com is - are you?
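Here is the rough sketch I mentioned above of the kind of URL-variant check I mean. It's just an illustration in Python with made-up example URLs; a real crawler like GSiteCrawler obviously does far more, and the canonical rules (http vs https, www vs non-www) are whatever you decide on for your own site:

# Rough sketch: collapse URL variants to one canonical form so duplicates show up.
# Example URLs and the chosen canonical rules below are made up - adjust to your site.
from urllib.parse import urlsplit, urlunsplit

def canonicalise(url):
    scheme, host, path, query, _ = urlsplit(url)
    scheme = "http"                                   # pick one protocol and stick to it
    host = host.lower()
    if not host.startswith("www."):                   # pick www. vs non-www.
        host = "www." + host
    if path.endswith("/index.htm") or path.endswith("/index.html"):
        path = path[:path.rfind("/") + 1]             # /dir/index.htm -> /dir/
    if not path.endswith("/"):                        # /dir vs /dir/ (trailing slash)
        path = path + "/"
    return urlunsplit((scheme, host, path, query, ""))

urls = [
    "http://example.com/blog/more-info",
    "https://www.example.com/blog/more-info/",
    "http://www.example.com/blog/index.htm",
]
seen = {}
for u in urls:
    seen.setdefault(canonicalise(u), []).append(u)

for canonical, variants in seen.items():
    if len(variants) > 1:
        print(canonical, "has duplicate variants:", variants)

Running something like that over a full crawl list only shows you which URLs Google could be treating as two or three different pages; the actual fix is still a 301 redirect to whichever version you pick as canonical.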
I'm not sure if that is true. I have been using a certain method that has proved flawless for me so far. I try to tell people this all the time, but no one seems to believe me, so here is an example of how to stay away from duplicate content and supplemental results. Have a look at my indexed pages here: if you go back a few pages, you will notice that for the most part the titles are the same; however, just by adding a number at the end of each title - not only in the title tag, but also in the alt tag, the h1 headers and the page names - it avoids the dupe penalty and has kept these pages away from the supplemental index. Believe me, this works. The same rings true for this site, a directory that I am still working on.
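If it helps, something like this is all I mean - a rough Python sketch of stamping a number into the title, h1, image alt text and filename of otherwise very similar pages. The names and text below are made-up examples, not my actual pages:

# Rough sketch: give near-identical pages a distinguishing number in the title,
# h1, image alt text and page name. All names and text here are invented examples.
TEMPLATE = """<html>
<head><title>{title} {n}</title></head>
<body>
<h1>{title} {n}</h1>
<img src="photo-{n}.jpg" alt="{title} {n}">
<p>{body}</p>
</body>
</html>"""

def write_pages(title, body, count):
    for n in range(1, count + 1):
        filename = "page-{0}.html".format(n)    # the number goes in the page name too
        with open(filename, "w") as f:
            f.write(TEMPLATE.format(title=title, body=body, n=n))

write_pages("Widget Gallery", "Page copy that is largely the same on each page.", 5)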