I usually build lots of affiliate sites, and they tend to go supplemental and get penalized, etc., and I have never really cared. But recently I actually took some time to make some real sites and write content, and I have to say the supplemental index is total bullshit.

I wrote about 50 pages of unique content and built a pretty good site, IMHO. It got indexed and was getting a little traffic; it's mostly about obscure boating stuff, so there's just niche traffic. Then I added a bunch of pages of maps. Granted, these pages are images of maps, of no real use to SEs, but very useful to people who visit the site.

Google puts all those map pages in the supplemental index. But wait, it doesn't stop there: it drops all the content pages except for about 5. So much for that white hat bullshit about building sites for visitors instead of SEs. What do you have to do, robots.txt everything that might not live up to Google's standards?

I can see dropping the pages of maps, since they have next to no content, but why punish the rest of the site? The maps and info are relevant and make the site a much better site. Is it just me, or is this whole thing a big crock?

Even better, I misspelled "or" in the title... I should just go lie down or something.
I have figured out one way to fight supplemental results: have a unique title and unique meta tags (keywords and description) on every page. If those are unique and the content is also unique, there is no way you should end up in supplemental results. This is my personal experience; thought I would share!
At the very least, the title and meta tags should be unique to each page, and the content should be unique to the world. By "unique content to the world," I mean it should not be an exact copy of some other website's page!
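To show what I mean by "unique to each page," here is a quick sketch of a page head. The title and meta values are made-up examples, not from any real site:

    <head>
      <!-- hypothetical example values; every page on the site gets its own -->
      <title>Tide Charts for Penobscot Bay - Example Boating Site</title>
      <meta name="description" content="Printable tide charts and anchorage notes for Penobscot Bay.">
      <meta name="keywords" content="tide charts, Penobscot Bay, anchorages, boating">
    </head>

The point is that no two pages share the same title or description; the same boilerplate repeated across hundreds of pages seems to be exactly what gets flagged.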
Well, since the pages are maps that are mostly just identified by latitude and longitude, it would be hard to make them very unique. I just disallowed them through robots.txt. Anyone have a guess how long it will take to remove them? I deleted all the supplementals on another site around 3 months ago and they are still all there in the Google index.
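In case anyone wants to do the same, this is roughly what went into robots.txt; /maps/ here is just a stand-in for wherever the map pages actually live:

    User-agent: *
    # hypothetical directory holding the map image pages
    Disallow: /maps/

Keep in mind that disallowing the pages only stops them from being recrawled; as the rest of this thread suggests, the old URLs can sit in the index for months afterward.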
I've seen supplementals on there that are over a year old; sometimes the page is a 404 and it will still be there.
This is a subject close to my heart. I had 'supplemental' sites and got all but one out by using 'normal' methods: unique metas, etc. And for the majority of sites it is true that if you have unique metas, unique content, and enough inbound links, preferably with some deep links, you can avoid supplemental results. However, I strongly believe, based on comments on DP over the last few months and on my own experience (with a site where I wrote all the content myself and which has been in supplemental since new), that there is also a flaw in the Google system somewhere, a crack that somehow puts certain sites into supplemental results incorrectly, regardless of whatever you do.
I don't know about ferret77, but my issue is definitely not page size. Many of my problem pages are 'too big' (several screens of text) and should ideally be split out into several different pages. I haven't bothered, because adding more supplemental pages isn't a very inspiring way to pass the day.
I've had supplemental issues on a number of my sites for quite some time now. On my main site, every single page is 100% unique, with unique titles and metas, and still a good 25% of the pages are going supplemental. Asking for deep links to those individual pages is unreasonable, as we're talking about hundreds and hundreds of affected pages. All of the pages are interlinked through the site itself, and I can go get a few outside deep links, but I don't think that's going to fix the problem. I see competitors with similar link structures, whose only links come from their own sites, and they don't have the supplemental issues that have been plaguing us. It's really a mystery what's going on, but it has killed our traffic for the past couple of months, and it's really affecting our bottom line!
I've seen plenty of sites with 100% unique and original content, unique HTML title tags, unique meta tags, and both inbound and outbound links fall into supplemental results. Personally, I think Google's algorithm frequently screws up and tosses a site into supplemental by mistake. I have a site that started going supplemental in May due to an Amazon store in a subdirectory. I removed the store and blocked the directory using robots.txt, but there are still over 10,000 obsolete URLs in Google's index, all in supplemental. I don't know why Google would want to keep obsolete pages that return a 404 status code in supplemental. I've seen that with quite a few sites.
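One thing that might speed up the purge (just a guess on my part, not something I've verified): instead of letting the removed store URLs return 404, have the server answer 410 Gone, which explicitly says the pages are permanently removed rather than merely missing. Assuming an Apache server, and with /store/ as a stand-in for wherever the old store directory was, one line in .htaccess does it:

    # .htaccess: answer 410 Gone for the deleted store directory (hypothetical /store/ path)
    Redirect gone /store

Whether Google actually drops 410s any faster than 404s is anyone's guess, but it at least removes any ambiguity about the pages coming back.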
That's a tricky question. 90% of supplemental sites can either be mended or belong in supplementals. But for those sites that have 'fallen in the gap', yes, I think a new domain is best. I haven't done that with my own site, because after months of link-building, etc., it's all a bit depressing to move to a new domain. But I should have done it months ago. I did put a rewritten variant of part of the content on a different site a couple of months ago, and that was all indexed fine and is now ranking well.
My pages are in and out; I can't keep up with it. Last week 50 pages in, this week 85, the week before all but 1, all with unique metas and content. Wait and see is my advice. My traffic is really fluctuating.
Yeah, this really blows. I took the time to get all related links for this site, which is a pain in the ass, IMHO. I was trying to measure the difference between X amount of related links and X amount of unrelated ones, to see if it actually made a difference.