I recently noticed that Google has dropped many of my pages. I wonder if this is related to the supplemental problem you are discussing. My site is still indexed, but as of the last update my indexed pages have dropped from nearly 60,000 (via the site: command) to around 400. The API previously reported around 4,000 pages and now reports 350. This has really screwed up my co-op weight too.
I'm guessing that the duplicate content filter has gotten much tighter, though I'm not so sure it is actually working as intended. The drop in the number of indexed pages is probably a result of that. What I'm wondering about now is whether Google will make any change to its algo any time soon. In other words, are the dropped pages and sites gone forever, or will they come back any time soon?
I keep checking site:mydomain and I still have a ton of supplemental results. It's very frustrating, and since this is an ecommerce site, very expensive. The first 4 pages of site:mydomain seem to be fine; after that, nearly every listing is supplemental. I haven't started making any changes yet because the pages ranked really well prior to BD. How long do we wait? I did email GoogleGuy about it but have not heard anything.
It looks like my SERPs are coming back, but I still have a very small number of indexed pages on some DCs.
I'm trying to be patient, but the longer it takes, the more money it costs us. It's painful. Google is crawling the site. A Google sitemap has been submitted and is checked and updated daily. There is also a regular sitemap to help Google. Yet there are still way too many supplemental listings, and many pages that were indexed before BD or Jagger2 (or whatever you want to call it) are not indexed at all.
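For anyone else still setting one up, a minimal Google Sitemaps file (the 0.84 schema Google accepted) looks roughly like this; the URL, date, frequency, and priority values below are just placeholders to adapt:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
  <url>
    <loc>http://www.example.com/category/title.php</loc>
    <lastmod>2006-03-01</lastmod>
    <changefreq>daily</changefreq>
    <priority>0.8</priority>
  </url>
  <!-- one <url> entry per page you want crawled -->
</urlset>
```

Only `<loc>` is required per entry; the rest are hints Google may or may not use.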
Wow... I am surprised that some still see the supp pages. I have not only recovered but actually gained more indexed pages in the process over the past few days. Hopefully things get better for all.
I think some things are starting to change. I have been doing site:mydomain and checking it daily. Usually about the 5th page I start seeing supplemental results. Today I didn't see any until the 6th page, and it only showed a few supplemental results. Now the other strange thing: when I click the supplemental cache, it shows an error. It says "your search xyxyxyxyxyxyxyx:domainname did not match any documents." Oh, and the cache date is from Aug 2005. Click it a few more times and the cached page shows up. Something must be happening. I hope.
I have a large dynamically generated site that went 99% supplemental. And stayed there. It still ranks well for a lot of terms, probably because all the competition uses the same site script. lol
I've got a couple of smallish sites that are now 99% supplemental, i.e. every page except the homepage. I tried emailing them through the address GoogleGuy supplied, but got a message saying that it is no longer monitored. So where can you get in touch with them?
I would try a reinclusion request at http://www.google.com/support/bin/request.py
1) Check "I'm a webmaster inquiring about my website"
2) Check "Why my site disappeared from the search results or dropped in ranking"
3) Use the subject "Reinclusion Request - StillinSupplemental" and clearly explain what you know in the message area
4) Once they send you an automated response, REPLY to it with the same clear message
For the last few days I have been stuck at ~237 supplemental results (out of an original ~70,000 results). Even in Local, a lot of listings have disappeared to the back (that's if they were still there...). Last cache is from 21 Feb 2006. As for traffic for the particular site: wish I would get any from G*
I dunno if it is a problem with being penalized. The site in question used to be on a really bad CMS that was practically unspiderable. I changed the whole site to use static URLs with the following structure: http://www.example.com/category/title.php. Googlebot has now spidered the entire new site structure. All the old dynamic page addresses now return a 404 as they have been removed, but these still show up in site:www.example.com as Supplemental, while none of the new pages have been added to the index even after a few weeks. I have now submitted a sitemap for the new pages, but the Supplemental pages are still there. Any suggestions would be really appreciated. If you want the URL to compare and contrast, let me know in a PM.
I've dropped massively, and have tons of supp results. But my content is all pretty similar: pages generated onto a template. Is it worth emailing GG? Some of my pages probably should be supplemental. =) Anyone else email him and get a response?
All the Supplementals from my site are now 404's, but they used to be dynamic template-driven pages with little unique content. The new pages which have been spidered but not included in the index are also template-driven, but instead have mod_rewritten URLs. It might be worth hanging around on his blog (www.mattcutts.com/blog) or on one of the forums he posts on regularly, such as forums.searchenginewatch.com or www.webmasterworld.com.
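Since mod_rewrite is already in play, one thing worth trying rather than letting the old URLs 404: 301-redirect each old dynamic address to its new static equivalent, so Google treats the Supplemental entries as moved instead of dead. A rough sketch, assuming Apache, with a made-up script name and query parameters that you'd need to match to the old CMS:

```apache
RewriteEngine On

# 301 an old dynamic URL (e.g. /index.php?cat=12&item=345)
# to its new static location; the trailing "?" on the target
# strips the old query string from the redirect.
RewriteCond %{QUERY_STRING} ^cat=12&item=345$
RewriteRule ^index\.php$ /category/title.php? [R=301,L]
```

With many old URLs you would generate one rule pair per page (or use a RewriteMap), but the pattern is the same: match the old query string, 301 to the new path.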
Someone dropped a comment on Matt's blog mentioning this datacenter: http://64.233.185.104/ Seems that supplemental results are mostly gone there. When I do a site:www.example.com query, no supps appear, but if I add a keyword before (i.e. "keyword site:www.example.com"), then the supps are back.
I hope that is not the final solution. The "site:mydomain.com" count is now even worse. Even some sites of mine not hit by supplemental issues are down several hundred pages. Christoph
Yep ... there's a very distinct difference on that DC. For example, site:www.example.com returns absolutely no Supplemental pages, which is interesting. Given the amount of spidering going on right now I would guess that this issue is being worked on by somebody at the Plex. If you have a site that has now gone mostly Supplemental, are you still getting spidered?
I have had several sites deleted from their index with all these crappy updates that have been going on for months! To me it seems that Google has screwed up what they had and doesn't know how to get back to where they were before BigDaddy. Submitting sitemaps? Doesn't help, and might have been one of the reasons for all the mixup. Emailing Google about re-inclusion? It's like spitting in the wind..... To me Google is a giant that is about to fall!! Thank God for MSN!!
Oh yeah, like 40-60k pages per day. I hope that's a good omen; at one time I had 900k pages indexed. Yesterday I had about 20k. The API went from 60k to 1k. That is insane, and sucks greatly. Edit: Some of my pages deserve it. *I* think they are all useful, but I could see how Google might disagree and want a lot of them supplemental, and that's cool; I just wish at least a lot of them were indexed.
I understand your frustration, MikeSwede. But traffic is traffic, and we are the slaves, not the masters. Given this, reinclusion does work if you're persistent and there was an actual mistake (either on your part, with spam/hidden text, etc., or on theirs). I recommend that you and anyone else experiencing difficulties fill out a reinclusion request. Once you get the automated email, reply to that. Then after 7 calendar days, reply again, and again, every 7 days. It will work if you're persistent. I realize this is neither ideal nor enjoyable, but it should work. At least it did for me.