'ello everyone! I have been contacted by someone who is seeing a huge and rapid drop in indexed pages. We are talking around 250,000 down to 50,000 pages indexed. They changed server a while ago, and they are wondering if this is a contributing factor. Other factors may be:

* Content - lack of it! Mainly product codes etc., but it has always been this way.
* Two other sites have VERY similar content... but this has always been so.
* Not very many backlinks - again, always been so!

What HAS changed:

* They duplicated their whole site a while ago (argh!) - removed now.
* At some point they were advised to put noindex tags on around 1,000 category pages... Urgh!! I changed that straight away!
* Server - moved to a different host.

It seems obvious that the lack of backlinks, the duplicate content, and the noindex tags may be contributing, but they are asking why the sudden drop-off after a few years of being indexed. I tried to explain that maybe an algo change is 'cleaning house', but there is definitely a marked decrease within the last few weeks. Anyone got any ideas above or beyond what I have mentioned already? Thanks! Mike
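For anyone in a similar spot who wants to double-check that the noindex tags really are gone from those category pages, here is a minimal sketch using only the Python standard library (the helper names are my own invention, not from any SEO tool): it flags any page whose HTML still carries a robots noindex meta tag.

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags a page carrying <meta name="robots" content="noindex, ...">."""
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        # Attribute values can be missing, so fall back to "" before lowering.
        name = (attrs.get("name") or "").lower()
        content = (attrs.get("content") or "").lower()
        if name == "robots" and "noindex" in content:
            self.noindex = True

def has_noindex(html):
    """Return True if the page asks search engines not to index it."""
    parser = NoindexDetector()
    parser.feed(html)
    return parser.noindex
```

Fetch each category page (e.g. with urllib) and run its HTML through `has_noindex`; any page that still returns True is still telling Google to drop it.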
It could just be nothing, and the count will shoot back up in time. I have a site that has roughly 200,000 pages indexed in Google on average. I think it was twice in the last 2 years that it shot down to about 50,000, but after a week or two it went back up. I would give it time and not change anything, to see if it comes back first, before changing stuff and making it worse.
I am with Aaron on this...give it a little time. If it still is a concern, bring it back up. Several of the things you talked about 'could' cause it, but the more probable issue is just normal index flux. Eric
The site is about 5 years old. It's one hell of a drop, and it has been more like a steep slope downwards: from 250k it has fallen steadily every few days, down to about 50k!
Now, did you check the indexed page count on just one data center? I'll bet if you use another data center you'll see the normal number of indexed pages; this is not uncommon. What is Google Webmaster Central reporting about this site? Is there an XML sitemap that is constantly updated? Was there any unnoticed downtime that could have affected the crawler during an update? I'm willing to go with this being a temporary issue; as long as the organic search traffic volume hasn't changed, there isn't anything to worry about.
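On the XML sitemap point: a minimal sitemap is just a file like the sketch below, following the sitemaps.org protocol (the URL and date here are placeholders, not the site in question). Keeping `lastmod` fresh and resubmitting it in Webmaster Central gives the crawler a nudge after a server move.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category/widgets.html</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
</urlset>
```

One `<url>` entry per page; regenerate the file whenever pages are added or changed.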
Yer, that's the problem. I checked multiple data centres, and the guy is seeing a lot less traffic as well. I am wondering if it's an issue with the lack of unique content, and maybe Google has made a recent algo change that filters this? Mike
The problem is that we never know whether it's Google's normal indexing and shedding cycles or a new trend. I would agree with the others and say sit tight for a little while. Basically, if you have a quarter of a million very similar pages (both to each other and to other sites) and very little unique content, I think that even if not this time, things are liable to get harder as time goes by. Most of these pages would very likely have been supplemental anyway, wouldn't they? 50k pages is still a lot of ranking power if it's the best 50k of the site and it works well.
Noindex on categories? And duplicate content? Well, there's not much info to go on, but I would definitely bet on those two points.
You said they duplicated their whole site a while ago? If I understood that correctly, then yeah, I'm pretty sure that could cause pages to drop out for a little while.
What do people think about Google Sitemaps? Any opinions as to whether this could be part of the problem, rather than part of the solution?