Why is there so much fluctuation in the number of pages indexed by Google? It decreases and then gets back to normal, then decreases again, and so on.
Today it's around 30% lower. I am just hoping the count goes back up once some of the new content is added and, of course, those de-indexed pages are re-included as well.
Same here, a lot of fluctuation, but the good thing is that the traffic wasn't affected. So if your traffic still isn't affected, don't worry about it.
I usually find it depends on the size of your website: with a smaller site there's less fluctuation between datacenters, whereas with a much larger site you'll generally see bigger fluctuation as the data is pushed out at different times. For example, one datacenter may have indexed your new pages and be showing 2,500, while another datacenter may not have updated yet and might show 2,300 pages.
Of course it will be affected. If you suddenly get a 30% drop in the number of pages indexed, then you are bound to get less traffic from Google, because those pages will not be included in the search results. Isn't that correct?
Like I said, it depends on the datacenters: there may be a 30% drop in indexed pages on your datacenter, but the others may still have the full amount indexed. You'd not really notice a drop in traffic in that case. If it was a 30% drop in indexed pages right across the board, then you'd definitely notice a drop in traffic. I usually use http://www.yourcache.com to check across datacenters. There are other sites out there too that can monitor the number of pages you have indexed.
My sites have big variance across datacenters, some by a 50% difference, but I don't notice any change in traffic. I'm also tracking a competitor's site; it lost some 50% of its pages, and that's across all datacenters. But one thing needs to be said: once your site exceeds 1,000 pages, you can't know how many more there are after the first 1,000. The count that Google shows is only an estimate, NOT an exact value. That's why Google says 'Results 1 - 10 of about'. I have one site that, when I search, shows more pages than it actually has; Google shows about 57% more.
I am seeing a drop right across datacentres. I have seen this before. It is usually quite temporary; before long, more pages are indexed than prior to the de-indexing, i.e. the new pages of content are also added. What are the exact reasons for de-indexing right across datacentres?
Can you tell me how many pages you have indexed? I had 15k pages indexed and it used to fluctuate by 30% every now and then. It's kinda difficult to say whether the pages get dropped from the index (I don't think so) or just aren't counted in the SERPs (again, I don't think so). Maybe the pages remain in the index and it's just that the Google datacenters show some wrong data. As long as the SERPs are not dropping, I think our pages are there.
I really don't rely on Google for these things; they are notoriously unreliable. If your visitor numbers drop, that's a good sign there are indexing problems. If a section of your site seems to be missing, try searching for a phrase you know is on that page, to be sure the page(s) really are not indexed. If not, it's time for renewed (quality) link building and perhaps a redesign that fixes problems with internal linkage (are all pages within 3 links of the main page?). If there is no consistency in the pages that are being dropped, don't worry about it too much.
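Since the internal-linkage question ("are all pages within 3 links of the main page?") can be checked mechanically, here is a minimal sketch of a breadth-first crawl that measures click depth from the homepage. It uses only the Python standard library; START_URL, MAX_PAGES, and the 3-click threshold are placeholders you'd change for your own site, and a real crawler would also respect robots.txt, throttle its requests, and handle canonical/relative URL quirks more carefully.

```python
# Rough sketch: breadth-first crawl from the homepage, recording how many
# clicks away each internal page is. Pages more than 3 clicks deep (or not
# reachable at all) are the ones worth fixing first.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
import urllib.request

START_URL = "http://www.example.com/"   # placeholder: your homepage
MAX_PAGES = 500                          # safety limit for the crawl

class LinkParser(HTMLParser):
    """Collects the href values of all <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def internal_links(url):
    """Fetch a page and return the internal links found on it."""
    try:
        html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
    except Exception:
        return set()
    parser = LinkParser()
    parser.feed(html)
    host = urlparse(START_URL).netloc
    found = set()
    for href in parser.links:
        absolute = urljoin(url, href).split("#")[0]
        if urlparse(absolute).netloc == host:
            found.add(absolute)
    return found

# Standard BFS: the first time we discover a page, we know its click depth.
depth = {START_URL: 0}
queue = deque([START_URL])
while queue and len(depth) < MAX_PAGES:
    page = queue.popleft()
    for link in internal_links(page):
        if link not in depth:
            depth[link] = depth[page] + 1
            queue.append(link)

deep_pages = [url for url, d in depth.items() if d > 3]
print(f"Crawled {len(depth)} pages; {len(deep_pages)} are more than 3 clicks deep:")
for url in sorted(deep_pages):
    print(" ", url, "->", depth[url], "clicks")
```

Pages that never show up in the crawl at all are the other thing to look for: if your own crawler can't reach them from the main page, a search engine spider probably has trouble too.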
Is there a formula (I doubt it) for having all your pages indexed in Google? Are there people here that have pretty much all their pages indexed in Google? If so, how have you been able to maintain this?
Yes, you can use Google's Webmaster Tools to get a full list of indexed pages and other very important stats about your site.
I've got pretty much all pages indexed by Google. Some pages aren't, though. The ones that are not indexed are (at the moment):
- not unique (an older site has these articles too)
- very short quotes

Recently I also didn't see my newsletter indexed. This bothered me, so I made the links on my sitemap one level deeper for the newsletter.

Some metrics that may have something to do with this:
- old site (2001 domain)
- old links to the site (DMOZ since 2001)
- lots of Wikipedia links to the site
- mostly unique content (like 95%)
- some edu links to internal pages (not many, but a few)
- have been adding a minimum of 1 article a week for the last few months. Not sure whether this has anything to do with it though, because before that I didn't add anything quite as often and still the whole site was indexed.
- no sitewides to the site
- very few sold links on the site
- no links to the site were ever bought
- a few reciprocal links where it made sense
- have been link-building (in the sense of asking people to link to my site) since I started
- gets stumbled a lot (in 2006, not before that)

All in all - I don't think it's easy to replicate with a new site.
The point is that the list of indexed pages isn't actually correct, just like the list of links to your site isn't actually complete. Webmaster Tools is useful for finding out what errors there are on your site, what keywords are associated with your site, and what keywords your site is found for. But it is not very useful for the kind of question people are asking here, like: how many pages are indexed?