Anyone else notice that using the site:my_domain.com operator tends to give results where Google SAYS that, say, 1,000 pages have been indexed, but on examination you find that only 25 result pages (245 site pages) are publicly accessible? Does this mean that Google is aware of 1,000 pages of your site, but only 245 have been approved for public consumption?
I think I can spot the problem. Google (like us) has No Idea what your website URL is - so the pages are falling out of the index? Now, how about you give us the website URL, then we can have a look and possibly help? Off the top of my head... 1) How/Where did Google tell you all those pages were indexed? 2) How/Where are you now being told a different number? 3) Do you have Unique/Original Content? 4) Does the site have thin/low content and lots of adverts? etc. The more you can tell/give us - the more we may be able to do.
This has been standard Google behavior for years. You can test it with any kind of search query: the result is the same (it claims to have found many thousands of results, but browsing further you can only find a fraction of them). If you were an expert in the database domain, you could easily answer why this happens: counting all found results accurately is too expensive, so Google made a standard tweak: it optimized the count function so that it doesn't eat much hardware resource and gives some decent value, but with low accuracy.
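To illustrate the idea (not Google's actual implementation, just a toy sketch of the trade-off): instead of scanning every document to get an exact hit count, an engine can scan a small sample and extrapolate, which is far cheaper but only approximately right. All names and numbers below are made up for the example.

```python
import random

random.seed(42)
# Simulated index: 1,000,000 documents, roughly 1% match the query.
docs = [random.random() < 0.01 for _ in range(1_000_000)]

def exact_count(matches):
    """Accurate but expensive: touches every document."""
    return sum(matches)

def estimated_count(matches, sample_size=10_000):
    """Cheap approximation: scan only a sample and extrapolate."""
    sample = matches[:sample_size]
    match_rate = sum(sample) / sample_size
    return round(match_rate * len(matches))

print(exact_count(docs))      # true match count: scans 1,000,000 entries
print(estimated_count(docs))  # rough estimate: scans only 10,000 entries
```

The estimate is usually in the right ballpark, but it can easily be off by a noticeable margin, which matches the behavior people see with "About 1,000 results" headers.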
Thanks. I didn't know if it was an estimate or a stated difference between pages indexed and pages published.
No. If Google says your website has 1,000 pages indexed, then it does. However, they are never all shown, and I have experienced the same thing. I think deep link building could help your inner pages become more important and get shown via the site: search operator.