Hi, can anyone help? I have a site I'm constantly adding pages to; I submit 33 sitemap.xml.gz files plus one sitemap.xml. I've been resubmitting my sitemaps to Google for about 4 months, and up until recently the 'Indexed URLs' count kept going up. When I had 1,296,796 pages, Google had indexed about 100,870. When I searched site:agentworld.com in Google a month ago it showed about 174,000 results, but today it shows 144,000. Crawler access is fine, and the only issues reported are 14 short meta descriptions. What am I doing wrong? Thanks
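(In case it helps to see the setup: a sitemap.xml that acts as an index for gzipped sitemaps normally looks roughly like the sketch below. The file names and dates here are only placeholders, not my actual files.)

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <sitemap> entry per gzipped file; each .xml.gz can list up to 50,000 URLs -->
  <sitemap>
    <loc>http://agentworld.com/sitemap1.xml.gz</loc>
    <lastmod>2011-05-01</lastmod>
  </sitemap>
  <sitemap>
    <loc>http://agentworld.com/sitemap2.xml.gz</loc>
    <lastmod>2011-05-01</lastmod>
  </sitemap>
  <!-- ... and so on through sitemap33.xml.gz ... -->
</sitemapindex>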
My travel guide blog has a problem with Google indexing. It used to have 800 pages indexed in Google, but now it only has 160. I think the reason is that Google sees internal duplicate content on my blog. I've noticed that my tag pages in particular are not indexed well; Google probably considers the content of any two of them nearly identical. For example, before I added Amazon products, the only real difference between http://etravelogy.com/tag/music-boxes/ and http://etravelogy.com/tag/official-languages-of-switzerland/ was the title tag. So now I add Amazon products based on each tag keyword with my own plugin.
If all your categories are getting crawled by Google, then you are doing nothing wrong. Just keep doing what you're doing; it takes time for Google to crawl that many pages. Also check your robots.txt and edit it so it doesn't restrict Googlebot.
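For example, a minimal robots.txt that leaves Googlebot (and every other crawler) unrestricted and points crawlers at your sitemap could look like the lines below; the sitemap URL is just an example, so adjust it to wherever your index actually lives:

User-agent: Googlebot
Disallow:

User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml

An empty Disallow line means "block nothing", so nothing on the site is off limits to the crawler.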
It is a wonderful site, cool and easy to navigate. Don't worry, you'll get there very soon. Have a look at mine at http://www.hiwaar.com and tell me what you think. It is still messy!
+1. One client's site had pages like that; when we removed them, indexing and rankings improved almost instantly.