Hi, I have a Blogger blog. The total number of posts is 135. I submitted my sitemap (http://www.finestmail.blogspot.com/feeds/posts/default?orderby=updated) to Google, and it shows total submitted links = 26 and indexed links = 9. My question is: how do I submit all 135 links, and how do I get as many of them as possible indexed on Google? Thanks in advance!
I have the same problem... submitted sitemap links are 127 and indexed are only 14. By the way, I have submitted two separate sitemaps: rss.xml?orderby=updated (submitted 26, indexed 1) and rss.xml?redirect=false&start-index=1&max-results=100 (submitted 101, indexed 14). Am I making a mistake somewhere? Anyone? And I have more than 1,000 posts... why is that?
I was hoping someone who knew about indexing would respond. I have 140 pages in my sitemap.xml file but only 9 indexed. I had 10 indexed at one point, but then I submitted a bad sitemap and it dropped to 9. I corrected the problem, but still only 9 pages are indexed. Grrrrr.
I'm in the exact same position as you guys. My sitemap contains a lot of posts, but only 400 have been indexed...
I have a question: in Webmaster Tools, the robots.txt report shows Disallow rules, one for /search and one for Mediapartners-Google. What does this mean? Is it restricting crawling, like a nofollow?
Try searching this on Google; it will list all your pages that Google has indexed: site:http://yourdomainname.com/
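If you want to open that search from a script, the query URL can be built programmatically. A minimal Python sketch (the domain is a placeholder, substitute your own):

```python
from urllib.parse import quote_plus

def google_site_query(domain):
    """Build a Google search URL using the site: operator,
    which lists pages Google has indexed for that domain."""
    return "https://www.google.com/search?q=" + quote_plus("site:" + domain)

# Placeholder domain; replace with your blog's address.
print(google_site_query("yourdomainname.com"))
```

Note that, as a later reply points out, this count is only approximate; Webmaster Tools is the authoritative source.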
When you submit a sitemap, you are telling Google to index those pages, but it is up to Google whether to do so. While Google will in all likelihood index them, you cannot guarantee it, especially when there are a lot of files and large sitemaps. Google has its own criteria for deciding whether a page from a website is worth keeping in its normal index, and submitting a sitemap does not override them. Some situations where Google might not index all the files in a sitemap: 1. When there are files in the sitemap that have no external or internal links at all. 2. When there are pages that are strikingly similar and could be considered duplicate content. 3. Other signals that tell Google the pages aren't relevant or technically correct.
Try to link to your previous posts from your current posts. Use something like a Random Posts or Related Posts widget. This will help Google crawl your previous posts again and index them.
Hi, linking the individual posts to each other is a good way to guide Google toward easy crawling. However, also create a sitemap which carries all of the post links and submit it again (not only the posts which were updated). Then monitor the stats to see how many get indexed by Google. This approach worked for me.
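To cover every post rather than only the most recent ones, the feed can be submitted in chunks using the start-index/max-results parameters shown in an earlier reply. A rough Python sketch that prints one sitemap URL per chunk of 100 posts (the blog address and post count are placeholders):

```python
def sitemap_urls(blog_url, total_posts, page_size=100):
    """Generate paginated Blogger feed URLs covering all posts.

    A single feed page returns at most `page_size` posts, so each
    chunk is submitted as a separate sitemap in Webmaster Tools.
    """
    urls = []
    start = 1
    while start <= total_posts:
        urls.append(
            f"{blog_url}/feeds/posts/default"
            f"?redirect=false&start-index={start}&max-results={page_size}"
        )
        start += page_size
    return urls

# Placeholder blog with 135 posts: this yields two sitemap URLs,
# covering posts 1-100 and 101-135.
for u in sitemap_urls("http://yourblog.blogspot.com", 135):
    print(u)
```

Each printed URL is then submitted as its own sitemap in Webmaster Tools.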
Google may not index some pages because of very similar content, or because the site exceeds Google's page limit for it (which is determined by the number and quality of external links, among other factors). By the way, you should not use site:site.com to count indexed pages; use Google Webmaster Tools instead. A Google employee explains why here: google.com/support/forum/p/Webmasters/thread?tid=7ec5d241be66c155&hl=en