I have a small problem: whenever I write a post on my blog, Google indexes it fairly quickly. However, it seems that Google then drops another of my pages, so I always have (close to) the same number of indexed pages. What would be a good way to get more pages indexed? (I am using the "related posts" plugin, so I do link to inner pages from all my posts.)
The best way is to link your non-indexed pages from pages that are already indexed. This helps the crawler discover the non-indexed pages, and it also strengthens your internal linking. Write posts frequently, so crawlers come back and crawl them quickly.
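For example (the URL below is a made-up placeholder), an in-content link from an already-indexed post to a non-indexed one is just an ordinary HTML anchor:

    <a href="http://yourblog.com/older-unindexed-post/">my older post on this topic</a>

When the crawler revisits the indexed post, it can follow that link and discover the target page.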
I am already using a sitemap and have submitted it to several search engines. I know the importance of linking from other posts. I was hoping the "related posts" plugin would be enough, but maybe it sits too low on the page to be given enough weight. Would it be a good idea to update my posts with some links to other posts inside the content itself? I value your ideas.
Maybe you are using WordPress or something similar. When you publish a new article it appears on your main page, so it gets indexed along with the main page, while the oldest article on that page gets pushed onto the next (archive) page, which is not indexed.
There are many ways to get indexed, among them article submission, forum participation, social bookmarking, and social networking. These will also drive some traffic to your site.
Very true. My traffic does go to various posts, but I am sure I am missing out on a lot of SE traffic. I can see from the replies that I am on the right path. Maybe time and some patience will do the trick.
What changes have you made to the site recently? They could be affecting the way Googlebot crawls your site. For example, if you have accidentally put rel=nofollow on your navigation menu, added noindex to some pages, or blocked URLs in robots.txt, this will affect how your site is crawled (see the examples at the end of this post).

One of the things you should definitely avoid is duplicate content: Googlebot's crawling frequency, and how deep it crawls, depend on content quality and uniqueness. In my experience, I have worked with a 30,000-URL website with an excellent sitemap, an excellent web development team, and links pointing to it. Google still would not index those pages, because it turned out the URLs were more than 75% similar to one another. What we suggested was to add more content, such as unique product descriptions of at least 100 to 500 words. The result was that Google started crawling URLs that had never been crawled before.

I have read from several sources, including Google themselves, that they discourage duplicate content, especially for blogs and similar sites, because crawling those pages consumes a large amount of bandwidth. That is why they tend to prioritize crawling important pages with substantial content: it saves bandwidth on your site and increases crawling efficiency on their side.
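To make the accidental-blocking checks mentioned at the start of this post concrete, here is a rough sketch of the three patterns to look for (the paths and markup are made up, just for illustration):

    # in robots.txt -- this line would block crawling of everything under /blog/
    Disallow: /blog/

    <!-- a meta robots tag like this in a page's <head> keeps the page out of the index -->
    <meta name="robots" content="noindex">

    <!-- rel=nofollow on menu links tells Googlebot not to follow them -->
    <a href="/archive/" rel="nofollow">Archive</a>

If any of these show up where you did not intend them, remove them before trying anything else.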
OK, take some of your site's sub-pages, like 123dotcom/345.html and 123dotcom/567.html, and submit them to some social bookmarking sites. See if the spider crawls those pages. If not, post some links to them on your current blog.