I have a challenge that I'm trying to solve with my websites regarding content. I know it is important to have lots of well-written, targeted content on your site, but I have a question about structuring it so that it can all be indexed by Google.

About a year ago, I built a site that is doing well, and my main articles page (which can be accessed from the menu bar) has about forty links to forty different articles. Each of these forty articles contains links to other articles that are buried two clicks within the site and are not accessible from the main navigation page. I found out about a month ago that 85 of my pages have not been indexed by Google, even though I published them on the web months ago. Therefore, I've come to the conclusion that articles buried more than two clicks into the site may not be spidered and indexed by Google. So now I've set up Google Webmaster Tools and submitted a sitemap with all 85 pages to Google. However, Google has only indexed 14 pages from the sitemap, and it is taking forever to index all 85. Sometimes it shows 17 of these 85 pages indexed, then it falls back to 14.

I'm now building a new site, and my question is this: how can I add 250 articles to my website in a way where Google can NATURALLY index all the articles, without having to rely on Google Sitemaps? If I put more than 100 links to articles on a single page, they will not all be spidered. This is a challenge that I've been dealing with for years. Having lots of good content leads to lots of traffic, but it does you no good if Google doesn't index it all. So how can I add 200+ articles to my site in a manner where the Google spider can crawl and index them naturally, and I don't need to rely on GWT? Another question I have is this: how much content is enough?
I don't think so. Build an XML sitemap and ping it to Google every time you make a change. I am using WordPress and an XML sitemap plugin that notifies Google of every update, and the result: all pages indexed.
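If you're not on WordPress, here's roughly what such a plugin does behind the scenes: it generates a sitemap.xml from your article URLs and pings Google's sitemap endpoint after every change. A minimal sketch in Python (the site address, article URLs, and file path are placeholders, and the old ping endpoint may not be supported forever):

```python
# generate_sitemap.py - rough sketch of "build a sitemap and ping Google on every change"
# (the site address and article URLs are placeholders; adjust for your own site)
from urllib.parse import quote
from urllib.request import urlopen
from xml.sax.saxutils import escape

SITE = "http://www.example.com"
ARTICLE_URLS = [f"{SITE}/articles/article-{i}.html" for i in range(1, 86)]

def build_sitemap(urls, path="sitemap.xml"):
    """Write a minimal sitemap.xml listing every article URL."""
    entries = "\n".join(f"  <url><loc>{escape(u)}</loc></url>" for u in urls)
    xml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>\n"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(xml)

def ping_google(sitemap_url):
    """Notify Google that the sitemap changed (the classic ping URL; it may be retired)."""
    urlopen("http://www.google.com/ping?sitemap=" + quote(sitemap_url, safe=""))

if __name__ == "__main__":
    build_sitemap(ARTICLE_URLS)
    ping_google(f"{SITE}/sitemap.xml")
```

Run it whenever you publish a new article, so the sitemap and the ping always reflect the current list of pages.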
Use this tool: http://blogsearch.google.com/ping. Just add the URL of the page you need indexed and ping it! It will surely work!
The way I know how to do it is to take the content off the site, wait about a month or two, then upload it again with a new sitemap and new external links. Remember to take all the links out of the text itself.
So you are saying that when you submit a sitemap to Google, none of the articles submitted can have links in the body?
I used this for my site and it worked well! My site: http://www.sportsstatsblog.com/ Better yet, try it yourself! You will get results within a few hours! My experience has been great: yesterday I pinged it and within 2 hours found my article indexed! You can check my indexed pages!
If your content is original, attractive, and regularly updated, then it will most likely be indexed by Google. Try it.
This was also my problem and I haven't got it solved completely yet. It has been solved for some pages, though, and these tricks may help you.

1. Only link related content between inner pages, and don't overdo it. I have figured out that Google does not like sitewide internal linking to too many pages, and it does not like "uneditorial" inner-page linking either. You should only link related content, and not drop in a bunch of links just because you want those pages indexed. The optimum number is no more than 5 per page (see the sketch after this list).

2. Build deep links. Sitemap submission can help, but it is not enough in some cases. You have to build deep links to those particular inner pages. I would also suggest getting links from within content instead of sitewide; it looks more natural.
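To make point 1 concrete, here is a rough sketch of how a "related articles" block could be capped at the five most closely related posts, using shared tags as the relatedness signal (the articles and tags are invented for illustration):

```python
# related_links.py - sketch of capping in-content links at ~5 related articles per page
# (the article list and tags are made up for illustration)

ARTICLES = {
    "/articles/on-page-seo.html":      {"seo", "content", "keywords"},
    "/articles/sitemap-basics.html":   {"seo", "sitemaps", "crawling"},
    "/articles/internal-linking.html": {"seo", "crawling", "links"},
    "/articles/link-building.html":    {"links", "outreach"},
    "/articles/keyword-research.html": {"keywords", "content"},
}

def related_links(url, max_links=5):
    """Return up to max_links other articles, ranked by how many tags they share."""
    tags = ARTICLES[url]
    scored = [
        (len(tags & other_tags), other)
        for other, other_tags in ARTICLES.items()
        if other != url
    ]
    scored = [s for s in scored if s[0] > 0]        # only genuinely related pages
    scored.sort(key=lambda s: s[0], reverse=True)   # most tags in common first
    return [other for _, other in scored[:max_links]]

print(related_links("/articles/internal-linking.html"))
```

Keeping the cap low like this makes the in-content links look editorial rather than like a link dump.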
Matt said in a recent interview that the number of pages indexed is roughly proportional to PR. So even though PR is a bit outdated, building it up might be useful if you have lots of pages you need indexed. If it is a "normal" website it shouldn't be too much of a problem though; just be sure to link in a way Google can follow (and use an HTML sitemap as well as an XML one).
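An HTML sitemap can be as simple as a plain page of links generated from the same article list that feeds the XML sitemap. A quick sketch, with placeholder titles and URLs:

```python
# html_sitemap.py - sketch of generating a plain HTML sitemap page from the article list
# (titles and URLs are placeholders)
from xml.sax.saxutils import escape

ARTICLES = [
    ("Article One", "/articles/article-1.html"),
    ("Article Two", "/articles/article-2.html"),
    # one entry per article on the site
]

def build_html_sitemap(articles, path="sitemap.html"):
    """Write a simple page of links so every article is one click from this page."""
    items = "\n".join(
        f'    <li><a href="{escape(url)}">{escape(title)}</a></li>'
        for title, url in articles
    )
    html = (
        "<!DOCTYPE html>\n<html><head><title>Site Map</title></head><body>\n"
        "  <h1>Site Map</h1>\n  <ul>\n"
        f"{items}\n  </ul>\n</body></html>\n"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(html)

build_html_sitemap(ARTICLES)
```

Link that page from your footer or menu and every article stays within two clicks of the home page.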
I have another idea. I currently have about 40 links to 40 different articles on my "articles" page, and most of them are indexed. What would happen if I took, say, 10 of the articles from that page that have already been indexed and buried them deeper in the site, while still keeping them linked from the 30 main articles, and then, when I write 10 new articles, put those on the main articles page so the Google spider can easily find them? Do you think this would work? My only concern is that if Google cannot find the 10 articles I buried, it may remove them from the index. What do you think?
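To make the idea concrete, the rotation I have in mind would work something like this, with the newest articles on the main page and the older ones linked from within those articles so they stay within two clicks of the menu (the counts and names are just placeholders):

```python
# rotation_sketch.py - sketch of the rotation idea: newest articles on the main page,
# older ones linked from inside those articles so they remain reachable
# (the article count and the split are placeholders)

articles = [f"article-{i}" for i in range(1, 51)]   # newest first, e.g. 50 articles

MAIN_PAGE_SLOTS = 40                 # keep roughly 40 links on the main articles page
main_page = articles[:MAIN_PAGE_SLOTS]
buried = articles[MAIN_PAGE_SLOTS:]

# Spread the buried articles across the main-page articles as in-body links,
# so each buried page is still one click away from a page on the main menu.
links_from = {a: [] for a in main_page}
for i, old in enumerate(buried):
    host = main_page[i % MAIN_PAGE_SLOTS]
    links_from[host].append(old)

for host, linked in links_from.items():
    if linked:
        print(f"{host} links to: {', '.join(linked)}")
```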
I know you're delving into more sophisticated linking strategies here, but please don't forget to ping your site whenever you make a change or add new content to it, and use social bookmarking on those pages as well. It's easy to overlook the simple things that need to be done repeatedly, but I have no doubt this will show results if you add good, search-engine-friendly content to your site consistently over time.
Hi, I also tried the blogsearch.google.com/ping tool and it worked very well for my small website. It seemed to get all of the pages indexed quickly for me, although as I say it was quite a small site with fewer than 30 pages, so I guess it's possible the results could be more hit-and-miss for bigger sites.
If you add quality content to your site frequently and then bookmark the article URLs, you will definitely get good results and also increase your indexing speed.