When Google/Yahoo spiders visit our website (or blog/forum), is there any way to ensure that all (or at least most) pages get indexed by them? Does it depend on how the website is designed? Any fundamentals? Experts, please elaborate, thanks!
Make sure you have a sitemap if you are running a website. For blogs, especially if you are running a WordPress blog, I don't find a sitemap strictly necessary, but some people use sitemaps with WP blogs as well, and it doesn't hurt. Also make sure you have plenty of backlinks through which search engine bots can find your site and crawl it on a frequent basis. The more often the SE bots visit your site, the better your chances of getting the whole site indexed. Submitting to top bookmarking sites can help you get indexed faster.
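To illustrate the sitemap point above, here is a minimal sketch of generating a basic `sitemap.xml` in the standard sitemaps.org format, so bots have a single file listing every page. The URLs and the `build_sitemap` helper are placeholders for illustration, not from any particular tool:

```python
# Sketch: build a minimal sitemap.xml listing the pages you want crawled.
# The page URLs below are hypothetical examples.
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a sitemap.xml string for the given list of page URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>"
    )

pages = [
    "https://www.example.com/",
    "https://www.example.com/about",
    "https://www.example.com/blog/first-post",
]

print(build_sitemap(pages))
```

Save the output as `sitemap.xml` at your site root and reference it from `robots.txt` (a `Sitemap:` line) or submit it through the search engines' webmaster tools so the bots know where to find it.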