hi all, I am facing a huge problem with Google. Googlebot crawled my site, and after 10 days all of its pages were cached. I then added 40 more pages and linked them from the index page. Googlebot crawled my index page, but even after 15 days those 40 pages have not been cached. All the pages are interlinked with each other. I have done link exchanges and directory submissions, and I submitted a sitemap to Google Sitemaps. What else do I have to do to get the pages cached? Kindly help me, anybody. Thanks in advance.
It takes time for Google to cache a large number of pages, and linking to your internal pages does wonders. Also make sure the content is original and not a straight copy of another site, as Google may decline to cache duplicate content.
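Since you mentioned submitting a sitemap: it is worth double-checking that the file follows the Sitemaps XML format, since a malformed file will be ignored. A minimal valid sitemap looks roughly like this (the example.com URLs are placeholders for your own pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is required, <lastmod> is optional -->
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-01-01</lastmod>
  </url>
  <url>
    <loc>http://www.example.com/page1.html</loc>
  </url>
</urlset>
```

List all 40 of your new pages in it and resubmit through Google Sitemaps, so the crawler does not have to rely only on the links from your index page.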
RSS is a format for syndicating news and the content of news-like sites, including major news sites like Wired, news-oriented community sites like Slashdot, and personal weblogs. But it's not just for news. Check here for details: http://en.wikipedia.org/wiki/RSS_(protocol).
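To give you an idea of the format, a minimal RSS 2.0 feed looks roughly like this (the titles and example.com URLs are placeholders, not real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <!-- The channel describes the site; each <item> is one page or post -->
    <title>Example Site</title>
    <link>http://www.example.com/</link>
    <description>Latest updates from an example site</description>
    <item>
      <title>First post</title>
      <link>http://www.example.com/first-post.html</link>
      <description>A short summary of the post.</description>
    </item>
  </channel>
</rss>
```

Publishing a feed like this when you add new pages gives crawlers and aggregators another way to discover them quickly.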