Hello, my site's content is not being cached by search engines. Is there a way to alter robots.txt so that search engines will cache it? If so, how do I do that? The URL is: http://www.ksmwebmedia.com
Hi Prasseo, As far as I'm aware, you cannot use robots.txt to instruct search engines to cache your website's pages. What it does do is tell search engines which pages they can and cannot visit: http://en.wikipedia.org/wiki/Robots_exclusion_standard What I'd recommend is configuring your robots.txt to allow search engines to visit your pages; in time they will crawl and cache them. How long this takes depends on the search engine. Make sure you submit your website URL to the search engines. Also, if you set up webmaster tools on Google and Bing, you can submit a sitemap, which will help your pages get indexed faster. Hope this helps.
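For reference, here's a minimal robots.txt sketch that allows all crawlers to visit every page and points them at a sitemap (the sitemap URL is just an example for your domain; adjust it to wherever your sitemap actually lives):

```
# Allow all crawlers to access the whole site
User-agent: *
Disallow:

# Optional: tell crawlers where your sitemap is
# (assumes a sitemap.xml at the site root)
Sitemap: http://www.ksmwebmedia.com/sitemap.xml
```

An empty `Disallow:` line means nothing is blocked. Note this only permits crawling; whether and when a search engine caches your pages is up to the engine itself.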