Hi webmasters, I don't want crawlers to read some of my site's content, because it is repeated on almost all of my pages. Is there any way to stop that text from being crawled? I don't want to use frames. Please suggest. Thanks
You can use a robots.txt file to tell bots which pages not to crawl, or place meta tags on those pages.

robots.txt:

User-agent: *
Disallow: /file1.html
Disallow: /file2.html
Disallow: /file3.html

Meta tag:

<meta name="robots" content="noindex,nofollow">
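If you want to sanity-check rules like those before deploying them, Python's standard-library `urllib.robotparser` interprets robots.txt the way most well-behaved bots do. A minimal sketch (the file names and domain are just placeholders from the example above):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content mirroring the example rules
rules = """\
User-agent: *
Disallow: /file1.html
Disallow: /file2.html
""".splitlines()

rp = RobotFileParser()
rp.parse(rules)  # parse rules directly instead of fetching them

# can_fetch(user_agent, url) checks the URL's path against the rules
print(rp.can_fetch("*", "http://example.com/file1.html"))  # → False (blocked)
print(rp.can_fetch("*", "http://example.com/other.html"))  # → True (allowed)
```

Note that this only tells you whether a compliant crawler would fetch the page; robots.txt can't hide part of a page's text while allowing the rest.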
I think when a bot indexes a page, it grabs all of the page's content. I'm not sure it's possible to let the bot index the file but skip part of its text. Either it indexes the file (and grabs everything on it), or it doesn't crawl it at all.