Don't want indexing!

Discussion in 'robots.txt' started by webmaster, Mar 16, 2006.

  1. #1
    Hi webmasters,

    I don't want crawlers to read some of my site's content, because it is used on almost all of the pages.
    Is there any way to stop that text from being crawled? I don't want to use frames.

    Please suggest.

    Thanks_
     
    webmaster, Mar 16, 2006 IP
  2. Slapyo

    #2
    You can use a robots.txt file to tell the bots what pages not to crawl. You can also place meta tags on those pages.

    robots.txt
    User-agent: *
    Disallow: /file1.html
    Disallow: /file2.html
    Disallow: /file3.html
    Code (markup):
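
    If the shared text lived in its own files, one option would be to keep them all in one directory and disallow that whole directory (the /includes/ path here is just an example, not a required name):
    User-agent: *
    Disallow: /includes/
    Code (markup):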
    Meta tag
    <meta name="robots" content="noindex,nofollow">
    Code (markup):
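
    For reference, a minimal sketch of where that meta tag would go, assuming an ordinary HTML page:
    <html>
    <head>
      <title>Example page</title>
      <meta name="robots" content="noindex,nofollow">
    </head>
    <body>
      <!-- page content -->
    </body>
    </html>
    Code (markup):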
     
    Slapyo, Mar 16, 2006 IP
  3. webmaster

    #3
    Hi,

    I want the page to be indexed, but not that content.
    Are there any tactics for this?

    Please suggest.

    Thanks_
     
    webmaster, Mar 16, 2006 IP
  4. Slapyo

    #4
    I think when the bot indexes the page it grabs the content. I'm not sure it is possible to tell the bot to index the file but not the content. It either indexes the file (and grabs the content) or doesn't grab it at all.
     
    Slapyo, Mar 16, 2006 IP
  5. webmaster

    #5
    Thanks a lot for replying, but I still haven't got my answer.
     
    webmaster, Mar 26, 2006 IP