Can I Disallow A Page

Discussion in 'robots.txt' started by ranjuse, Jul 10, 2008.

  1. #1
    Hi,

    Can I disallow a particular page by using robots.txt?

    Please provide me the code :)
     
    ranjuse, Jul 10, 2008 IP
  2. manish.chauhan

    manish.chauhan Well-Known Member

    Messages:
    1,682
    Likes Received:
    35
    Best Answers:
    0
    Trophy Points:
    110
    #2
    Yes... you can simply use this code:

    User-agent: *
    Disallow: /your-private-directory-or-page

    (Replace the path with your own directory or page.) For more information, search for the robots.txt documentation. :)
     
    manish.chauhan, Jul 11, 2008 IP
  3. unmicdrac

    unmicdrac Guest

    Messages:
    2
    Likes Received:
    1
    Best Answers:
    0
    Trophy Points:
    0
    #3
    Or you can use Google Webmaster Tools to block that page from Google's spiders only.
     
    unmicdrac, Jul 12, 2008 IP
  4. david432111

    david432111 Peon

    Messages:
    28
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #4
    What manish said.
     
    david432111, Jul 20, 2008 IP
  5. proson

    proson Well-Known Member

    Messages:
    573
    Likes Received:
    8
    Best Answers:
    0
    Trophy Points:
    130
    #5
    or you can add <meta name="robots" content="noindex"> in your web page's head section
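
    For context, a minimal sketch of where that tag goes (the title and body content here are placeholders):

    ```html
    <!DOCTYPE html>
    <html>
    <head>
      <!-- Tells compliant crawlers not to index this page -->
      <meta name="robots" content="noindex">
      <title>Private Page</title>
    </head>
    <body>
      ...
    </body>
    </html>
    ```

    Note that unlike robots.txt, the meta tag only works if the crawler actually fetches the page first.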
     
    proson, Jul 20, 2008 IP
  6. catanich

    catanich Peon

    Messages:
    1,921
    Likes Received:
    40
    Best Answers:
    0
    Trophy Points:
    0
    #6
    The following should work for you

    User-agent: *
    Disallow: /a-directory/
    Disallow: /a-file.html
    Disallow: /a-directory/a-file.html
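
    If you want to sanity-check rules like these before deploying them, Python's built-in robots.txt parser can do it. A quick sketch (the domain and paths below are just placeholders matching the example above):

    ```python
    # Check robots.txt rules with Python's standard-library parser.
    from urllib.robotparser import RobotFileParser

    rules = """\
    User-agent: *
    Disallow: /a-directory/
    Disallow: /a-file.html
    """

    rp = RobotFileParser()
    rp.parse(rules.splitlines())

    # Blocked by the Disallow lines:
    print(rp.can_fetch("*", "http://example.com/a-file.html"))      # False
    print(rp.can_fetch("*", "http://example.com/a-directory/x"))    # False
    # Not matched by any rule, so allowed:
    print(rp.can_fetch("*", "http://example.com/other-page.html"))  # True
    ```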
     
    catanich, Aug 21, 2008 IP