How does robots.txt work?

Discussion in 'robots.txt' started by dodot, Dec 26, 2008.

  1. #1
    Can anybody tell me how exactly robots.txt works? I tried the wiki but still don't get it.
    I am a newbie :(
     
    dodot, Dec 26, 2008 IP
  2. ajshah.shah

    #2
    Please see www.robotstxt.org. You will find everything you need to know about the robots.txt file there.
     
    ajshah.shah, Dec 26, 2008 IP
  3. j4jango

    #3
    User-agent: *
    Disallow: /

    (Note: this particular rule blocks all robots from crawling the entire site.)
     
    j4jango, Jan 6, 2009 IP
  4. pageloadtime

    #4
    Here is some help from Google:

    google.com/support/webmasters/bin/answer.py?hl=en&answer=40360
     
    pageloadtime, Jan 19, 2009 IP
  5. manish.chauhan

    #5
    robots.txt is a plain-text file placed in the root directory of a website that tells search engine crawlers which parts of the site they should not crawl. If you have private files or folders on your website and you don't want those pages to show up in search engine result pages, you can exclude them with rules like:

    User-agent: *
    Disallow: /your-private-folder/

    Keep in mind that robots.txt is only a request: well-behaved crawlers honor it, but it is not an access-control mechanism and does not actually protect the files.
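    To see how a crawler interprets these rules, here is a minimal sketch using Python's standard-library robots.txt parser. The domain and folder name are made up for illustration:

    ```python
    from urllib.robotparser import RobotFileParser

    # A hypothetical robots.txt, matching the example above
    rules = """
    User-agent: *
    Disallow: /your-private-folder/
    """.splitlines()

    rp = RobotFileParser()
    rp.parse(rules)

    # The disallowed folder is blocked for all user agents...
    print(rp.can_fetch("*", "http://example.com/your-private-folder/secret.html"))

    # ...while the rest of the site remains crawlable.
    print(rp.can_fetch("*", "http://example.com/public-page.html"))
    ```

    Running this prints False for the private URL and True for the public one, which is exactly the decision a compliant crawler makes before fetching a page.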
     
    manish.chauhan, Jan 20, 2009 IP