robots.txt

Discussion in 'Search Engine Optimization' started by upendraets, Apr 28, 2011.

  1. #1
How do we make a robots.txt file efficiently?
     
    upendraets, Apr 28, 2011 IP
  2. Pdhailas

    Pdhailas Greenhorn

    Messages:
    26
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    11
    #2
The robots.txt file is used to restrict search engine bots from crawling pages. If you want to keep something out of the crawl, it is the best option.
You can generate a robots.txt file easily in Google Webmaster Tools: specify the pages you want to block from crawling and download the file.
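For example, a minimal robots.txt that blocks all bots from a directory and a single page could look like this (the /private/ and /checkout.html paths are just placeholders, not anything specific to your site):

    User-agent: *
    Disallow: /private/
    Disallow: /checkout.html

The file has to sit at the root of the domain (e.g. http://www.example.com/robots.txt) — bots only look for it there. Also note it only blocks crawling; a blocked URL can still show up in results if other sites link to it.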
     
    Pdhailas, Apr 28, 2011 IP
  3. fetamy

    fetamy Peon

    Messages:
    50
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #3
You can use a robots.txt generator to create the file, e.g. the one at http://tools.seobook.com/robots-txt/
     
    fetamy, Apr 28, 2011 IP
  4. jitendraag

    jitendraag Notable Member

    Messages:
    3,982
    Likes Received:
    324
    Best Answers:
    1
    Trophy Points:
    270
    #4
Robots.txt is quite efficient when it's absent or blank. Not every site needs one.
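To put it another way: an absent or empty robots.txt is equivalent to a file that explicitly allows everything, like this:

    User-agent: *
    Disallow:

An empty Disallow value means nothing is blocked, so all robots may crawl the whole site — which is what you get with no file at all.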
     
    jitendraag, Apr 28, 2011 IP
  5. planetappliance

    planetappliance Peon

    Messages:
    46
    Likes Received:
    0
    Best Answers:
    0
    Trophy Points:
    0
    #5
Web site owners use the /robots.txt file to give instructions about their site to web robots; this is called the Robots Exclusion Protocol.

It works like this: a robot wants to visit a Web site URL, say http://www.example.com/welcome.html. Before it does so, it first checks for http://www.example.com/robots.txt, and finds:

    User-agent: *
    Disallow: /

    The "User-agent: *" means this section applies to all robots. The "Disallow: /" tells the robot that it should not visit any pages on the site.
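Beyond blocking everything, you can have separate sections for specific bots, and most major crawlers also honour a Sitemap line. A sketch (the directory names here are just examples):

    User-agent: Googlebot
    Disallow: /tmp/

    User-agent: *
    Disallow: /cgi-bin/

    Sitemap: http://www.example.com/sitemap.xml

A robot obeys only the most specific User-agent section that matches it, so in this sketch Googlebot is blocked from /tmp/ but not from /cgi-bin/, while every other bot gets the * rules.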
     
    planetappliance, Apr 29, 2011 IP