Is it necessary to write a list of all my pages in a robots.txt?

Discussion in 'robots.txt' started by ecctao, Jan 1, 2008.

  1. #1
    I want robots to be able to crawl all of my site, and there are no secrets on my site.
    I also want the search engines to index most of the pages on my site.
    Would it be better for me to list all my pages in robots.txt?
    Or should I just write "*" or "all"?
     
    ecctao, Jan 1, 2008 IP
  2. r.suhag Active Member
    #2
    You can write all pages in robots.txt, but it is not necessary. This file is used to tell search engine "robots" which pages of your website they are not allowed to crawl.
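
    For example, a minimal robots.txt that lets every robot in and only blocks one hypothetical private directory would look something like this:

    # Applies to all crawlers; the /private/ directory is just an example
    User-agent: *
    Disallow: /private/
    Code (markup):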
     
    r.suhag, Jan 1, 2008 IP
  3. goliathus Peon
    #3
    robots.txt is mainly for when you need to disallow access to a crawler, not to allow it. If you want to have all your pages listed, create a sitemap and put a link to it in robots.txt ...

    In this case robots.txt only needs to contain this line:
    Sitemap: http://www.example.com/sitemap.xml
    Code (markup):
    To learn how to create a sitemap, read the official sitemaps site: http://www.sitemaps.org (a bare-bones example follows below)

    But it's not needed if you have a normal HTML sitemap and/or well-linked inner pages (internal links).
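
    A bare-bones sitemap.xml following the sitemaps.org protocol looks roughly like this (the URL and date are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <!-- one <url> entry per page; example.com and the date are placeholders -->
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2008-01-02</lastmod>
      </url>
    </urlset>
    Code (markup):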
     
    goliathus, Jan 2, 2008 IP
  4. ecctao Peon
    #4
    Thanks to all!
     
    ecctao, Jan 4, 2008 IP
  5. SwapsRulez Peon
    #5
    You can always go for a sitemap and write only the code below in /robots.txt.
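
    Something like this (example.com is just a placeholder; use your own sitemap URL):

    # Allow every crawler and point it at the sitemap
    User-agent: *
    Disallow:
    Sitemap: http://www.example.com/sitemap.xml
    Code (markup):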

    Cheers mate... Peace!
     
    SwapsRulez, Jan 12, 2008 IP