help make a robots.txt

Discussion in 'robots.txt' started by 1rapid, Jun 6, 2010.

  1. #1
    i want to make a robots.txt that allows crawling of:

    /
    /a/
    /b/

    and the things in those folders should be cached.

    but there are other folders like c, d, e, and many more.

    User-agent: *
    Allow: /
    Allow: /a/
    Allow: /b/
    Disallow: /*
    Code (markup):
    does that one do the trick?
     
    1rapid, Jun 6, 2010 IP
  2. whitespparow

    #2
    if you want to allow all those folders in robots.txt, don't specify them individually; just write Allow: /. if you don't want a particular folder crawled, then specify it with Disallow: /foldername.
     
    whitespparow, Jun 8, 2010 IP
  3. 1rapid

    #3
    there are a lot of folders i do not want crawled, more than the ones i do want, like 10 folders, so the version i wrote saves more space.
    will that one work?
     
    1rapid, Jun 8, 2010 IP
  4. manish.chauhan

    #4
    If you want particular folders to be accessible to crawlers while the rest of your website is blocked, you can use the following code:

    User-agent: *
    Allow: /a/
    Allow: /b/
    Disallow: /

    Also, it would be better if you described exactly what you want, so that someone can give you an exact solution.
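
    One way to sanity-check rules like these (not something posted in this thread, just a suggestion) is Python's standard-library urllib.robotparser; the paths below reuse the /a/, /b/, /c/ folder names from the question. Note that parsers following the original spec apply rules in order of appearance (first match wins), while Google prefers the most specific matching path; for this ruleset both behaviors give the same answers.

```python
from urllib.robotparser import RobotFileParser

# The ruleset suggested above, as a string instead of a file on a server.
rules = """\
User-agent: *
Allow: /a/
Allow: /b/
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())  # feed the lines directly, no fetch needed

# can_fetch(useragent, url) reports whether a crawler honoring these
# rules may fetch the given path.
print(parser.can_fetch("*", "/a/page.html"))  # True  (matches Allow: /a/)
print(parser.can_fetch("*", "/b/page.html"))  # True  (matches Allow: /b/)
print(parser.can_fetch("*", "/c/page.html"))  # False (caught by Disallow: /)
```

    This confirms the point of the ruleset: only /a/ and /b/ stay crawlable, and every other folder (c, d, e, and the rest) is blocked without listing each one.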
     
    manish.chauhan, Jun 11, 2010 IP