Robots.txt validation problem in MSN webmaster

Discussion in 'robots.txt' started by e-marketing guru, Jun 23, 2008.

  1. #1
    First, my robots.txt is

    # Alexa
    User-agent: ia_archiver
    Disallow:

    # Ask Jeeves
    User-agent: Teoma
    Disallow:

    # Google
    User-agent: googlebot
    Disallow:

    # MSN
    User-agent: msnbot
    Disallow:

    # Yahoo!
    User-agent: Slurp
    Disallow:

    # Abacho
    User-agent: AbachoBOT
    Disallow:

    # Baidu
    User-agent: baiduspider
    Disallow:

    # Fireball
    User-agent: fireball
    Disallow:

    # ObjectsSearch
    User-agent: ObjectsSearch
    Disallow:

    # Szukacz
    User-agent: szukacz
    Disallow:

    # Voila.fr
    User-agent: VoilaBot
    Disallow:

    # Walhello
    User-agent: Appie
    Disallow:

    # Yandex
    User-agent: Yandex
    Disallow:

    # Others
    User-agent: *
    Disallow:

    Validations Results: (at MSN webmaster tool)

    Line #55: Disallow:
    Error: Conflicting disallow tags found.
    **************************************************
    Warning: 'sitemap' - tag isn't specified.
    **************************************************

    Could you please help me solve this? I am done with the canonicalization issue. So what will the experts here tell me about this? Is it critical to get this solved?
     
    e-marketing guru, Jun 23, 2008 IP
  2. pr0t0n

    pr0t0n Well-Known Member

    Messages:
    243
    Likes Received:
    10
    Best Answers:
    10
    Trophy Points:
    128
    #2
    I'm guessing that the MSN robot found two sections that apply to it, and that's why it reported "Conflicting disallow tags found."
    The first one affecting msnbot is this:
    
    # MSN
    User-agent: msnbot
    Disallow:
    
    The second one is this:
    
    # Others
    User-agent: *
    Disallow:
    
    I'm not sure whether every SE bot checks for disallow conflicts, but if they do, each of them will report the same thing, since that last section with User-agent: * also matches every bot already listed above. You should probably consider removing some duplication from your robots.txt.
    If you need to specify different disallow rules for particular bots, then remove just the "Others" section. If you don't need different rules per bot, then you can remove everything BUT the "Others" section.
    From what you pasted above, nothing is disallowed for any bot, so you can simply keep the "Others" section and remove everything else.
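    If you want to sanity-check how a crawler resolves those groups, Python's standard-library robots.txt parser is a quick way to try it. This is just a sketch: the /private/ path and example.com URL are made up for illustration, not taken from the OP's site. It shows that a bot with its own User-agent section obeys that section rather than the * group:

    ```python
    from urllib.robotparser import RobotFileParser

    # Hypothetical simplified robots.txt: msnbot gets its own (empty) rule,
    # everyone else falls under the * group, which blocks /private/.
    robots_txt = """\
    User-agent: msnbot
    Disallow:

    User-agent: *
    Disallow: /private/
    """

    rp = RobotFileParser()
    rp.parse(robots_txt.splitlines())

    # msnbot matches its own group, where the empty Disallow allows everything.
    print(rp.can_fetch("msnbot", "http://example.com/private/page"))     # True

    # Googlebot has no dedicated group here, so the * group applies and
    # /private/ is off limits.
    print(rp.can_fetch("Googlebot", "http://example.com/private/page"))  # False
    ```

    In other words, a bot picks the most specific matching User-agent group, which is why duplicating an empty Disallow for every named bot on top of the * group adds nothing except validator complaints.
    
    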
     
    pr0t0n, Jul 5, 2008 IP