Disallow dash in robots.txt

Discussion in 'robots.txt' started by medmarketplace, Nov 19, 2008.

  1. #1
    Hello,

    I would like to use a robots.txt file to "disallow" all pages whose URL contains a dash (-) character, regardless of whether they are files or folders. I created the following robots.txt file, but the robots.txt checkers I ran it through reported a syntax error. Can someone please help me? Thanks.

    User-agent: *
    Disallow: -
     
    medmarketplace, Nov 19, 2008 IP
  2. LawnchairLarry

    LawnchairLarry Well-Known Member

    #2
    Adding a bare dash doesn't work: a Disallow path has to start with a slash, and the original robots.txt standard has no wildcard matching, so you need to explicitly list each file and/or directory that you (dis)allow to be crawled.
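    Under the original standard, that means spelling out every dashed path by hand, one Disallow line per path. Something like this (the paths here are made-up placeholders, not from the OP's site):

    User-agent: *
    Disallow: /my-page.html
    Disallow: /old-archive/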
     
    LawnchairLarry, Nov 29, 2008 IP
  3. seismic2

    seismic2 Peon

    #3
    Just use this:
    User-agent: *
    Disallow: /*-
    It matches a dash anywhere in the URL path, so it covers all files, folders, and sub-directories. Note that the * wildcard is an extension supported by the major search engines (Google, Yahoo, MSN/Live), not part of the original robots.txt standard.
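    As a sanity check, here is a minimal Python sketch of the wildcard matching those crawlers apply ('*' matches any run of characters, '$' anchors the end of the URL). The stdlib urllib.robotparser follows the original standard and treats '*' literally, so this hand-rolls the match; the example paths are made up for illustration.

    ```python
    import re

    def rule_matches(rule: str, path: str) -> bool:
        """Mimic the wildcard matching used by major crawlers:
        '*' matches any run of characters, '$' anchors the end."""
        # Escape everything, then restore the two wildcard tokens.
        pattern = re.escape(rule).replace(r"\*", ".*").replace(r"\$", "$")
        # Rules are prefix matches: anchor at the start only.
        return re.match(pattern, path) is not None

    # "Disallow: /*-" blocks any path containing a dash:
    print(rule_matches("/*-", "/my-page.html"))        # True: dash in file name
    print(rule_matches("/*-", "/old-archive/x.html"))  # True: dash in folder name
    print(rule_matches("/*-", "/plain/page.html"))     # False: no dash anywhere
    ```
    
    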
     
    seismic2, Dec 9, 2008 IP
  4. manish.chauhan

    manish.chauhan Well-Known Member

    #4
    I hope this will work. I have never tried it myself, but it should. :)
     
    manish.chauhan, Jan 20, 2009 IP