Regex and robots?

Discussion in 'robots.txt' started by inManila, Jan 16, 2009.

  1. #1
    Can I use regex or something similar in robots.txt?

    I have a recurring script in hundreds of directories. Apart from adding the noindex meta tag, I want to block it from being crawled via robots.txt.

    Let's say this is the file structure:

    /x/r/edit.php
    /x3/r/edit.php
    /y1/x2/edit.php
    /z/66/edit.php

    I want to block edit.php wherever it might be. Since I have hundreds of directories, listing them one by one in robots.txt would be a pain.

    Maybe something like *edit.php ?
     
    inManila, Jan 16, 2009 IP
  2. #2
    You could block these files with a wildcard rule:

    User-agent: *
    Disallow: /*/*/edit.php
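
    Note that * isn't part of the original robots.txt standard, but Googlebot and other major crawlers treat it as "match any sequence of characters" in the path. A rough Python sketch of that matching logic (a hypothetical helper, just to sanity-check the pattern against your example paths):

    import re

    def robots_pattern_to_regex(pattern: str) -> re.Pattern:
        """Translate a Google-style robots.txt path pattern into a regex.

        '*' matches any sequence of characters (including '/');
        a trailing '$' anchors the match at the end of the URL path.
        """
        anchored = pattern.endswith("$")
        core = pattern[:-1] if anchored else pattern
        body = "".join(".*" if ch == "*" else re.escape(ch) for ch in core)
        return re.compile("^" + body + ("$" if anchored else ""))

    rule = robots_pattern_to_regex("/*/*/edit.php")
    paths = ["/x/r/edit.php", "/x3/r/edit.php", "/y1/x2/edit.php", "/z/66/edit.php"]
    for p in paths:
        print(p, bool(rule.match(p)))  # all four example paths match the rule

    Crawlers that only implement the original standard will still read the Disallow value as a plain path prefix, so the noindex meta tag is a useful fallback.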
     
    manish.chauhan, Jan 20, 2009 IP