Can I use regex or something similar in robots.txt? I have a recurring script in hundreds of directories. Besides adding the noindex meta tag, I want to block crawlers from reading it via robots.txt. Say this is the file structure:

```
/x/r/edit.php
/x3/r/edit.php
/y1/x2/edit.php
/z/66/edit.php
```

I want to block `edit.php` wherever it might be, and since I have hundreds of directories, listing them one by one in robots.txt would be a fuss. Something like `*edit.php`?
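For context, here is a minimal sketch of the kind of rule I'm hoping exists. This assumes the crawler supports the `*` wildcard and `$` end-anchor extensions, which are not part of the original robots.txt standard but are honored by major crawlers such as Googlebot:

```
User-agent: *
# '*' matches any sequence of characters in the path,
# '$' anchors the match to the end of the URL.
# Intended to match /x/r/edit.php, /z/66/edit.php, etc.
Disallow: /*edit.php$
```

Would this work, or does robots.txt offer some other pattern syntax?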