Hello, I would like to use a robots.txt file to "disallow" all pages with a dash (-) character in the URL, regardless of whether they are files or folders. I created the following robots.txt file, but the robots.txt checkers I ran it through indicated that my syntax was wrong. Can someone please help me? Thanks.

User-agent: *
Disallow: -
Adding a bare dash doesn't work; the original robots.txt standard has no pattern matching, so you have to explicitly list the path of each file or directory you want to disallow (or allow).
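For example, under the strict standard you would block each dashed path individually (the paths below are just hypothetical placeholders):

User-agent: *
Disallow: /my-page.html
Disallow: /some-folder/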
Just use this:

User-agent: *
Disallow: /*-

It'll work for all files, folders and sub-directories. Note that the * wildcard is an extension honored by the major crawlers such as Googlebot and Bingbot; it isn't part of the original robots.txt standard, which is why some strict checkers flag it as a syntax error.
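If you want to sanity-check which URLs the rule catches, here is a minimal Python sketch that mimics the wildcard matching Google documents for robots.txt ('*' matches any sequence of characters, a trailing '$' anchors the end of the URL). The helper name and the sample paths are only illustrative:

import re

def rule_matches(rule: str, path: str) -> bool:
    """Mimic documented robots.txt wildcard matching:
    '*' matches any character sequence; a trailing '$' anchors the end."""
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'.
    pattern = re.escape(rule).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"  # re-enable the end-of-URL anchor
    # Rules match from the start of the path (prefix match).
    return re.match(pattern, path) is not None

for path in ["/my-page.html", "/folder/sub-dir/file.html", "/plain/page.html"]:
    print(path, "->", "blocked" if rule_matches("/*-", path) else "allowed")
# /my-page.html -> blocked
# /folder/sub-dir/file.html -> blocked
# /plain/page.html -> allowed

This is only a sketch of the documented matching rules, not a substitute for testing against a real crawler's behavior.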