I want to prevent search engines from crawling pages such as /a_xxxxxx.html and /b_xxxxxx.html. This is the content of my robots.txt file. Is it correct?

User-agent: *
Disallow: /a_*.html
Disallow: /b_*.html
This example disallows all web spiders for the entire site:

# Make changes for all web spiders
User-agent: *
Disallow: /
Are you planning to block whole directories, as in this example?

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /images/
Here is what you need in your robots.txt:

User-agent: *
Disallow: /a_xxxxxx.html
Disallow: /b_xxxxxx.html

Note that Disallow matches URL paths by prefix, so Disallow: /a_ would block every URL starting with /a_. Wildcards such as * are not part of the original robots.txt standard, although major crawlers like Googlebot and Bingbot do support them.
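To sanity-check rules like these, you can run them through Python's urllib.robotparser. It implements the original prefix-matching behavior with no wildcard support, which also illustrates why plain prefixes are the safest choice (the file names below are just the question's examples):

```python
from urllib.robotparser import RobotFileParser

# Prefix-based rules: Disallow matches any path that starts with the value
rules = """\
User-agent: *
Disallow: /a_
Disallow: /b_
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("*", "/a_xxxxxx.html"))  # False (blocked by /a_ prefix)
print(parser.can_fetch("*", "/b_xxxxxx.html"))  # False
print(parser.can_fetch("*", "/index.html"))     # True

# A strict parser treats the wildcard version from the question literally,
# so it blocks nothing here:
wild = RobotFileParser()
wild.parse(["User-agent: *", "Disallow: /a_*.html"])
print(wild.can_fetch("*", "/a_xxxxxx.html"))    # True with this parser
```

Crawlers that do honor wildcards (such as Googlebot) would block /a_*.html as intended, but the prefix form works everywhere.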