Hi, I am trying to set up a robots.txt file so that the blog within my site doesn't get listed. So far I have: Disallow: /directory/. When I run the Google robots.txt check tool, it tells me that the directory is still accessible?
That's most likely just an error on Google's end, but make sure you format it like this: User-agent: * then Disallow: /directory/. A Disallow rule without a User-agent line above it isn't valid, so that alone could explain what the checker is reporting.
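In other words, a minimal robots.txt along these lines (assuming your blog really lives under /directory/; swap in the actual path):

# applies to every crawler that honours robots.txt
User-agent: *
# block the blog directory and everything below it
Disallow: /directory/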
Hello, with Disallow: /directory/ you are only blocking that directory. To block a file together with its query-string parameters, use Disallow: /file.php? and to block the file itself, use Disallow: /file.php (note that rules are prefix matches, so this also covers the parameterised URLs). My 2 cents, Jakomo
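A quick sketch of the two variants (file.php is just a placeholder name; each Disallow line goes inside a User-agent group like the one above):

# blocks only requests to file.php that carry a query string, e.g. /file.php?id=3
Disallow: /file.php?

# blocks file.php itself; as a prefix match it also covers /file.php?id=3
Disallow: /file.php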
Thanks guys, looks like it's just the Google Webmaster Tools that are playing up. Setting robots.txt to disallow a specific directory will obviously only work for Google, so how do I disallow other bots (MSN, Yahoo)?
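For example, would something along these lines be the way to do it (assuming msnbot is MSN's crawler and Slurp is Yahoo's)?

User-agent: *          # every compliant crawler
Disallow: /directory/

User-agent: msnbot     # MSN's crawler
Disallow: /directory/

User-agent: Slurp      # Yahoo's crawler
Disallow: /directory/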