Just wondering, can we protect our robots.txt from being accessed by others? Any idea how to do that? Please let me know if you know how. Thanks
But why would you want to protect robots.txt? Does anything bad happen if we don't protect it?
You cannot really protect your robots.txt file: any technique that hides it from the public will also hide it from the search engines, and then it won't be picked up at all. It is also worth remembering that the robots.txt file is often a starting point for people probing a website for weaknesses. If you have admin areas that you don't want cached or seen by the public, use .htaccess files to password-protect those directories instead.
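A minimal sketch of password-protecting a directory this way, assuming Apache with basic authentication enabled and a password file already created with the htpasswd tool (the paths and realm name here are placeholders, not values from this thread):

    # .htaccess placed inside the directory you want to protect
    AuthType Basic
    AuthName "Restricted Area"
    # Path to a password file created with: htpasswd -c /path/to/.htpasswd username
    AuthUserFile /path/to/.htpasswd
    Require valid-user

With this in place, the directory never needs to appear in robots.txt at all, so you are not advertising its existence to anyone reading that file.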
It is possible (for example using .htaccess) to allow only certain IPs or hosts to see your robots.txt, but you'll have to research exactly which ones you want to allow. It's probably better to follow bavington's advice.
You can use .htaccess to deny access to robots.txt to anyone but the search engine bots (User-Agent: Googlebot, Yahoo! Slurp).
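A sketch of that idea, assuming Apache 2.2-style access directives and mod_setenvif (the User-Agent patterns below are just the two bots named above). Note that User-Agent strings are trivially spoofed, so this filters casual visitors rather than providing real protection:

    <Files "robots.txt">
        # Flag requests whose User-Agent looks like a known crawler
        SetEnvIfNoCase User-Agent "Googlebot|Slurp" allowed_bot
        Order Deny,Allow
        Deny from all
        Allow from env=allowed_bot
    </Files>

Anyone else requesting /robots.txt would then receive a 403 response while the named crawlers can still fetch the file.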
The purpose of the "robots.txt" file is to instruct search engine web bots or spiders as to which content should be indexed and which content should be avoided. There are three important tips that can help you to gain the maximum benefits from using this file in the right way on your server.
You can do that by blocking robots.txt for everyone except the search engine bots' IPs through .htaccess. In practice this approach is not feasible, though, as Google doesn't publish a list of its crawlers' IPs.

    <Files robots.txt>
        Order Deny,Allow
        Deny from All
        Allow from *input bots ip*
    </Files>