Hi, I have two domains pointing to the same hosting space (like parked domains), but each one has its own URLs. I need to stop search engines from spidering one of them using a robots.txt file, but in this case I can't use a separate robots.txt file for each domain. Any idea what the proper way is to do that with a single robots.txt file?
The generally accepted way of dealing with parked domains is to park one on the other and use a permanent (301) redirect to the one you want to be indexed. That way it isn't viewed as duplicate content. I realize some hosting configurations make that more difficult than others, and I'm really not sure exactly how you would deal with that type of hosting. It might be as simple as adding the redirect to the .htaccess file manually.
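As a rough sketch only, assuming Apache with mod_rewrite enabled (the domain names below are placeholders, not yours), the manual .htaccess redirect could look something like this:

    RewriteEngine On
    # Permanently redirect any request arriving on the parked domain to the main domain
    RewriteCond %{HTTP_HOST} ^(www\.)?parked-example\.com$ [NC]
    RewriteRule ^(.*)$ https://main-example.com/$1 [R=301,L]

Whether that works as written depends on your host's setup; some shared hosts offer a redirect option in their control panel instead.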
Thanks, but I don't want to redirect one to the other. I want both of them to stay active for visitors, with only one of them visible to search engines.
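To illustrate the kind of thing I'm hoping is possible (just a sketch, assuming Apache with mod_rewrite and a hypothetical second file named robots-blocked.txt in the web root), I'd want to serve a different robots file depending on which host name the request comes in on:

    RewriteEngine On
    # Answer requests for robots.txt on the second domain with a "block everything" file
    RewriteCond %{HTTP_HOST} ^(www\.)?second-example\.com$ [NC]
    RewriteRule ^robots\.txt$ robots-blocked.txt [L]

where robots-blocked.txt would simply contain:

    User-agent: *
    Disallow: /

I'm not sure whether my host allows rewrite rules like this, though, which is why I asked about doing it with robots.txt alone.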