Hello, I am using shared hosting and I have 2 domains. My primary domain is travelersmeeting.com. What I notice is that Google indexes pages from my sub-domain thenewagetraveler.com and attributes them to my site travelersmeeting.com, so it looks like I have duplicated content.

Example:
normal page: www.thenewagetraveler.com/Thailand-Chiang-Mai..html
duplicated page: www.travelersmeeting.com/thenewagetraveler/Thailand-Chiang-Mai..html

I tried to stop this by adding these lines to the robots.txt file located in my primary domain under the public_html folder:

Disallow: /thenewagetraveler/
User-agent: Googlebot
Noindex: /thenewagetraveler/

Is that correct? I just want Google to index my sub-domain thenewagetraveler.com, but not crawl and index the same URLs within travelersmeeting.com. Will it work, or do I have to do something more?
Since you are using 2 separate domains, you must add a robots.txt to your second domain, thenewagetraveler.com, as well. Just add a robots.txt in thenewagetraveler.com with the following instructions:

User-agent: *
Disallow: /
If I do that, then Google will not index www.thenewagetraveler.com at all. What I want is to separate the 2 domains: the primary domain travelersmeeting.com should not have the pages from thenewagetraveler.com indexed, but thenewagetraveler.com itself should still be indexed.
You have created a confusing thread. First of all, you need to know the meaning of a sub-domain, because as per your question you have two different TLDs, not a TLD with a sub-domain. Someone will be able to point you in the right direction if you make your question a little more descriptive without confusing the reader.
Now I got your point. Just add the following instructions to the robots.txt of travelersmeeting.com:

User-agent: *
Disallow: /thenewagetraveler/

This will stop crawlers from visiting any of the thenewagetraveler pages on travelersmeeting.com and will leave your thenewagetraveler.com domain intact. Hope this solves your problem.
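If you want to sanity-check the rule without waiting for Googlebot to recrawl, Python's standard urllib.robotparser can evaluate a robots.txt locally. This is just a sketch using the rules and URLs from this thread (the exact page URLs are only examples):

```python
from urllib.robotparser import RobotFileParser

# The rules proposed for travelersmeeting.com's robots.txt
rules = [
    "User-agent: *",
    "Disallow: /thenewagetraveler/",
]

rp = RobotFileParser()
rp.parse(rules)

# The duplicated copy under the primary domain is blocked...
print(rp.can_fetch("*", "http://www.travelersmeeting.com/thenewagetraveler/Thailand-Chiang-Mai..html"))  # False

# ...while the rest of travelersmeeting.com stays crawlable.
print(rp.can_fetch("*", "http://www.travelersmeeting.com/index.html"))  # True
```

Note that each host serves its own robots.txt, so this file on travelersmeeting.com has no effect on thenewagetraveler.com, which is exactly the separation you asked for.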
I'm on shared hosting and I have 2 different domains under the same hosting; 1 is the primary and the other is a sub-domain!

For manish.chauhan: I have already added this to the robots.txt of travelersmeeting.com:

User-agent: *
Disallow: /thenewagetraveler/

I was just asking if this is going to solve the problem... Thank you.